GB2557884A - Device control apparatus and method - Google Patents

Device control apparatus and method

Info

Publication number
GB2557884A
Authority
GB
United Kingdom
Prior art keywords
content
processing device
metadata
instructions
devices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1611054.6A
Other versions
GB201611054D0 (en)
Inventor
Jesus Lucas Barcias
Joseph Charles Boulter
Stephen Antony Hack
Michael Lee Jones
Andrew William Walker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB1611054.6A priority Critical patent/GB2557884A/en
Publication of GB201611054D0 publication Critical patent/GB201611054D0/en
Publication of GB2557884A publication Critical patent/GB2557884A/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D25/00 Control of light, e.g. intensity, colour or phase
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4131 Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42202 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615 Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/175 Controlling the light source by remote control
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20 Controlling the colour of the light
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/17 Operational modes, e.g. switching from manual to automatic mode or prohibiting specific operations
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Emergency Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A processing device 100 (and associated method of operation) to control associated devices, comprising: a processor (102, fig. 2); a metadata analysing unit (101, fig. 2) to analyse qualitative metadata (fig. 3) describing content to be displayed on a display 110 associated with the processing device 100, the processor further generating instructions enabling control of one or more associated devices 120a-g (e.g. ambient light, thermostat temperature setting), the issued instructions dependent on the analysed metadata; and a communication unit (103, fig. 2) operable to transmit the issued instructions and control associated device operation. The communication unit may also receive information from the associated devices, including functionality and physical location. The metadata may or may not be part of (integral with) the displayed content, whilst the processing device (or processor) need not provide content to the display. Metadata may include information regarding events taking place in the content, the genre of the present content and the ambience of the present or to-be-displayed portion, and this may be retrieved in advance of the display if, for example, the temperature is to be lowered or increased. Instructions generated by the processor may include user preferences. Instructions may take the form of control of a light source to change brightness, colour or light source operation. The temperature may also be controlled. The processor may generate or use a map of the device locations. Computer programs may implement the processor's method of operation.

Description

(71) Applicant(s): Sony Interactive Entertainment Inc., 1-7-1 Konan, Minato-Ku 108-8270, Tokyo, Japan
(72) Inventor(s): Jesus Lucas Barcias, Joseph Charles Boulter, Stephen Antony Hack, Michael Lee Jones, Andrew William Walker
(56) Documents Cited: WO 2008/142616 A1; WO 2006/003624 A1; WO 2005/084339 A2; US 2016/0094878 A1; US 2013/0166042 A1; US 2010/0177247 A1; US 2006/0058925 A1
(58) Field of Search: INT CL G05B, G05D, H04N, H05B; Other: ONLINE: WPI, EPODOC, INSPEC
(74) Agent and/or Address for Service: D Young & Co LLP, 120 Holborn, LONDON, EC1N 2DY, United Kingdom
(54) Title of the Invention: Device control apparatus and method
(57) Abstract Title: Media metadata analysing unit to control surrounding lights and temperature during a TV or moving picture performance
At least one drawing originally filed was informal and the print reproduced here is taken from a later filed formal copy.
[Figure 1: entertainment system comprising a processing device 100, a display 110 and light sources 120a-120g]
[Figure 2: processing device 100 comprising a metadata analysing unit 101, a processor 102, a communication unit 103 and a storage unit 104]
[Figure 3: metadata format 300 with fields Content Type 310, Genre 320, Events 330 and Ambience 340]
[Figure 4: method steps: analyse metadata; generate instructions; transmit instructions]
DEVICE CONTROL APPARATUS AND METHOD
This disclosure relates to an apparatus and method for controlling devices.
In recent years there has been a growing interest in extending display-based entertainment experiences, such as playing video games or watching movies, into the physical environment. This is expected to increase the sense of immersion experienced by a user of entertainment content, and as a result increase the level of enjoyment that is provided.
A number of arrangements have been proposed in order to realise this increased sense of immersion. For example, in a gaming context, video cameras operable to view the player and their environment may be used to control interaction with a game; the EyeToy® video games were an early example of this, in which a user’s actions could be tracked and interpreted to correspond to in-game actions. As a further example, both gaming and viewing arrangements may allow the user to view content in a 3D format.
This may improve the sense of immersion experienced by a viewer, but these arrangements do not provide for any physical interaction of the entertainment system with the environment and so do not fully achieve the above aim.
One example of an arrangement that interacts with the environment in a physical way is that of a television that provides ambient lighting. This is an arrangement that comprises a number of lighting elements that are configured to illuminate the wall surrounding the television. The colour and brightness of the ambient lighting is dependent upon the image that is displayed upon the screen, and use of ambient lighting is intended to give the feel of the on-screen image extending beyond the display. However, the lighting is only applied to the wall upon which the display is mounted; as a result, interaction with the environment is limited and fails to provide a high level of immersion for the viewer.
More recently, light bulbs that may be controlled via a compatible device, such as a mobile phone that is in communication with the lights via a network, have become more popular. Such lights may be controlled to vary their colour and output light intensity based upon a user input. Other uses have included controlling the lights based upon a detected colour (from an image obtained by a camera), thus reducing the requirement for human control. In another arrangement, a predefined lighting scheme is supplied to an entertainment device with video content, and audio cues from the content that is being viewed are used to control the timing of the lighting changes.
However, such arrangements may be disadvantageous in that control is either performed by user input or based upon a predefined set of instructions that require authoring for each piece of content. This represents either the time-consuming requirement of authoring light scheme instructions for each piece of content, or a low-quality result from user-authored instructions; each of these is clearly undesirable. Pre-authored light scheme instructions are also inflexible; they are only suitable when each user has the same arrangement of lights, with the same functionality, as those for which the instructions were intended. Such instructions may therefore be rendered out of date in a short period of time, and will either not provide the correct experience or will need to be re-authored on a regular basis.
The following disclosure seeks to mitigate these problems by providing an arrangement in which a higher-quality set of instructions may be generated for associated devices with a reduced amount of user input. The instructions are generated locally for the user’s own arrangement of devices based upon information about the content. Therefore an immersive viewing experience that extends into the physical environment of the viewer may be provided in a manner in which the addition of new devices or functionality causes the experience to be enhanced rather than impaired.
This disclosure is defined by claims 1 and 11, with further respective aspects and features of the disclosure being defined in the appended claims.
Embodiments of the disclosure will now be described with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates an arrangement of light sources;
Figure 2 schematically illustrates a processing device;
Figure 3 schematically illustrates a metadata format; and
Figure 4 schematically illustrates a method for controlling associated devices.
Figure 1 schematically illustrates an entertainment system, comprising a processing device 100, a display 110 and a plurality of light sources 120a-120g. The light sources 120a-120g shall be collectively referred to as the light sources 120 in this disclosure.
The processing device 100 may be any device capable of performing processing to generate a set of instructions for controlling the light sources 120. In addition to this function, the processing device may be able to act as a source of video content or the like. For example, the processing device 100 may be a Sony® PlayStation 4®, or it may be a mobile device such as a mobile phone that possesses suitable functionality.
The display 110 is operable to display content to a viewer, this content being obtained either from the processing device 100 or from a different source, such as video on demand content obtained via the internet. Such content may be accessible by a display (such as a smart TV) without the use of the processing device 100; therefore the processing device 100 may generate instructions to control the light sources 120 but not be used to provide content to the display 110.
Each of the light sources 120 is operable to communicate with at least one of the processing device 100 or display 110 via a wired or wireless network (not shown). Other devices (not shown) may also be able to communicate with the light sources 120 via the network, such as mobile phones or laptops that may also be able to control the operation of the light sources 120.
The light sources 120 may be configured to be able to vary a number of their operational parameters, such as the colour and brightness of the output light, or a mode of operation (continuous or pulsed light, for example).
The processing device 100 may be provided with, or generate, a map of the location of the light sources 120. For example, a user may input the locations of light sources 120 individually, or the light sources 120 may be able to provide indications of their own location to the processing device 100. If the processing device 100 has a camera associated with it, test signals may be sent to the light sources 120 (causing them to turn on) and images could be captured of this interaction in order to build a map of light sources 120 in the environment.
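Purely by way of illustration, such a map-building routine might be sketched as follows (in Python; the light and camera interfaces, and every name used here, are hypothetical stand-ins rather than any part of this disclosure):

    # Illustrative sketch only: 'lights' and 'camera' stand in for whatever
    # device and camera interfaces are actually available.
    def build_light_map(lights, camera):
        """Flash each light in turn and locate it in a captured image."""
        for light in lights:
            light.turn_off()
        baseline = camera.capture()            # image with all lights off
        light_map = {}
        for light in lights:
            light.turn_on()                    # test signal: one light on
            frame = camera.capture()
            light_map[light.id] = brightest_difference(baseline, frame)
            light.turn_off()
        return light_map                       # {light id: (x, y) image position}

    def brightest_difference(before, after):
        """Return the pixel coordinate with the largest brightness change."""
        best, best_xy = -1, (0, 0)
        for y, row in enumerate(after):        # images as lists of pixel rows
            for x, value in enumerate(row):
                diff = abs(value - before[y][x])
                if diff > best:
                    best, best_xy = diff, (x, y)
        return best_xy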
The described system may be an example of an ‘internet of things’, a network in which objects are able to be sensed and controlled remotely. The processing device 100 may therefore be able to determine the location and/or capabilities of the light sources 120. It is also considered that other devices (which may likewise belong to the internet of things) may be controllable by the processing device 100, such as a thermostat that controls the temperature of the viewer’s home or any other compatible electronic devices or appliances in the viewer’s home.
While it is appreciated that many different types of devices may be controllable via instructions from the processing device, this disclosure provides examples primarily in terms of light sources being controlled. It would however be apparent to the skilled person, in view of this disclosure, as to how to extend the teachings of this document in order to control a range of associated devices.
In the described system, a user is able to view content provided by the processing device 100 via the display 110. The processing device 100 is operable to generate instructions to control the light sources 120 (or any other connected devices) in dependence upon the nature of the content that is being viewed; for example, instructions may be generated based upon type of viewing, genre of content or in-content events.
It is also considered that the user may view content on the display 110 that does not originate from the processing device 100; in such a scenario, information about the content may still be provided to the processing device in order for the processor to generate a set of instructions for the light sources 120 or any other connected devices.
In the present disclosure, ‘content’ may be used to refer to any of a number of different types of media. For example, television programmes, movies, music and games may all be suitable types of content for use with the disclosed arrangement. In the present disclosure embodiments will be described primarily with reference to movie content, although it would be apparent to the skilled person in view of this disclosure as to how to implement the disclosed arrangement for other types of media.
Figure 2 schematically illustrates the configuration of a processing device 100. The processing device 100 comprises a metadata analysing unit 101, a processor 102, a communication unit 103 and a storage unit 104.
The metadata analysing unit 101 is operable to analyse qualitative metadata that describes video content to be displayed on the display 110 associated with the processing device 100. The metadata itself is described in more detail below with reference to Figure 3.
The processor 102 is operable to generate instructions for controlling devices (such as the light sources 120) associated with the processing device 100, the instructions being dependent on the metadata analysed by the metadata analysing unit 101. These instructions may comprise a list of one or more changes to the operation of the light sources 120 along with the times at which these changes should be made. Alternatively, the instructions may simply be a list of one or more commands that may be sent directly to the devices at the appropriate times.
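Purely as an illustrative sketch (the record fields and values here are invented for the example, not a defined format), such a generated list of timed operation changes might take the form:

    # Illustrative only: one possible shape for the generated instructions.
    instructions = [
        {"time": 0.0,    "device": "light-1",    "command": "set_colour",     "value": (255, 140, 0)},
        {"time": 0.0,    "device": "light-2",    "command": "set_brightness", "value": 0.3},
        {"time": 312.5,  "device": "light-1",    "command": "set_mode",       "value": "pulsed"},
        {"time": 1800.0, "device": "thermostat", "command": "set_target",     "value": 19.0},
    ]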
The communication unit 103 is operable to transmit the instructions generated by the processor 102 to control the operation of the associated devices. As noted above, the instructions may comprise a list of changes to the operation of the associated devices that should be made; therefore the instructions that are transmitted to the devices may be in one or more different respective formats to those that are initially generated, in order to be read by the devices receiving the instructions. This transmission of instructions may be via a wired network, a wireless network or some combination of the two.
The storage unit 104 is operable to store a program that may be executed by the processor 102 in order to generate appropriate instructions for controlling the associated devices in response to the metadata analysis performed by the metadata analysing unit 101. Content to be displayed may also be stored on the storage unit 104, although the content could instead be provided to the processing device 100 using removable media or by streaming the content rather than having it stored locally.
The metadata associated with a piece of content may be stored with that content and hence obtained from removable media or a content stream.
Alternatively, or in addition, the storage unit 104 may store the metadata associated with a piece of content. This may either be obtained when downloading (or otherwise obtaining) the piece of content or obtained as part of a separate process; for example, if an older DVD comprising content that does not contain the corresponding metadata is read by the processing device 100, the appropriate metadata may be downloaded from a server.
Figure 3 schematically illustrates a piece of metadata 300 that describes a piece of content, such as a movie. This metadata is used to indicate properties of the content rather than directly providing instructions for the associated devices.
A piece of content may have a single piece of corresponding metadata relating to the entire piece of content. Such metadata may be provided with one or more appropriate indicators, such as one or more timestamps. The timestamps may be used to indicate when events occur (for example) and therefore indicate when the operation of associated devices should be modified. Events may include advert breaks, a transition from opening credits into a film, or a transition from a film into end credits.
Alternatively a plurality of pieces of metadata may be provided that each relate to a portion of the piece of content; for example, each scene of a film or each location in a game. Each of the plurality of pieces of content may comprise one or more time stamps or the like, as in the metadata described above, in order to be able to provide information about different portions of the content.
Content type information 310 details the type of content to which the metadata corresponds. For example this could distinguish between a TV programme and an advertisement embedded in the TV programme, or between a gameplay portion and a cutscene portion of a video game. The content type information may be provided with timestamps or the like to indicate a transition of the content type, rather than providing a separate piece of metadata for each set of adverts in addition to that provided for the content itself.
Genre information 320 is used to indicate the genre to which the content belongs; for example, it could indicate whether a movie was ‘horror’ or ‘comedy’. This information may be used to influence a large scale effect, such as an overall target type of environment effect. For example, this information could be used to identify an overall brightness or temperature of the environment.
Event information 330 is information about any in-content events that occur; for example a change of scene or, in the case of a horror movie, a part of the movie which is intended to startle the viewer. This information could be used to indicate that a sudden or rapid modification of associated devices is intended, for example.
Ambience information 340 may be used to give information about the general ‘feel’ of the content; for example it could be used to indicate whether a scene takes place during daytime or night-time, or whether it is set in the arctic or the tropics. This information could therefore be used by the processor 102 to influence a general brightness of the target environment lighting, for example, or to alter a thermostat or air conditioner’s settings.
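By way of a purely illustrative sketch, metadata of the general form described above (one piece per scene, with a timestamp) might be represented as follows; the field names and values are invented for the example and do not define a format:

    # Illustrative only: per-scene qualitative metadata records.
    scene_metadata = [
        {"timestamp": 0.0,  "content_type": "opening credits", "genre": "horror",
         "events": [], "ambience": "night-time"},
        {"timestamp": 95.0, "content_type": "film", "genre": "horror",
         "events": [{"at": 412.0, "kind": "startle"}], "ambience": "arctic"},
    ]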
In view of this disclosure, the skilled person will appreciate that the metadata may have a different format, or comprise more or less information, rather than being limited to the above form. For example, data may also be provided to indicate an intended general level of illumination for a room, or that only a specific type of associated device is to be used in modifying the environment (such as only lighting). Similarly, for example, the ambience information 340 could be omitted altogether.
The metadata may take one or more of several forms.
In one instance, the metadata may be provided as an additional piece of data separate from but associated with the piece of content, such as in the form of a table optionally with one or more timestamps and corresponding information for the piece of content at the or each respective time. Hence for example, metadata relating to the content as a whole (such as genre) may be provided with associated timestamp information, whilst if desired, metadata relating to specific events or smaller time periods in the content may be provided in the table with associated timestamps. This format may be appropriate when using a legacy piece of content (such as an old game or film that does not comprise the appropriate metadata), as the metadata may then be provided separately to the content. For example, the metadata could be automatically downloaded from a server upon recognition of the playback of the legacy content by the processor, or a user could obtain the metadata themselves independently of the obtaining of the content.
Alternatively, or in addition, in another instance the metadata may be embedded in user-data sections of the corresponding content. Embedding the metadata in the content allows it to be made available at the appropriate times during playback of the piece of content. The metadata may still be provided as a single additional piece of data where content formats allow, or alternatively may be distributed across some or all of the duration of the content. In this latter case, timestamps may not be required if the metadata is located or distributed in the content at appropriate points in time, such as for example at the transition to advert breaks. Hence again for example, metadata relating to the content as a whole (such as genre) may be embedded at the start of the content while metadata relating to specific events or smaller time periods may be provided throughout the content at the appropriate times.
Such metadata is an example of qualitative metadata in that it describes the content with which it is associated, for example in descriptive, classifying, or categorical terms such as ‘daytime’, ‘horror’, ‘end credits’ or the like. This is in contrast to quantitative metadata, such as that described in the introduction (with reference to the arrangement in which metadata comprises instructions for lighting in conjunction with audio cues), which simply defines instructions for associated devices or values of parameters for those associated devices, without reliance on an intermediate reference to qualities of the content itself.
It will be appreciated that the metadata may encode these qualitative features by reference, for example, to a predetermined ‘decoding’ look-up table. Hence as a non-limiting example the metadata may take values from 0-255 corresponding to up to 256 terms found in a look-up table held by the processing device. Furthermore, the specific format of the metadata may vary depending on the medium and encoding used for the content. Consequently, the processing device may comprise more than one decoding look-up table, selecting the appropriate decoding look-up table in response to the format and/or encoding method of the respective content.
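As a non-authoritative sketch of such byte-valued decoding (the table contents and format names are invented for illustration):

    # Illustrative only: decode byte-valued qualitative metadata through a
    # look-up table selected according to the content's format/encoding.
    DECODE_TABLES = {
        "format-a": {0: "daytime", 1: "night-time", 2: "horror", 3: "comedy",
                     4: "end credits"},        # up to 256 entries per table
        "format-b": {0: "horror", 1: "comedy", 2: "daytime"},
    }

    def decode(values, content_format):
        table = DECODE_TABLES[content_format]  # pick the table for this format
        return [table[v] for v in values]

    print(decode([2, 1], "format-a"))          # ['horror', 'night-time']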
Figure 4 schematically illustrates a method in which this metadata is used to generate instructions for controlling devices associated with a processing device.
A step 401 comprises analysing metadata relating to content to be displayed on a display associated with the processing device. This analysis comprises reading the metadata that is received either with the content or with reference to content to be displayed, in order to extract information about the content such as that described above. This information is indicative of the experience that the user would expect upon viewing the content, and thus may be used to generate such an experience.
A step 402 comprises generating instructions for controlling devices associated with the processing device, the instructions being dependent on the analysed metadata. These instructions may be generated for each device or for the environment as a whole, with instructions for specific devices being derived from the more general environmental instructions.
An exemplary instruction generation method comprises the following steps, with a number of alternatives or modifications described below.
In a first step, the processing device polls the associated devices in order to generate a list of device IDs and capabilities. This may be done via direct communication or via a network. Alternatively, where some or all properties of a device cannot be obtained in this manner, a user may input these via a user interface to similar effect. Optionally, in either case properties of common/popular devices may be pre-loaded in a database to assist the process.
The processing device then consults a predetermined ‘function’ look-up table listing operations corresponding to the information contained in the analysed metadata; for example, this function look-up table could associate horror with low lighting levels and temperatures, and comedy with warm lighting and normal temperature.
The processing device then identifies a correspondence between the capabilities of associated devices and the effects listed in the function look-up table.
The processing device then generates a set of instructions for the associated devices based upon the identified correspondence such that, when instructed, the associated devices generate the intended environmental effect.
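Taken together, these four steps might be sketched as follows; the capability names, look-up table contents and data shapes are assumptions made for illustration only:

    # Illustrative only: match metadata-derived effects to device capabilities.
    FUNCTION_TABLE = {                   # step 2: qualitative term -> effect
        "horror": {"brightness": 0.1, "temperature": 17.0},
        "comedy": {"brightness": 0.7, "temperature": 21.0},
    }

    def generate_instructions(devices, analysed_terms):
        # Step 1 is assumed done: 'devices' maps a device ID to its polled
        # capabilities, e.g. {"light-1": {"brightness"}, "stat": {"temperature"}}.
        instructions = []
        for term in analysed_terms:
            effect = FUNCTION_TABLE.get(term, {})
            for device_id, capabilities in devices.items():
                for parameter, value in effect.items():
                    if parameter in capabilities:     # step 3: correspondence
                        instructions.append(          # step 4: emit instruction
                            {"device": device_id, "set": parameter, "value": value})
        return instructions

    print(generate_instructions({"light-1": {"brightness"}}, ["horror"]))
    # [{'device': 'light-1', 'set': 'brightness', 'value': 0.1}]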
A spatial arrangement of the associated devices may also be considered. For example, the function look-up table could associate horror with low lighting levels on the side of the room comprising the display, but with normal lighting levels behind the user. Consequently where a spatial arrangement of the devices is not known, the lighting operations associated with the region closest to the display may be applied globally, or an average of the spatial values may be applied, according to designer / user choice. Meanwhile, where respective positions of the devices are known, their operation is assigned according to their position (in this example, whether they are on the side of the room comprising the display, or located behind the user).
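The positional fall-back described here might be sketched as follows (the region names and values are invented for illustration):

    # Illustrative only: choose a device's operation from a spatial effect,
    # falling back when its position is unknown.
    def assign_operation(effect_by_region, device_region=None,
                         fallback="closest-to-display"):
        if device_region is not None:
            return effect_by_region[device_region]    # position known
        if fallback == "closest-to-display":
            return effect_by_region["display-side"]   # apply globally
        values = list(effect_by_region.values())      # otherwise: average
        return sum(values) / len(values)

    effect = {"display-side": 0.1, "behind-user": 0.6} # horror lighting levels
    print(assign_operation(effect, "behind-user"))     # 0.6
    print(assign_operation(effect))                    # 0.1 (global fall-back)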
Instructions may be generated in real time, or a small amount of time in advance. For example, instructions may be generated by consulting the metadata every few frames or when a change of scene is detected or indicated by a timestamp, and then determining an appropriate effect to apply to the environment. Alternatively, instructions for the whole piece of content may be generated when playback begins and then transmitted to the associated devices at an appropriate time, if the appropriate metadata is available at the start of playback.
Instructions may be generated ahead of time in view of the time taken to implement changes in the environment. For example, changes to the temperature of a room may need to be implemented much earlier than changes in the lighting as it takes much longer to heat or cool a room than it does to adjust lights. Therefore all the metadata may be processed to generate instructions well in advance of the display of the corresponding portion of the piece of content, or subsets of the metadata may be processed in advance for this purpose (such as heating instructions being generated before lighting instructions).
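Such lead-time compensation amounts to subtracting a per-device-type delay from the event time; for example (the lead-time values are invented):

    # Illustrative only: transmit earlier for slow-acting devices.
    LEAD_TIME_SECONDS = {"light": 0.0, "thermostat": 600.0}  # assumed values

    def transmit_time(event_time, device_type):
        """When to send an instruction so its effect lands at event_time."""
        return event_time - LEAD_TIME_SECONDS.get(device_type, 0.0)

    print(transmit_time(1800.0, "thermostat"))  # 1200.0: start heating early
    print(transmit_time(1800.0, "light"))       # 1800.0: lights react at once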
In one example, the processing device derives an intended effect to be applied to the environment (such as a target lighting scheme) from the metadata and then polls the associated devices in order to obtain information about their position, capabilities and/or current functionality (this poll may be optional for a given associated device if already performed previously, and/or if the processing device has on/off control of the device); individual instructions are then generated for each associated device in order to reproduce (or at least approximate) the intended effect in dependence upon the results of the polling. Alternatively, or in addition, the processing device may obtain information about the associated devices prior to the metadata analysis.
As suggested previously herein, the processing device may generate instructions using a mapping of the locations of respective associated devices in the environment; this mapping may be provided to, or generated by, the processing device in advance of content playback. The use of such a map enables a more useful set of instructions to be generated, as different regions of the environment can be identified and devices in different areas may be controlled differently in order to generate more complex effects. For example, with reference to Figure 1, light sources 120c, 120d and 120e may be emitting light whilst the other light sources are not. This will create a lighting gradient from the back of the room to the front of the room, which may be an effective way of providing illumination without distracting from the display; this may be useful when the display is used to show a night-time scene, for example.
Clearly however even where the locations of devices are known, not all environmental effects require spatial differentiation. For example, in some applications the position of the associated devices may be considered to be irrelevant for the purpose of generating instructions - such as only specifying a desired colour of light - and therefore the process of generating an intended effect for the environment with respect to the individual positions of associated devices is unnecessary.
It would be appreciated by the skilled person that a combination of these alternatives may be possible, such as using a different method for different device types. An example of this is generating an intended lighting effect for the environment (such as light at the back of a room but no light at the front), while instructions for heat-modifying devices could be generated separately (such as generating an instruction for all heat-modifying devices to be switched off, without a consideration of the location of the heaters/coolers or the spatial distribution of heaters and coolers in the environment). Alternatively, or in addition, different methods of generating instructions may be used for different portions of the same piece of content in dependence upon the content type or any other property.
The instructions may also be generated in dependence upon user preferences. For example, a user may define a minimum brightness of the environment or an aversion to a particular colour of light or mode of operation of the light sources. In the context of a variation of the temperature of an environment, the user may wish to specify a comfortable range in which the processing device is able to cause a variation of the temperature.
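Applying such preferences amounts to clamping generated parameter values to user-defined ranges, as in this illustrative sketch (the preference values are invented):

    # Illustrative only: constrain generated values to user-defined ranges.
    PREFERENCES = {"brightness": (0.2, 1.0),     # minimum brightness of 0.2
                   "temperature": (18.0, 23.0)}  # comfortable range

    def apply_preferences(parameter, value):
        low, high = PREFERENCES.get(parameter, (float("-inf"), float("inf")))
        return min(max(value, low), high)

    print(apply_preferences("brightness", 0.05))   # 0.2: never fully dark
    print(apply_preferences("temperature", 17.0))  # 18.0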
Finally, a step 403 comprises transmitting instructions to control the operation of the associated devices, the transmitted instructions being derived from the instructions generated by the processor.
In some embodiments, the processing device is not in direct communication with each of the associated devices; therefore it may output the instructions to an intermediate network device that is in direct communication with the associated devices. A combination of the two connection types may be appropriate in a number of embodiments; for example if the instructions include a variation in the operation of the display (such as a change of brightness) or if the display has lights associated with it (such as the ambient lighting described above).
Depending on the functionality of the associated devices, instructions may be transmitted to the devices at the time at which a change in operation is desired. Alternatively, if instructions are generated ahead of time they may be transmitted along with a timestamp or the like to indicate when a change in operation should be performed, if the recipient device has a corresponding timing facility. Similarly if a delayed response is expected (for example, for a change in temperature at the user’s position), instructions may be sent to appropriate devices in advance to compensate for the delay. A combination of these alternatives may be implemented by a processing device if some associated devices support such a feature but others do not.
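A dispatch routine combining these alternatives might be sketched as follows (the send and capability-query interfaces are hypothetical):

    # Illustrative only: forward timestamped instructions to devices that can
    # schedule them locally; hold the rest and send each at the right moment.
    import heapq
    import time

    def dispatch(instructions, supports_timestamps, send):
        pending = []                               # min-heap ordered by time
        for ins in instructions:
            if supports_timestamps(ins["device"]):
                send(ins["device"], ins)           # device times it itself
            else:
                heapq.heappush(pending, (ins["time"], id(ins), ins))
        start = time.monotonic()
        while pending:
            due, _, ins = heapq.heappop(pending)
            delay = due - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)                  # wait until the change is due
            send(ins["device"], ins)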
As noted previously this disclosure provides examples primarily in terms of light sources and heat sources being controlled. It would however be apparent to the skilled person, in view of this disclosure, as to how to extend the teachings of this document in order to control a range of associated devices. For example, electric blinds / curtains may be controlled to provide privacy (or further lighting control). Similarly other devices such as vacuuming robots may be controlled to reduce ambient noise levels, in response to the dramatic impact of a scene, or the genre of a whole piece of content. For example a vacuum may be allowed to operate during a sports match, but not during a period drama or a love scene. Similarly a washing machine may be instructed to pause or delay a spin cycle during a dramatic part of a film.
The techniques described above may be implemented in hardware, software or combinations of the two. In the case that a software-controlled data processing apparatus is employed to implement one or more features of the embodiments, it will be appreciated that such software, and a storage or transmission medium such as a non-transitory machine-readable storage medium by which such software is provided, are also considered as embodiments of the disclosure.

Claims (15)

Claims:
1. A processing device for controlling associated devices, the processing device comprising:
a metadata analysing unit operable to analyse qualitative metadata that describes content to be displayed on a display associated with the processing device;
a processor operable to generate instructions for controlling one or more devices associated with the processing device, the instructions being responsive to the analysed metadata; and a communication unit operable to transmit the instructions to control the operation of the or each associated device.
2. A processing device according to claim 1, wherein the communication unit is operable to receive information from the or each associated device about their respective functionality and/or location.
3. A processing device according to claim 1, wherein the metadata is provided to the processing device with the corresponding content to be displayed or is obtained separately to the content to be displayed.
4. A processing device according to claim 1, wherein the processing device is not used to provide content to the associated display.
5. A processing device according to claim 1, wherein the metadata comprises information about one or more from the list consisting of:
i. events that take place in the content, ii. the genre to which the content belongs, and iii. the ambience of a portion of the content.
6. A processing device according to claim 1, wherein the instructions are generated in dependence upon user preferences.
7. A processing device according to claim 1, wherein the associated devices comprise one or more light sources.
8. A processing device according to claim 7, wherein the transmitted instructions include one or more from the list consisting of:
i. an instruction to cause a light source to change the brightness of the output light, ii. an instruction to cause a change to the colour of the output light, and iii. an instruction to cause a change to the operation mode of the light source.
9. A processing device according to claim 1, wherein the associated devices comprise one or more temperature control devices which are operable to vary the temperature of the environment in response to transmitted instructions.
10. A processing device according to claim 1, wherein the processor generates instructions using a map of the locations of respective associated devices in the environment.
11. A processing method for controlling devices associated with a processing device that performs the processing method, the processing method comprising:
analysing qualitative metadata descriptive of content to be displayed on a display associated with the processing device;
generating instructions for controlling devices associated with the processing device, the instructions being dependent on the analysed metadata; and transmitting the instructions to control the operation of the associated devices.
12. A processing method according to claim 11, wherein the metadata is provided to the processing device with the corresponding content to be displayed or is obtained separately to the content to be displayed.
13. A processing method according to claim 11, wherein the metadata comprises information about one or more from the list consisting of:
i. events that take place in the content, ii. the genre to which the content belongs, and iii. the ambience of a portion of the content.
14. A computer program that, when executed by a computer, causes the computer to carry out the method of claim 11.
15. A machine-readable non-transitory storage medium which stores computer software according to claim 14.
Intellectual Property Office
Application No: GB1611054.6
Claims searched: All
GB1611054.6A 2016-06-24 2016-06-24 Device control apparatus and method Withdrawn GB2557884A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1611054.6A GB2557884A (en) 2016-06-24 2016-06-24 Device control apparatus and method


Publications (2)

Publication Number Publication Date
GB201611054D0 (en) 2016-08-10
GB2557884A true GB2557884A (en) 2018-07-04

Family

ID=56891676

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1611054.6A Withdrawn GB2557884A (en) 2016-06-24 2016-06-24 Device control apparatus and method

Country Status (1)

Country Link
GB (1) GB2557884A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005084339A2 (en) * 2004-03-02 2005-09-15 Color Kinetics Incorporated Entertainment lighting system
WO2006003624A1 (en) * 2004-06-30 2006-01-12 Koninklijke Philips Electronics, N.V. Ambient lighting derived from video content and with broadcast influenced by perceptual rules and user preferences
US20060058925A1 (en) * 2002-07-04 2006-03-16 Koninklijke Philips Electronics N.V. Method of and system for controlling an ambient light and lighting unit
WO2008142616A1 (en) * 2007-05-22 2008-11-27 Koninklijke Philips Electronics N.V. Method and unit for control of ambient lighting
US20100177247A1 (en) * 2006-12-08 2010-07-15 Koninklijke Philips Electronics N.V. Ambient lighting
US20130166042A1 (en) * 2011-12-26 2013-06-27 Hewlett-Packard Development Company, L.P. Media content-based control of ambient environment
US20160094878A1 (en) * 2014-09-29 2016-03-31 Sony Corporation Device and method for generating metadata log for video data


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10451952B2 (en) 2016-12-12 2019-10-22 Gracenote, Inc. Systems and methods to transform events and/or mood associated with playing media into lighting effects
US11543729B2 (en) 2016-12-12 2023-01-03 Gracenote, Inc. Systems and methods to transform events and/or mood associated with playing media into lighting effects
US11071182B2 (en) 2019-11-27 2021-07-20 Gracenote, Inc. Methods and apparatus to control lighting effects
US11470700B2 (en) 2019-11-27 2022-10-11 Gracenote Inc Methods and apparatus to control lighting effects

Also Published As

Publication number Publication date
GB201611054D0 (en) 2016-08-10

Similar Documents

Publication Publication Date Title
US11113897B2 (en) Systems and methods for presentation of augmented reality supplemental content in combination with presentation of media content
US10721527B2 (en) Device setting adjustment based on content recognition
US20240031632A1 (en) Systems and methods for displaying multiple media assets for a plurality of users
US20130061258A1 (en) Personalized television viewing mode adjustments responsive to facial recognition
US20170257664A1 (en) Method, electronic device, and system for immersive experience
EP2926626B1 (en) Method for creating ambience lighting effect based on data derived from stage performance
JP2010511986A (en) Ambient lighting
JP2009521170A (en) Script synchronization method using watermark
US11924302B2 (en) Media player for receiving media content from a remote server
KR20100114858A (en) Method and apparatus for representation of sensory effects using sensory device capabilities metadata
JP2014032669A (en) User device, second screen system and method for rendering second screen information on second screen
KR20100008775A (en) Method and apparatus for representation of sensory effects and computer readable record medium on which user sensory prreference metadata is recorded
JP2019536339A (en) Method and apparatus for synchronizing video content
JP2020174345A (en) System and camera device for capturing image
KR20100008777A (en) Method and apparatus for representation of sensory effects and computer readable record medium on which sensory device command metadata is recorded
JP2009022010A (en) Method and apparatus for providing placement information of content to be overlaid to user of video stream
GB2557884A (en) Device control apparatus and method
CN107787082A (en) Control method, correspondence system and the computer program product of light source
JP2011166314A (en) Display device and method of controlling the same, program, and recording medium
EP3288344B1 (en) A method of controlling lighting sources, corresponding system and computer program product
CN107430841B (en) Information processing apparatus, information processing method, program, and image display system
KR20190096685A (en) Display apparatus and control method for the same
KR20050116916A (en) Method for creating and playback of the contents containing environment information and playback apparatus thereof
CN107071534B (en) Method and system for interaction between user and set top box
US11282281B2 (en) Activation of extended reality actuators based on content analysis

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)