GB2499123A - Lighting control system

Lighting control system

Info

Publication number
GB2499123A
GB2499123A
Authority
GB
United Kingdom
Prior art keywords
pan
audio
lighting fixtures
time
tilt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1301762.9A
Other versions
GB2499123B (en)
GB201301762D0 (en)
Inventor
Steve Warren
Richard Salzedo
Jb Toby
Adam Craig Proffitt
Jaspal Bhullar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AVOLITES Ltd
Original Assignee
AVOLITES Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AVOLITES Ltd filed Critical AVOLITES Ltd
Priority to GB1607976.6A priority Critical patent/GB2535909B/en
Publication of GB201301762D0 publication Critical patent/GB201301762D0/en
Publication of GB2499123A publication Critical patent/GB2499123A/en
Application granted granted Critical
Publication of GB2499123B publication Critical patent/GB2499123B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12 Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/155 Coordinated control of two or more light sources
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00 Transmission systems of control signals via wireless link
    • G08C2201/30 User interface

Abstract

Apparatus for controlling a plurality of lighting fixtures comprises a lighting console 11 including a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of lighting fixtures; a data store configured to store one or more predetermined control sequences of the plurality of lighting fixtures and a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the apparatus being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media. A method of scaling a set of pan and/or tilt parameters for each of a plurality of lighting fixtures with known positions relative to one another to facilitate calibration of a lighting system for different stage arrangements is also disclosed.

Description

Lighting Control System
The present invention relates to apparatus and method for the control and positioning of automated lighting fixtures and for the control and replay of entertainment lighting systems.
The use of automated lighting fixtures and control equipment for controlling the fixtures has become prevalent within the entertainment industry. They are used for a variety of purposes, for example, lighting for concerts, theatres, and other such live events. Lighting at these events can involve moving light (such as spot lights, lasers, LEDs, special effects lights) from one position on a stage or performance area to another position at a different time within the show or event. The beams of light from the automated lights can be moved in specific sequences to enhance the atmosphere of the event being performed or undertaken.
Control systems for the automated lighting fixtures allow the beams of light from the fixtures to be positioned and moved on the stage. The automation provided by the control systems can be utilised for various uses, such as the illumination of specific characters, actors, pieces of scenery, or the use of the beams of light as enhancements to the visual appearance of the entertainment form being undertaken.
The control systems can control the movement of the beams of light. For example, a pair of controlled mirrors can facilitate pan and tilt control by reflecting a beam of light, or a pair of motors can facilitate the pan and tilt of the entire light emitting unit. By sending each lighting fixture a timely stream of pan and tilt data, the beam of light may be manipulated and moved such that the end position upon which the beam of light falls (e.g. on a stage) may be moved dynamically and proportionally to the pan and tilt data being sent to the automated lighting fixture.
Serial data protocols exist that perform the delivery of the pan and tilt data to a series of automated lighting fixtures from a control system, such as a controlling computer or console. Such protocols include 8 bit serial data such as USITT DMX512A, AMX, or pyx, through to embedding higher level protocols that act as a carrier for multiple lower serial data protocols. Such higher level protocols include proprietary protocols such as Art-Net, ACN, Pathport, Shownet, which can use Ethernet as a carrier for DMX. Some automated lighting fixtures exist that can accept Ethernet connections directly. Some can accept DMX512 as the standard communications means.
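As a rough illustration of how such per-fixture pan and tilt data might be packed for transmission, the following Python sketch (not part of the original specification) writes 16-bit pan/tilt values into a DMX512 universe buffer as coarse/fine channel pairs; the channel layout and function names are illustrative assumptions, as fixtures differ.

```python
def set_pan_tilt(universe: bytearray, base_address: int, pan: float, tilt: float) -> None:
    """Write pan and tilt (each 0.0-1.0) into a 512-byte universe buffer,
    assuming the fixture patches pan-coarse, pan-fine, tilt-coarse and
    tilt-fine at its first four channels (a common, not universal, layout)."""
    for offset, value in enumerate((pan, tilt)):
        coarse, fine = divmod(round(value * 0xFFFF), 0x100)
        universe[base_address - 1 + 2 * offset] = coarse   # DMX channels are 1-based
        universe[base_address + 2 * offset] = fine

universe = bytearray(512)                   # one DMX512 universe (512 channels)
set_pan_tilt(universe, base_address=1, pan=0.5, tilt=0.25)
# `universe` would then be framed and sent by the console's DMX driver.
```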
The individual control of an automated lighting fixture can be dependent on a unique address for each fixture. The allocation of the unique address by the controlling console can be known as the "patching" of the automated lighting fixture to the controlling console. Control systems can provide automated means to assign unique addresses to each automated lighting fixture, however the operator may individually assign each address should they desire.
Once an operator has "patched" the automated lighting fixtures, in such a manner that they may be controlled by the console, the operator may move the light beams of the automated lighting fixtures into positions on the performance area. Individual automated lighting fixtures may be assigned to "groups", such that the operator of the console controlling the lighting may be able to apply control conditions to certain selections of automated lighting fixtures at the same time. Such groups may be automatically assigned by the controlling system at the time of patching. Groups may also be assigned by the operator manually.
The individual functions of the automated lighting fixtures may be applied to a group, such that a specific group of automated lighting fixtures may have their colours, positions, or projection symbols changed under a group command, as opposed to an individual command. The selection of the automated lighting fixtures through a group, or through individual control, can be decided by the operator of the controlling system.
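The patch and group concepts can be pictured with a small data structure. This is a hypothetical sketch; the fixture names, addresses and fields are illustrative, not taken from the specification.

```python
# Hypothetical patch table: each fixture is given a unique DMX start address,
# and groups collect fixtures so one command fans out to a whole selection.
patch = {
    "spot_1": {"address": 1,  "channels": 16},
    "spot_2": {"address": 17, "channels": 16},
    "wash_1": {"address": 33, "channels": 12},
}
groups = {"front_spots": ["spot_1", "spot_2"]}

def apply_to_group(group_name, command):
    """Apply one command (e.g. a colour change) to every fixture in a group."""
    for fixture in groups[group_name]:
        command(patch[fixture])

apply_to_group("front_spots",
               lambda fx: print("set colour on fixture at address", fx["address"]))
```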
To record a view of all of the positions and outputs of the automated lighting fixtures, the control system can capture data parameters being sent to all of the automated lighting fixtures and save this to a memory or storage device. This recorded data may be time stamped and can represent the position and beam status of all of the automated lighting fixtures at a particular time.
The stage or performance area may be defined as a rectangular area. This area may be defined using colloquial terms such as Upstage Left, Upstage Right, Downstage Left, and Downstage Right. The range of tilt and pan parameters for each of the automated lighting fixtures can be translated or transformed by the controller into positions on a stage for each beam of each automated lighting fixture.
Areas and items of common interest with regards to the show's production, such as specific stage positions like the drum kit or the lead singer's microphone position, may be highlighted with each beam from each automated lighting fixture. Such positions within the performance area may be stored as a series of sets of pan and tilt information for the automated lighting fixtures within the controlling computer or console memory. Such commonly used beam positions may be referred to colloquially as "Preset Focuses", with each beam's focus being preset to a position on the stage accordingly. Preset Focuses may also be applied to other functions of the automated lighting fixtures, such as colours, gobos, or focus and zoom attributes.
As each automated lighting fixture has its own unique address, detailed in the patch, a recorded sequence of movements (also known as a "chase") for an individual fixture may be applied to groups of automated lighting fixtures, such that multiple chases may be applied to multiple groups of automated lighting fixtures.
An operator of the control system may control an automated lighting fixture. The control system may display a two-dimensional stage area for the operator. The operator may then choose to move a beam of light in relation to the displayed area. The control system may translate a position on the display, using automation algorithms that utilise mathematical functions, to provide pan and tilt parameters to the automated lighting fixture which moves the beam of light to a position on the stage that is relative to the position on the display.
The operator of the control system controlling the automated lighting fixtures may select and control each individual automated lighting fixture, or a group of lighting fixtures, through, for example, a manual button press, interaction with a touch sensitive screen, deployment of a joystick, or the initiation of a rotary control encoder. In such a manner each attribute or parameter of each automated lighting fixture may be controlled.
Such control systems have been in existence for many years, e.g. control systems in US 7,839,391 and US 7,495,871 which can control automated lighting fixtures (e.g. such as those disclosed in US4392187 and US4982887).
The lighting needs of performances are becoming ever more complex and sophisticated. This has led to an increase in the number of automated lighting fixtures deployed for a performance, each fixture being able to output a number of different effects. Groups of lights can be utilised to provide special effects, which can require precise and accurate control of each individual light. Furthermore, the fixtures can be mounted in a number of ways and in an increasingly complex manner. For example, the fixtures can be mounted on trusses that may be orientated at various angles, or hung on wires or fixed on moving platforms. To further complicate matters, performances may be moved to different venues that may have differing physical stage and building dimensions, and so different rigging for the fixtures may be required from venue to venue. Due to these factors, the set up and calibration of each of the automated lighting fixtures can be very time consuming. Also, due to the sheer number of lights and their variable parameters (e.g. beam position, colour, size, etc) there may be instances where, due to an operator error or the settings of a stored lighting programme, light that is unsuitable may have been utilised during a live performance. There is therefore a need to provide quicker and simpler control for an operator of automated lighting fixtures.
According to a first aspect of the present disclosure there is provided apparatus for controlling a plurality of lighting fixtures comprising: a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of lighting fixtures; a data store configured to store one or more predetermined control sequences of the plurality of lighting fixtures and a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the apparatus being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
Suitably, the apparatus can further provide an edit mode in which the apparatus is configured to edit the one or more predetermined control sequences of the plurality of lighting fixtures in response to modification of the state of one or more user inputs by means of the user control interface.
Suitably, the apparatus can be switchable between the playback mode and the edit mode. Suitably, on switching from the playback mode to the edit mode at a particular time-stamp of the audio/visual media, the apparatus can be configured to edit the data in the time-series dataset for that particular time-stamp.
Suitably, the time-series dataset can comprise one or more parameters of each user input. Suitably, said one or more parameters can be associated with the pan and/or tilt of a lighting fixture. Suitably, said one or more parameters can be associated with the light output by a lighting fixture. Suitably, the user control interface can be configured to, in response to a change in state caused by a user input, change one or more said parameters in relation to said change in state.
Suitably, the apparatus can comprise one or more inputs for receiving time-stamped video and/or audio signals.
Suitably, the apparatus can comprise a camera for recording time-stamped video.
Suitably, the apparatus can comprise an audio input for receiving a time-stamped audio signal.
Suitably, the time-stamped audio and/or video signals can be stored at the data store.
Suitably, the apparatus can comprise one or more inputs for cameras. Suitably, the apparatus can comprise one or more inputs for microphones. Suitably, the data from said inputs can be stored at the data store.
Suitably, the data store can be configured to store a representation of said time-stamped audio and/or video signals and/or said data from the inputs.
Suitably, the apparatus can comprise a display unit configured to display a virtual representation of the output from the lighting fixtures corresponding to said state of the user inputs. Suitably, the display unit can be configured to, when the apparatus is in the edit mode, display a virtual representation of the output from the lighting fixtures corresponding to the modified state of the user inputs.
Suitably, the user control interface can comprise a touch screen configured to display at least some of the one or more said user inputs.
Suitably, the audio of the said presented audio/visual media can be a synthetic abstract of a live audio recording. Suitably, said synthetic abstract may be a modified version of the live audio recording.
Suitably, the video of said presented audio/visual media may be a synthetic abstract of a live video recording.
According to a second aspect of the present disclosure there is provided a method of reviewing control sequences defined at a controller for a plurality of lighting fixtures, the controller comprising a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of lighting fixtures, the method comprising the steps of: causing the controller to play out a predetermined set of lighting control sequences during a show; storing a time-series dataset determining the state of the user inputs over time during the show; and subsequently: presenting time-stamped audio/visual media of the show; and synchronising the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
According to a third aspect of the present disclosure there is provided apparatus for controlling a plurality of output devices comprising: a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of output devices; a data store configured to store a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the apparatus being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
According to a fourth aspect of the present disclosure there is provided apparatus for controlling a plurality of lighting fixtures with known positions relative to one another and capable of providing light to a stage area, the apparatus comprising: a memory configured to store a set of pan and/or tilt parameters for each of the plurality of lights, the plurality of lighting fixtures comprising a first lighting fixture; a controller configured to control panning and/or tilting of the plurality of lighting fixtures in dependence on the stored set of pan and/or tilt parameters, and to scale said sets of pan and/or tilt parameters for the stage area by: determining first pan and/or tilt parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a first reference point on the stage area; determining coordinates of the first reference point under a coordinate system defined for the stage area; and transforming said sets of pan and/or tilt parameters for the plurality of lighting fixtures in dependence on their said known relative positions, said first pan and/or tilt parameters, the coordinates of the first reference point and a height of the first lighting fixture above the stage area.
Suitably, said stored set of pan and/or tilt parameters can be defined for a reference stage area whose size may differ from said stage area provided with light by said lighting fixtures.
Suitably, the controller can be configured to pan and/or tilt each of the plurality of lighting fixtures in dependence on said scaling.
Suitably, the controller can be further configured to automatically pan and/or tilt the first lighting fixture in a predetermined pattern so as to scan for a light detector located at the first reference point, the apparatus being configured to receive a signal from the light detector.
Suitably, the controller can be configured to determine second pan and/or tilt parameters, the second parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a second reference point on the stage area.
Suitably, said scaling can be further dependent on second pan and/or tilt parameters and determining coordinates of the second reference point under the coordinate system.
Suitably, the first and second reference points can be located towards the extremities of the stage area.
Suitably, the apparatus can comprise a display and be configured to, in dependence on said scaled sets of pan and/or tilt parameters, form a virtual representation of the lighting fixtures and the stage area.
Suitably, said sets of one or more pan and/or tilt parameters can define one or more predetermined control sequences of the plurality of lighting fixtures.
Suitably, the predetermined control sequences can define one or more of pan, tilt, mode, colour or intensity for each of the plurality of lighting fixtures.
According to a fifth aspect of the present disclosure there is provided a method of scaling a set of pan and/or tilt parameters for each of a plurality of lighting fixtures with known positions relative to one another and capable of providing light to a stage area, the plurality of lighting fixtures comprising a first lighting fixture, the method comprising: determining first pan and/or tilt parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a first reference point on the stage area; defining coordinates of the first reference point under a coordinate system defined for the stage area; and transforming said sets of pan and/or tilt parameters for the plurality of lighting fixtures in dependence on their said known relative positions, said first pan and/or tilt parameters, the coordinates of the first reference point and a height of the first lighting fixture above the stage area.
The present disclosure will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 shows a console and various devices.
Figure 2 shows an example of synchronisation with various devices.
Figure 3 shows an example of a calibration process.
Figure 4 shows an example display on a touch screen for a console.
Figure 5 shows an example of a stage setup.
Figure 6 shows an example of a stage.
Figure 7 shows an example of a touch screen display for controlling fixtures.
Figure 8 shows an example of a hand movement on a touch screen display to control fixtures.
The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art.
The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Visual Sync
As mentioned above, lighting of live shows is becoming increasingly complex and requires accurate and precise control and timing of the automated lighting fixtures. During a show, an operator may use a control panel on a lighting console to control the automated lighting fixtures. The control panel can allow a number of different parameters (e.g. position, colour, intensity, etc) of each light to be determined and controlled.
The control panel may comprise a number of different means for receiving a user input to provide control of each light fixture. For example, the control panel may have user input means such as buttons, dials, joysticks, sliders, or a touch screen that enables the control of the lighting fixtures. A user may interact with the control panel to physically press, rotate, move, tap, etc. the appropriate user input means to change the state of that input means. For example, a user may press a button to change its state from "off" to "on", or the user may slide a fader to change its state from one level to another, the amount of slide being proportional to an intensity level, for example. Another example may be that the state of a display output to a touch screen may be changed by the operator pressing or dragging an icon.
During a show, the state of each user input over time may be recorded. The control panel can log the state of each user input and/or any changes in state caused by the operator. The logged data can be stored along with a time so that the state of each of the user inputs at a particular time can be known. The logged data can be stored as time-series data, which can comprise values of parameters that change over time (e.g. the position of a fader over time). The time-series data may be stored in a data store (e.g. internal or external hard disk, internal non-volatile memory, removable non-volatile memory card, removable optical media, etc).
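A minimal sketch of such logging follows; it is illustrative only, and the JSON-lines storage format and control names are assumptions, not the specification's.

```python
import json
import time

class InputLogger:
    """Append-only log of (time, control, state) records: one way to build
    the time-series dataset described above."""
    def __init__(self, path):
        self.file = open(path, "a")
        self.t0 = time.monotonic()          # show-relative clock origin

    def log(self, control, state):
        record = {"t": round(time.monotonic() - self.t0, 3),
                  "control": control, "state": state}
        self.file.write(json.dumps(record) + "\n")
        self.file.flush()                   # keep the log safe mid-show

logger = InputLogger("show_inputs.jsonl")
logger.log("fader_3", 0.72)        # operator moved fader 3 to 72 %
logger.log("button_red", "on")     # colour button pressed
```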
Also during the show, one or more cameras may record audio/visuals of the onstage performance. The audio/visual media (which may be audio or visual) may be time-stamped during the recording. The camera may be a stand-alone camera, or built-in to the lighting console, or attached to the lighting console as a peripheral device via an input port. The audio/visual media may be recorded on a storage device in the camera or, if internal or attached to the console, on a storage device internal or attached to the console. The visual media may be a video or still images.
After the show, a user may wish to review the lighting and/or other aspects of the show. The console may comprise a display and/or an audio/visual output port for an external display that allows it to present the recorded show. At the same time, the console can cause the control panel to adopt the configuration it was in during the point in time that is being displayed. The console can synchronise according to time the time-stamped audio/visual media with the time-series data that logged the state of each of the user inputs so that the state of the user inputs can be played back along with the audio/visual media. In this playback mode, the user inputs will adopt the states that they were in at the point in time shown on the display.
The console can analyse the time-series data to determine the state of the user inputs at a particular time. Based on this analysis, the console can then cause the user inputs to adopt that state. For example, the console may cause a small motor to move a fader on the control panel when the time-series data indicates that that fader was used to change the intensity of a light. In another example, based on the time-series data, the console may cause a touch screen display to display the dragging of an icon that was used to pan and tilt a light. Thus, the control panel can replay all the different actions that were carried out by the user during the show. The control panel may also adopt any state caused by the running of any pre-programmed sequences. Thus, a logged user input to start a pre-programmed sequence can cause the console to "playback" the pre-programmed sequence on the control panel.
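One way to derive the control-surface state for any media timestamp is a simple scan over the ordered records, as in this sketch; `panel` and `media_player` are hypothetical interfaces standing in for the console's own.

```python
def states_at(records, t):
    """Return the most recent logged state of every control at media time t.
    `records` is the time-series dataset from above, sorted by record["t"]."""
    states = {}
    for record in records:
        if record["t"] > t:
            break
        states[record["control"]] = record["state"]
    return states

# During playback the console would poll the media clock and drive the
# control surface (motorised faders, screen icons) to match, e.g.:
#   panel.apply(states_at(records, media_player.position()))
```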
This allows the user to review the lighting of the show along with the actions carried out by the console operator. By synchronising the state of the console with the audio/visual media, a user is able to spot any errors made by the operator or identify any potential improvements.
For example, if the displayed audio/visual media shows that, at some point during the show, a lighting fixture erroneously provided a red light instead of a blue light, the user is able to identify the source of that error by looking at the state of the console. If the error was caused by the operator, the synchronised console will show a change in the state of a button, for example, that controls the colour of that light. If there is no change in the state of that button, then the user has identified that the cause for the error is not due to the person operating the console during the show. The error may be, for example, in a pre-programmed lighting sequence stored on the console. The user may then switch the console to an edit mode so that he is able to edit that lighting program using the control panel so that the correct colour is used when the lighting program is next run. Thus, in the edit mode, the user can edit pre-programmed lighting sequence data stored on the console (which may be stored separately, or in a different file or location from the user input state data).
The user can visually identify any point in the recorded time on the display and then, via the console, access and edit any field related to the lighting control at that identified point. This enables the user to modify, edit or improve any pre-programmed aspects of the lighting control on the console whilst having a real-time visual reference.
As the state of the console is synchronised in time with the presentation of the audio/visual media, changing a time-related aspect of the playback of the show (e.g. fast-forwarding, rewinding, pausing) can cause the console to also change according to that aspect. For example, pausing the playback of the display causes the console to pause the "playback" of the state of the control panel at the same time.
The audio/visual media presented by the console may be a synthesised version of the live recording. For example, due to copyright restrictions, a performer may not agree to the recording of the music. Thus the console is able to synthesise a version of the recorded music that is a modified version of (i.e. different from) the actual live music, and play back the synthesised version instead. Aspects of the music, such as the rhythm, melody, pitch and/or mood may still be identifiable in the synthesised version. This can allow the user to review aspects of the lighting, which may be related to aspects of the music, without violating any of the performer's rights or preferences. Similarly, the console may also cause the video playback to be a synthesised version of the live video recording.
The console may be configured such that only aspects of the live performance are recorded and stored. For example, the console may record a synthesis of the audio input from a microphone (which may be internal or external to the console). Thus the audio of the live performance is not recorded, but a synthesised version of the audio is recorded. The audio input may be synthesised or modified in real-time and then the synthesised audio is stored in the data store. Similarly, the console may record a visual synthesis of the live performance. For example, the console may record a synthesis of the video or image input from a camera (which may be internal or external to the console). The video input may be synthesised or modified in real-time and then the synthesised video is stored in the data store. The video synthesis may involve a distortion or manipulation of the video input, for example, by altering the colour, scaling, warping or any other effect. The synthesised video can then be used to visualise, for example, the positioning of artists or objects on a stage and any associated lighting. Playback of the stored synthesis would then help a lighting operator or designer to design the lighting in the event of recording the performance without lights (e.g. during rehearsal). Playing back a synthesised version of a live performance with lights can help the lighting operator to assess the lighting of the show, e.g., to see when the wrong space was illuminated.
The stored synthesised audio and/or video data can also be used to produce a virtual show on a virtual system, which can allow the lighting operator to efficiently design and visualise various lighting options and effects for a given performance. For example, the synthesised visuals can be blended with a visualised simulation of various possible lighting effects to provide a virtual lighting production rehearsal of a show.
The console is capable of controlling various devices and synchronising time-stamped data from those devices, as depicted in Figure 1. The console 11 can communicate with and synchronise data from a camera 12 (which may be integrated into the console or external to the console), an audio device 13 such as a microphone or audio recorder, a media server 14 (which, for example, can provide media to on-stage screens), a wireless remote device 15 (which can remotely control certain functions of the console control panel) and an outboard controller 18 for any static or motion devices.
Figure 2 shows an example of the synchronisation of the time-series data for user input on the console (A) with a time-stamped video from a camera (B), time-stamped audio from an audio device (C) and time-stamped on-stage media from a media server (D). The console can synchronise data from A-D to allow the user to determine the state of the various aspects of the show at a particular time. For example, at point E in time shown in the figure, the user is able to determine the state of the various user inputs (e.g. the fader, wheel, button) on the control panel, the lighting of the show as a result of those states, the audio and the on-stage media. Thus the user is able to review various aspects of the show and how the lighting affects those aspects.
The console can maintain synchronisation of the various sources through a closed feedback loop. The synchronisation enables a user to visualise the effects of the lighting control system with the movement of the plurality of automated lighting fixtures. By synchronising the individual commands input by the user and executed by the lighting control system with the presented audio/video media, it is possible to gain a visual understanding of the performance within which the plurality of automated lighting fixtures are being employed at a specific point in the running time of the production. Such a linkage of each button press on the lighting control panel by the user with the underlying computations being effected allows for a system that can afford the user the ability to review the video of the performance and to make offline edits to specific lighting control system commands to the plurality of automated lighting fixtures.
Furthermore it may be possible for the user to identify specific sequences and effects of the plurality of automated lighting fixtures within the performance, and to assign an overview name for these effects and sequences, and thus be able to recall these effects and sequences by a colloquial name for insertion or deletion from any point in the performance.
Furthermore it will be possible for the user to video record a rehearsal of the production or performance with no lighting and then to overlay the effect of the plurality of automated lighting fixtures, through techniques such as alpha blending, morphing etc. In such a way the user of the lighting control system will be able to judge the suitability of the performance lighting in an offline mode, utilising the rehearsal video footage as the background for their comparison.
A plurality of cameras may be attached to the console to provide recordings of a show from a plurality of angles. The video information can then be used to visually reference specific automated lighting fixtures' actions and the behaviour of other media (e.g. from the media server) and their associated performance at any point in the recording. Thus the console can provide a single point of reference for all visual, audio, and mechanical elements of the performance as reviewed by the cameras that are deployed to observe the stage or performance area. By recording every button press on the console at every chronological point throughout the show or event, the video being displayed is augmented by providing the exact tracked list of controlling functions that have been deployed by the user of the console during the show.
The console allows offline visual editing, such that the user may return to a specific point in the proceedings of the event or show and be able to switch the console into the exact functional settings that the control panel was in at the previously recorded point in the event or show. Playback of the recorded data for the user input at the control panel allows the console to provide an output to the attached plurality of automated lighting fixtures. By using the visual synchronisation, this data can be corrected offline or changed to adjust any specific parameters of the plurality of automated lighting fixtures and then run again as part of a live show.
The user may capture the effects of the beams of the plurality of automated lighting fixtures being executed by the console at a point in the chronological history of the audio/visual media for a show or event and then recreate the effects of the beams of the plurality of automated lighting fixtures. The captured effects may be given a moniker or name, and then this look or effect may be applied to a new show or event, with the automated lighting control system being able to capture and apply the mathematical function matrix that was applied to the plurality of moving lights, and to hide this complexity through the association of a moniker or name.
The user may overlay two different audio/visual media, such that the user may be able to see the differences between the old and the new lighting output and its effects upon the stage or performance area for two recorded shows. The user can record a video of the performance or event with one set of effects lighting present and then be able to implement a different set of lighting effects and to view the two as an overlay or side by side.
The above describes a lighting control console that is configured to control a plurality of lighting fixtures and synchronise audio/visual media with time-series user input data that represents the state of user inputs for that console over time. In a similar manner, other types of consoles that can control output devices (such as speakers, displays, winches, etc) may employ similar synchronisation of audio/visual media with user input data. For example, the console may be an audio console for controlling a plurality of speakers and can comprise: a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of speakers; a data store configured to store a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the audio console being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media. Such an audio console can enable an operator to review the sound at a concert, for example. In another example, the console may be a video or media server console for controlling one or more displays, or a stage automation console for controlling stage objects (e.g. controlling electrical, hydraulic or pneumatic power to move stage machinery, stage elements, performer flyers or scenery, which would have been traditionally operated by stagehands and flymen).
Live track
As mentioned above, a number of lighting fixtures can be utilised in a show, and that number is ever increasing. In some cases, for example when a show has moved to a new venue, each light may need to be re-calibrated. Due to the large number of lights, this can involve a time-consuming process of an operator using a lighting control console to move each light to a point on the stage (a focus point) and adjusting parameters (such as focus, zoom, colour temperature, intensity, etc) for each light to calibrate the lighting fixtures.
When the lighting fixtures are mounted on to the lighting fixture rigging, the positions of each of the fixtures may be input into the console, such that the position of each light relative to all of the other lights can be determined. Such data may be readily available or determined, for example the data can be determined from fixed mounting positions on trusses, or from the positions of trusses relative to one another, or from locating fixtures/trusses via GPS. Thus the position of each light relative to one another can be known to the console. This fixture position data may be stored in a memory of the console. If a fixture position is changed, the change in position can be input into the console, which can then recalculate the position of the fixture relative to the other fixtures.
When calibrating parameters of the lighting fixtures, the user may select any one of the lighting fixtures and move that selected fixture to a focus point. The focus point may be a point on the stage that is randomly selected by the user or a fixed point on the stage or performance area (e.g. a fixed point on the stage boundary such as bottom-stage-left). The selected lighting fixture may be panned and/or tilted via user input controls (such as a joystick) on the console such that light from the selected fixture is incident or focused on the focus point. Alternatively, the console may be configured to automatically search for the focus point by panning and/or tilting the selected fixture such that the light from the selected fixture is scanned across the performance area until the light is focused on a light detector that is placed at the focus point. The console may be configured to (wirelessly or via a wired connection) communicate with the light detector and, when the console receives a reading from the detector that is above a threshold value, the console determines that the light from the fixture is focused on the focus point. When the light is incident on the focus point, the user may then calibrate the light beam.
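The automatic search could be structured as a raster scan, as sketched below. This is an illustrative sketch only: `fixture` and `detector` stand in for real device interfaces, and the 540/270 degree ranges are typical moving-head values, not figures from the specification.

```python
def scan_for_focus_point(fixture, detector, threshold, step=1.0):
    """Raster-scan the beam across the fixture's pan/tilt range until the
    light detector placed at the focus point reads above `threshold`."""
    pan = 0.0
    while pan <= 540.0:
        tilt = 0.0
        while tilt <= 270.0:
            fixture.move_to(pan, tilt)
            if detector.read() > threshold:
                return pan, tilt          # beam is focused on the detector
            tilt += step
        pan += step
    return None                           # detector never triggered
```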
When light from the selected fixture is focused on the focus point, the pan and/or tilt data or parameters used to manoeuvre the fixture to that position can be determined. Using the determined pan and/or tilt parameters, a height of the selected fixture relative to the stage and the position data, the console can calculate the amount of panning and/or tilting that is required to move the other lighting fixtures such that light from them is also focused on the focus point. The console may then use the calculated pan and/or tilt parameters to automatically position the fixtures so that the light from each fixture is focused on the focus point. A user can select one or more fixtures and the console can automatically pan and/or tilt the light from those selected fixtures to the focus point without any other input (e.g. from the user or light detector). The user can then calibrate light from each of the other fixtures without having to manually manoeuvre each fixture to the focus point or perform the scanning process, which can be time-consuming.
Figure 3 shows a simplified example of the calibration process. A first fixture (fixture A), which can be any fixture that is controlled by a controller of the console, can be selected by the user to be panned and/or tilted to downstage-right (D/S/R) focus point D using the user input control or scanning method described above. When light from fixture A is at the focus point D, the console can determine the amount of pan and/or tilt that is required for fixture A to get to that position from a zero position. The zero position may be, for example, a known pan and/or tilt value for the fixture to direct light downwards towards the stage along a vertical axis (as indicated by the dotted line in figure 3). The console determines the pan and/or tilt parameters for fixture A to direct its light from the zero position to the focus point. The console has position data that comprises the positions of fixtures B and C relative to fixture A and the height of fixture A relative to the stage. Using the pan and/or tilt parameters for fixture A and the position data, pan and/or tilt values can be determined for fixtures B and C so that the console can pan and/or tilt the fixtures such that light from those fixtures is focused on the focus point. As mentioned above, this enables the user to calibrate each of the fixtures at the focus point in an efficient manner.
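The geometry of Figure 3 reduces to simple trigonometry under some simplifying assumptions: a flat stage, tilt measured from the vertical zero position and a common rig height. The sketch below (all names and numbers are illustrative, not from the specification) recovers the focus point from fixture A's measured pan/tilt and then computes the pan/tilt that aims B and C at it.

```python
import math

def aim_parameters(fixture_pos, target, height):
    """(pan, tilt) in degrees for a fixture `height` metres above the stage
    at plan position `fixture_pos`, aiming its beam at stage point `target`.
    Tilt is measured from the straight-down zero position."""
    dx, dy = target[0] - fixture_pos[0], target[1] - fixture_pos[1]
    return (math.degrees(math.atan2(dy, dx)),
            math.degrees(math.atan2(math.hypot(dx, dy), height)))

def aim_point(fixture_pos, pan_deg, tilt_deg, height):
    """Invert aim_parameters: the stage point the fixture's beam lands on."""
    r = height * math.tan(math.radians(tilt_deg))
    return (fixture_pos[0] + r * math.cos(math.radians(pan_deg)),
            fixture_pos[1] + r * math.sin(math.radians(pan_deg)))

A, h = (0.0, 0.0), 6.0                 # fixture A's plan position, rig height (m)
pan_a, tilt_a = 45.0, 40.0             # measured when A's beam hits point D
D = aim_point(A, pan_a, tilt_a, h)     # recover the focus point's coordinates
for pos in [(2.0, 0.0), (4.0, 0.0)]:   # known positions of fixtures B and C
    print(aim_parameters(pos, D, h))   # pan/tilt that puts B and C on D
```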
Each fixture may have a set of stored pan and/or tilt parameters that allow it to move to certain preset positions on a stage or move in a predetermined and sequenced manner (e.g. as part of a "chase"). This sequence of movements may be predetermined for a particular show and the movements may be in relation to certain stage positions (e.g. D/S/L, centre stage, or any other position on the stage that the position of the light beam can be in relation to). However, the stage or the position of the stage in relation to one or more of the fixtures may change (e.g. when moving a show to another venue). Thus a new set of pan and/or tilt parameters for each fixture would have to be determined and stored at the console. This can be a time-consuming and laborious process for the lighting operator.
The user of the console may direct light from a fixture (e.g. the first fixture mentioned above) to a certain stage position (e.g. D/S/R) using the user input controls on the control panel of the console or the scanning method mentioned above (by placing the light detector at that certain stage position). The console can then determine the pan and/or tilt parameters for the first fixture which represent the pan and/or tilt of the first fixture when light therefrom is focused on that certain stage position. Using the determined pan and/or tilt parameters, the certain stage position, the fixture positional data mentioned above and the height of the first fixture, the console is capable of transforming the stored set of pan and/or tilt parameters for the other fixtures. Using the transformed set of parameters, the console is capable of panning and/or tilting the fixtures to the preset positions and in the sequenced manner for the new venue or stage positioning. Thus each of the stored sets of pan and/or tilt parameters is scaled for the new venue without having to define a new set of pan and/or tilt parameters.
The positions on the stage may be defined by a coordinate system, e.g. GPS coordinates or simple x,y coordinates for a rectangular stage, which may have an origin at downstage-left (D/S/L), for example. The stored set of pan and/or tilt parameters may have been determined for a first stage (e.g. a reference stage) of a first size and when the show is moved to a second stage of a second, different size, the console can transform the stored parameters such that the amount of panning and/or tilting is scaled for the second stage.
The following describes an exemplary process for scaling the stored sets of pan and/or tilt parameters for each of the lighting fixtures for a new stage (a short worked sketch follows the list):
- Panning and/or tilting a first lighting fixture so that light from it is focused on a reference point on the stage. The pan and/or tilt parameters required to place the focused light on the reference point can be relative to a known pan and/or tilt orientation for each of the other lighting fixtures.
- The stored sets of pan and/or tilt parameters may be in relation to a coordinate system. The coordinates (under the coordinate system) of the reference point are determined. The reference position may be a fixed position on the stage such that its coordinates are known. E.g. the reference position may be fixed to D/S/L, which may translate to (0,0) coordinates under an x,y coordinate system. Alternatively, the reference position may not be fixed and the user can position the beam of light on any point on the stage, and then input on the console the coordinates of the light on the stage (e.g. via a touch screen showing a representation of the stage).
- The console can then transform the sets of pan and/or tilt parameters utilising the pan and/or tilt parameters, the height of the first fixture relative to the stage, the coordinates of the reference point, the position of each of the other fixtures relative to the first fixture and known trigonometric principles, so as to scale the sets of pan and/or tilt parameters for the new stage.
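A minimal sketch of such a transform is given below, under strong simplifying assumptions (flat stage, common rig height, tilt from vertical, stage origin at a corner such as D/S/L, and fixture plan positions unchanged between venues); every name and number is illustrative.

```python
import math

def scale_parameter_sets(stored, fixture_positions, height, ref_size, new_size):
    """Rescale stored (pan, tilt) pairs, defined for a reference stage of plan
    size `ref_size` (width, depth), onto a new stage of size `new_size`.
    Each stored aim point is recovered on the reference stage, mapped to the
    proportionally equivalent point on the new stage, then converted back."""
    sx, sy = new_size[0] / ref_size[0], new_size[1] / ref_size[1]
    scaled = []
    for (pan, tilt), (fx, fy) in zip(stored, fixture_positions):
        r = height * math.tan(math.radians(tilt))
        x = fx + r * math.cos(math.radians(pan))      # aim point, reference stage
        y = fy + r * math.sin(math.radians(pan))
        dx, dy = x * sx - fx, y * sy - fy             # same relative point, new stage
        scaled.append((math.degrees(math.atan2(dy, dx)),
                       math.degrees(math.atan2(math.hypot(dx, dy), height))))
    return scaled

# A 10m x 8m reference stage becoming a 14m x 8m stage:
print(scale_parameter_sets([(45.0, 40.0)], [(0.0, 0.0)], 6.0, (10, 8), (14, 8)))
```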
The accuracy of the transformation of the stored set of pan and/or tilt parameters for each fixture for a new venue can be enhanced by determining second pan and/or tilt parameters representing the pan and/or tilt of a selected lighting fixture when light therefrom is focused on a second stage position. By additionally using the determined second pan and/or tilt parameters and the second stage position in the transformation mentioned above (with the first stage position), the transformed set of pan and/or tilt parameters can provide more accurate scaling for the new stage. Similarly, further stage positions (e.g. a third and/or fourth position) may be used to enhance the scaling or may be required for asymmetrically shaped stage areas.
A second stage position can also help in defining the stage area. For example, for a rectangular stage, if the pan and/or tilt parameters for downstage-left (D/S/L) and upstage-right (U/S/R) are known (and/or for the opposing D/S/R and U/S/L corners of the rectangular stage), the stage area can be defined. This can be used to further enhance the scaling of the stored set of pan and/or tilt parameters for the new stage.
By determining the pan and/or tilt parameters for a stage area (e.g. using focus point positions D/S/L and U/S/R for a rectangular stage) and the height of the first fixture, the console is capable of determining the three-dimensional coordinates of that fixture in relation to the stage. The console is then capable of using the coordinates to produce a reference icon on a display (e.g. a touch screen on the console). This can act as a visual reference and confirmation that the console has registered the coordinates of the fixture. A user may then be able to select a fixture or group of fixtures and select a point on screen upon which light from those fixtures should be focused. For example, in Figure 4, the user may select group "A" fixtures to direct light to the "drums position". The console is then capable of scaling and translating the selected 2D on-screen coordinates to the actual 3D coordinates on stage and then translating those 3D stage coordinates into pan and/or tilt parameters for each fixture in group "A".
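The 2D-to-3D translation might look like the following sketch, which maps a touch position to stage coordinates and would then feed the pan/tilt geometry sketched earlier; the screen and stage dimensions are illustrative assumptions.

```python
def screen_to_stage(touch_xy, screen_size, stage_size):
    """Map a touch position on the on-screen stage plan to stage coordinates,
    assuming the plan fills the screen and shares its orientation."""
    return (touch_xy[0] / screen_size[0] * stage_size[0],
            touch_xy[1] / screen_size[1] * stage_size[1])

# A touch at (512, 300) on a 1024x600 display of a 12m x 10m stage:
target = screen_to_stage((512, 300), (1024, 600), (12.0, 10.0))
# ...then compute pan/tilt for every fixture in group "A" so that each
# beam is focused on `target`, as in the earlier geometry sketch.
```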
The icons "B" shown in Figure 4 may have coordinates that reference their position in 3D space. Any amount of fixtures can then be locked to a position icon. That icon can be moved and repositioned within the 3D space. If the icon is moved (e.g. because the object represented by that icon has moved on stage) then the console can recalculate the change in pan and or/tilt levels needed to keep the fixtures locked to the new icon position,
The console may comprise a screen that is touch sensitive and can support haptic interaction through gesticulation recognition, such that a plurality of automated lighting fixtures may be controlled and accessed through hand movements upon the touch screen interface. Furthermore, the user's operational viewing angle may also be switched through the manipulation of the touch sensitive user interface to allow for the manipulation of the plurality of beams of light in the three dimensional real world through gesticulation and interaction with the two dimensional touch sensitive user interface.
Via the user inputs at the console (such as the touch screen), the user may place beams of light from selected automated lighting fixtures into a tracking mode. In the tracking mode, the controller automatically calculates the geometry and the pan and/or tilt parameters for each automated lighting fixture such that all the beams converge on one point in three dimensional space. This convergence point may be controlled by the user of the console. Furthermore, the user may define the edges of the performance stage area for the console, and utilise the convergence points of the plurality of automated lighting fixtures to define a set of geometric coordinates that each of the plurality of automated lighting fixtures may be bounded to operate within, the performance space being determined by the user of the console on the touch sensitive user interface. For example, a rectangular boundary may be defined by providing two convergence points at opposing corners of the rectangle. The user may also apply rules and bounds to the plurality of automated lighting fixtures that can dictate policy to each of the plurality of automated lighting fixtures should a boundary limit be exceeded. For example, when the automated lighting fixture is positioned such that a beam of light from it is directed outside of the boundary area, then the console may detect such positioning and automatically dim the light.
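Such a boundary policy could be enforced with a per-frame check of each beam's landing point against the user-defined rectangle, as in this sketch; the device API is a hypothetical stand-in.

```python
def enforce_boundary(fixture, beam_point, boundary):
    """Dim a fixture whose beam lands outside the permitted stage area.
    `boundary` is ((x_min, y_min), (x_max, y_max)) in stage coordinates."""
    (x_min, y_min), (x_max, y_max) = boundary
    x, y = beam_point
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    fixture.set_intensity(fixture.programmed_intensity if inside else 0.0)
```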
The console may receive external reference data to establish where the performance area may be located. Such external data may include GPS co-ordinates, etc, established on an external device.
Via the console, a user may define a space within the three dimensional performance area, as represented on the touch sensitive user interface, where no automated lighting fixture may place its beam. This ensures that a desired area on stage (that is represented by the defined space) is kept dark and unlit by the beams of the plurality of automated lighting fixtures.
The console can calculate the physical height, or distance of the plurality of automated lighting fixtures from a known point within the performance space using a mathematical function that can be applied to either or both the focus and zoom attributes of the automated lighting fixtures. When a projection symbol is seen to be in focus on the floor, or the user's target surface within the three dimensional performance space, a mathematical function is employed to calculate the height of the automated lighting fixture that emits the projection symbol.
The console may interact with a third party hand held device which comprises a camera or light meter, such as a mobile telephone or smart phone. Data produced by such a third party device may form a feedback loop of timely data, which can be received by the console, that represents a state of lighting at the third party device's location within the three dimensional performance space. The console can cause a beam from each of the plurality of automated lighting fixtures to pass over the three dimensional performance area in a prescribed manner, perhaps sweeping the stage in an "S" pattern, from left to right across the performance area and moving up the performance area towards the back. When a beam passes over the top of the third party device, the control system can identify which beam is passing over the third party device at what point in time, thus allowing the console to work out the extent of light from each beam (e.g. its intensity) at the point in the three dimensional performance space where the third party device is located, and to what extent and effect each beam is having on the overall light output at this point in space. The console may receive from the third party device chrominance and luminance data for each beam of light being emitted from an automated lighting fixture. The console may change the colour temperature controls for any suitably equipped automated lighting fixture, so that the light from the plurality of automated lighting fixtures in the performance area emits a selected colour temperature. For example, a certain colour temperature for a fixture may be selected and the third party device may detect the colour temperature of the light from that fixture, which may be different to the selected colour temperature (perhaps due to a calibration error or LED colour shifting over time). The console can then adjust its output instructions to that fixture so that the colour temperature of the light from the fixture matches the selected colour temperature. The console may receive back from the third party device chrominance and luminance data with regards to the colour of an object (such as a stage costume, stage curtain, etc). This colour or texture data may be applied to adjust the light output from the plurality of automated lighting fixtures.
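One step of such a colour temperature feedback loop might look like this sketch; the meter and fixture interfaces, gain, and tolerance values are assumptions for illustration, not taken from the specification.

```python
def correct_colour_temperature(fixture, meter, target_kelvin,
                               gain=0.5, tolerance=50.0):
    """Nudge the fixture's colour temperature control toward the selected
    target, using the reading reported back by the hand-held device."""
    measured = meter.read_colour_temperature()        # kelvin, at the device
    error = target_kelvin - measured
    if abs(error) > tolerance:                        # outside calibration band
        fixture.set_colour_temperature(
            fixture.colour_temperature + gain * error)
```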
The third party device may be positioned at specific locations within the three dimensional performance space so that the boundaries within which the beams of the plurality of automated lighting fixtures may operate are defined and established by the location, or a series of locations, of the third party device.
A video capture device or camera may be positioned to observe the three dimensional performance area and communicate with the console to provide tracking of a specific actor, or of a motion object, by one or more automated lighting fixture beams. The console may receive a stream of timely data via a radio frequency interface from third party devices such as an inertial measurement unit, accelerometer, laser rangefinder, or electro-optical or infra-red tracking camera, enabling the console to determine the movement of the specified actor, performer, or motion object.
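Position fixes arriving over such a radio link are typically noisy, so the console would smooth them before converting each fix into pan/tilt instructions. A minimal sketch using exponential smoothing (the filter constant alpha is an illustrative assumption):

```python
def smooth_positions(position_stream, alpha=0.3):
    """Exponentially smooth a stream of (x, y, z) tracker fixes so the
    beams follow the performer without jitter."""
    estimate = None
    for fix in position_stream:
        if estimate is None:
            estimate = fix                  # seed with the first fix
        else:
            estimate = tuple(alpha * new + (1.0 - alpha) * old
                             for new, old in zip(fix, estimate))
        yield estimate

# Example: three noisy fixes settle towards the performer's true path.
fixes = [(1.0, 2.0, 0.0), (1.4, 2.1, 0.0), (1.1, 1.9, 0.0)]
smoothed = list(smooth_positions(fixes))
```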
Furthermore, the console may provide the user the ability to define which of the plurality of automated lighting fixtures may track which actors, performers, or motion objects, so that different and separate convergences of beams can be used to track different objects. Furthermore, the console may, through the use of the video capture device, allow the user to assign various automated lighting fixtures to follow various motion objects, actors, or performers depending on the colour of the clothing, or the colour of the finish of the object or actor. Furthermore, the console may define a boundary limit upon the plurality of automated lighting fixtures using a strip of reflective tape placed on a surface within the three dimensional performance space. When a beam is seen, via the third party video capture device, to traverse the reflective tape strip, the boundary can be imposed (e.g. such that light is not incident outside the boundary).
The camera can be used to identify a specific actor or motion object, and the console may instruct a plurality of automated lighting fixtures to illuminate the specified actor, performer, or motion object, such that the colour balance of the illuminated actor, performer, or motion object is immediately adjusted to suit the video camera device that is providing the video feed for review. The console may geometrically calculate the locations of the video camera source, of the actor, performer, or motion object upon the three dimensional performance area, and of each of the plurality of automated lighting fixtures within three dimensional space, and direct operations such that the plurality of automated lighting fixtures illuminates the correct actor, performer, or motion object.
Figure 5 shows an example of a stage with a plurality of automated lighting fixtures. A stage (1) can utilise a plurality of automated lighting fixtures (2,3,4), which can be located in three dimensional space at a height (22) and with a spacing (23,24,25) between each of the automated lighting fixtures. Each of the automated lighting fixtures can output a beam of light (5,6,7) onto the stage area; however, it is possible to direct the beams of light from each automated lighting fixture onto one central convergence point (8) on the stage, which would in effect move the beams of light from their straight-down positions (5,6,7) through a travel of (9,10,11). This convergence point can move in three dimensional space (12,13,14) across a predefined stage area (15,16,17,18), and the movement of the convergence point across the stage can be bounded within these stage area coordinates (15,16,17,18). Furthermore, it is possible to mark dynamic areas (15,19,20,21) of the stage (1) which are not to be entered by any of the beams of the automated lighting fixtures (2,3,4).
Third party cameras (26,27,28) may be deployed in any configuration on or off the stage to allow the console to record a performance and/or understand the movement of the beams of the automated lighting fixtures (2,3,4). Furthermore, as shown in
figure 6, if a camera or light meter (329) is placed within a specific area of the stage, for example (316,317,318,319), and has the ability to feed back (352) to the console (350), then it is possible for the console to sequentially move the beams of the automated lighting fixtures (2,3,4,302) until they are shining their respective beams onto this point.
Referring back to figure 5, using cameras and reflective objects, such as tape marks on the stage floor (29), it is possible to guide and/or limit the outputs of a plurality of automated lighting fixtures within certain boundaries. The use of third party video cameras (26,27,28) allows the effect of each button press on the control system to be fully understood against the corresponding light output seen on the stage: if the video from the cameras (26,27,28) is fed back to the control system, the video output can be synchronised with the console user inputs accordingly.
A schematic diagram of a display on a touch sensitive graphical user interface is shown in figure 7. Three automated lighting fixtures (102,103,104), representing the three automated lighting fixtures seen above the stage (1) in figure 5, are realised on the plan view of the stage (101). The touch sensitive graphical user interface (100) shows the stage (115,116,117,118) from figure 5 (15,16,17,18) in plan view. The elevation of the view displayed on screen can depend on the user's choice of view to work with. The beams of the automated lighting fixtures (102,103,104) can be seen converging on one point (108), which is the plan view representation of the figure 5 beam convergence point (8). Items of interest, or focal points (29,30), from figure 5 can also be represented on the displayed stage (129,130) in figure 7.
Figure 8 shows a representation of a user's hand (201) above the touch sensitive user interface (200), illustrating how the user's hand gestures can move the landing positions of three beams of light upon a stage (202,203,204) to three new positions (207,208,209), such that the radius (206) of an arc (205) drawn through the beams of light (202,203,204) is reduced to the smaller radius (210) of a smaller arc (211), achieved by the user arching their hand and dragging the beams closer together. Such hand gestures for controlling a plurality of automated lighting fixtures can be applied not only to the representation of the beams in three dimensional space as depicted in figure 5, but also to any two dimensional elevation of the same stage as depicted in figure 7. Through a recognised touch screen gesture, the view may be switched on screen. The touch screen allows a user to view and interact with the plurality of automated lighting fixtures using simple two dimensional Cartesian coordinates. The console can convert those coordinates into polar coordinate instructions that are then relayed to the automated lighting fixtures.
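That Cartesian-to-polar conversion might look like the following sketch, assuming the plan view and the fixtures share one coordinate frame: the bearing gives the pan directly, and the radial distance yields the tilt once the fixture's height is known.

```python
import math

def touch_to_pan_tilt(touch_xy, fixture_xy, fixture_height_m):
    """Convert a plan-view touch point (Cartesian) into pan/tilt
    (polar-style) instructions for one fixture. Assumed conventions:
    pan 0 along +x, tilt 0 straight down."""
    dx = touch_xy[0] - fixture_xy[0]
    dy = touch_xy[1] - fixture_xy[1]
    radial = math.hypot(dx, dy)                 # distance in plan view
    pan = math.degrees(math.atan2(dy, dx))      # bearing becomes pan
    tilt = math.degrees(math.atan2(radial, fixture_height_m))
    return pan, tilt

# A touch at (5.0, 3.0) steers a fixture plotted at (4.0, 0.0), 6 m up.
pan, tilt = touch_to_pan_tilt((5.0, 3.0), (4.0, 0.0), 6.0)
```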
Some example touch-screen control gestures may include: a) specific fingers or thumbs may be used to control the start and end of a grouping, or bound, of a plurality of automated lighting fixtures, such that the fixtures' beams may be moved together by moving the group-acquiring fingers together; b) beams may be spread or fanned out, or brought together, through an increase or decrease in the physical distance between the first and last fingers used (see the sketch below); c) beams may be rotated around a physical point by rotation of the hand and the selecting fingers; d) a point in three dimensional space, with regard to the vertical axis, around which each of the beams of the plurality of automated lighting fixtures may or may not converge, may be controlled by the motion of a third finger; e) the positions at which the beams of specific automated lighting fixtures fall on the stage or performance area may be controlled by the movement of all of the fingers travelling across the touch sensitive surface.
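In plan view, the fan/pinch gesture in b) reduces to scaling the beam landing points about their centroid by the ratio of the new finger spread to the old one. A minimal sketch (the coordinates and spreads are illustrative):

```python
def scale_about_centroid(points, ratio):
    """Move beam landing points towards (ratio < 1) or away from
    (ratio > 1) their common centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return [(cx + ratio * (x - cx), cy + ratio * (y - cy))
            for x, y in points]

# Pinching from a 0.50 m finger spread down to 0.35 m tightens the arc.
new_positions = scale_about_centroid([(2.0, 3.0), (4.0, 3.5), (6.0, 3.0)],
                                     ratio=0.35 / 0.50)
```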
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that aspects of the present invention may consist of any such individual feature or combination of features. Furthermore, aspects of the present invention may consist of features relating to Visual Sync in combination with features relating to Live Track: Visual Sync and Live Track are disclosed together herein, and advantageous embodiments of the present invention are envisaged which draw from both Visual Sync and Live Track. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.

Claims (1)

    1. Apparatus for controlling a plurality of lighting fixtures comprising:
    a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of lighting fixtures;
    a data store configured to store one or more predetermined control sequences of the plurality of lighting fixtures and a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the apparatus being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
    2. Apparatus as claimed in claim 1, the apparatus further providing an edit mode in which the apparatus is configured to edit the one or more predetermined control sequences of the plurality of lighting fixtures in response to modification of the state of one or more user inputs by means of the user control interface.
    3. Apparatus as claimed in claim 2, the apparatus being switchable between the playback mode and the edit mode.
    4. Apparatus as claimed in claim 2 or 3, on switching from the playback mode to the edit mode at a particular time-stamp of the audio/visual media, the apparatus being configured to edit the data in the time-series dataset for that particular timestamp.
    5. Apparatus as claimed in any of the above claims, the time-series dataset comprising one or more parameters of each user input.
    6. Apparatus as claimed in claim 5, said one or more parameters being associated with the pan and/or tilt of a lighting fixture.
    7. Apparatus as claimed in claim 5 or 6, said one or more parameters being associated with the light output by a lighting fixture.
    8. Apparatus as claimed in any one of claims 5-7, the user control interface being configured to, in response to a change in state caused by a user input, change one or more said parameters in relation to said change in state.
    9. Apparatus as claimed in any one of the above claims, comprising one or more inputs for receiving time-stamped video and/or audio signals.
    10. Apparatus as claimed in any one of the above claims, comprising a camera for recording time-stamped video.
    11. Apparatus as claimed in any one of the above claims, comprising an audio input for receiving a time-stamped audio signal.
    12. Apparatus as claimed in any one of claims 9-11, the time-stamped audio and/or video signals being stored at the data store.
    13. Apparatus as claimed in any one of the above claims, comprising one or more inputs for cameras.
    14. Apparatus as claimed in any one of the above claims, comprising one or more inputs for microphones.
    15. Apparatus as claimed in claim 13 or 14, data from said inputs being stored at the data store.
    16. Apparatus as claimed in any one of claims 9 to 15, the data store being configured to store a representation of said time-stamped audio and/or video signals and/or said data from the inputs.
    17. Apparatus as claimed in any one of the above claims, the apparatus comprising a display unit configured to display a virtual representation of the output from the lighting fixtures corresponding to said state of the user inputs.
    18. Apparatus as claimed in claim 17, the display unit being configured to, when the apparatus is in the edit mode, display a virtual representation of the output from the lighting fixtures corresponding to modified state of the user inputs.
    19. Apparatus as claimed in any one of the above claims, the user control interface comprising a touch screen configured to display at least some of the one or more said user inputs.
    20. Apparatus as claimed in any of the above claims, the audio of said presented audio/visual media being a synthetic abstract of a live audio recording.
    21. Apparatus as claimed in claim 20, said synthetic abstract being a modified version of the live audio recording.
    22. Apparatus as claimed in any of the above claims, the video of said presented audio/visual media being a synthetic abstract of a live video recording.
    23. Method of reviewing control sequences defined at a controller for a plurality of lighting fixtures, the controller comprising a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of lighting fixtures, the method comprising the steps of:
    causing the controller to play out a predetermined set of lighting control sequences during a show;
    storing a time-series dataset determining the state of the user inputs over time during the show; and subsequently:
    presenting time-stamped audio/visual media of the show; and synchronising the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
    24. Apparatus for controlling a plurality of output devices comprising:
    a user control interface comprising user inputs each capable of adopting a plurality of different states, the user inputs being for controlling the plurality of output devices;
    a data store configured to store a time-series dataset determining the state of the user inputs over time; and a media unit configured to present time-stamped audio/visual media, the apparatus being configured to, in a playback mode, synchronise the presentation of the audio/visual media with the time-series dataset so as to cause the user inputs to adopt the stored state in synchrony with the audio/visual media.
    25. Apparatus for controlling a plurality of lighting fixtures with known positions relative to one another and capable of providing light to a stage area, the apparatus comprising:
    a memory configured to store a set of pan and/or tilt parameters for each of the plurality of lighting fixtures, the plurality of lighting fixtures comprising a first lighting fixture;
    a controller configured to control panning and/or tilting of the plurality of lighting fixtures in dependence on the stored set of pan and/or tilt parameters, and to scale said sets of pan and/or tilt parameters for the stage area by:
    determining first pan and/or tilt parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a first reference point on the stage area;
    determining coordinates of the first reference point under a coordinate system defined for the stage area; and transforming said sets of pan and/or tilt parameters for the plurality of lighting fixtures in dependence on their said known relative positions, said first pan and/or tilt parameters, the coordinates of the first reference point and a height of the first lighting fixture above the stage area.
    26. Apparatus as claimed in claim 25, said stored set of pan and/or tilt parameters being defined for a reference stage area whose size may differ from said stage area provided with light by said lighting fixtures.
    27. Apparatus as claimed in claim 25 or 26, the controller being configured to pan and/or tilt each of the plurality of lighting fixtures in dependence on said scaling.
    28. Apparatus as claimed in any one of claims 25 to 27, the controller being further configured to automatically pan and/or tilt the first lighting fixture in a predetermined pattern so as to scan for a light detector located at the first reference point, the apparatus being configured to receive a signal from the light detector.
    29. Apparatus as claimed in any one of claims 25 to 28, the controller being configured to determine second pan and/or tilt parameters, the second parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a second reference point on the stage area.
    30. Apparatus as claimed in claim 29, said scaling being further dependent on the second pan and/or tilt parameters and on coordinates of the second reference point determined under the coordinate system.
    31. Apparatus as claimed in claim 29 or 30, the first and second reference points being located towards the extremities of the stage area.
    32. Apparatus as claimed in any one of claims 25 to 31, the apparatus comprising a display and being configured to, in dependence on said scaled sets of pan and/or tilt parameters, form a virtual representation of the lighting fixtures and the stage area.
    33. Apparatus as claimed in any one of claims 25 to 32, said sets of one or more pan and/or tilt parameters defining one or more predetermined control sequences of the plurality of lighting fixtures.
    34. Apparatus as claimed in any one of claims 1 to 25 and 33, the predetermined control sequences defining one or more of pan, tilt, mode, colour or intensity for each of the plurality of lighting fixtures.
    35. A method of scaling a set of pan and/or tilt parameters for each of a plurality of lighting fixtures with known positions relative to one another and capable of providing light to a stage area, the plurality of lighting fixtures comprising a first lighting fixture, the method comprising:
    determining first pan and/or tilt parameters representing the pan and/or tilt of the first lighting fixture when light therefrom is focused on a first reference point on the stage area;
    defining coordinates of the first reference point under a coordinate system defined for the stage area; and transforming said sets of pan and/or tilt parameters for the plurality of lighting fixtures in dependence on their said known relative positions, said first pan and/or tilt parameters, the coordinates of the first reference point and a height of the first lighting fixture above the stage area.
GB1301762.9A 2012-01-31 2013-01-31 Lighting control system Active GB2499123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1607976.6A GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1201585.5A GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition

Publications (3)

Publication Number Publication Date
GB201301762D0 GB201301762D0 (en) 2013-03-20
GB2499123A true GB2499123A (en) 2013-08-07
GB2499123B GB2499123B (en) 2016-08-03

Family

ID=45876343

Family Applications (3)

Application Number Title Priority Date Filing Date
GB1201585.5A Withdrawn GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition
GB1607976.6A Active GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system
GB1301762.9A Active GB2499123B (en) 2012-01-31 2013-01-31 Lighting control system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
GB1201585.5A Withdrawn GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition
GB1607976.6A Active GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system

Country Status (1)

Country Link
GB (3) GB2500566A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2535135A (en) * 2014-11-20 2016-08-17 Ambx Uk Ltd Light Control
WO2016198556A1 (en) * 2015-06-09 2016-12-15 Feeney Liam A visual tracking system and method
WO2017008023A1 (en) * 2015-07-08 2017-01-12 Production Resource Group, Llc Remotely controlled and monitored followspot
WO2017067764A1 (en) * 2015-10-19 2017-04-27 Philips Lighting Holding B.V. Harmonized light effect control across lighting system installations
DK201670601A1 (en) * 2016-06-12 2018-02-12 Apple Inc User interface for managing controllable external devices
AT519679A1 (en) * 2017-02-27 2018-09-15 Zactrack Gmbh Method for calibrating a rotating and pivoting stage equipment
US10708996B2 (en) 2015-08-20 2020-07-07 Signify Holding B.V. Spatial light effects based on lamp location
GB2581249A (en) * 2018-12-10 2020-08-12 Electronic Theatre Controls Inc Systems and methods for generating a lighting design
US10779085B1 (en) 2019-05-31 2020-09-15 Apple Inc. User interfaces for managing controllable external devices
US10820058B2 (en) 2018-05-07 2020-10-27 Apple Inc. User interfaces for viewing live video feeds and recorded video
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
GB2621929A (en) * 2022-07-29 2024-02-28 Electronic Theatre Controls Inc Method for creating XYZ focus paths with a user device
GB2622303A (en) * 2022-07-29 2024-03-13 Electronic Theatre Controls Inc Collision detection for venue lighting

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017194351A1 (en) * 2016-05-09 2017-11-16 Philips Lighting Holding B.V. Large area lighting aiming
US10670246B2 (en) 2017-04-03 2020-06-02 Robe Lighting S.R.O. Follow spot control system
US10678220B2 (en) 2017-04-03 2020-06-09 Robe Lighting S.R.O. Follow spot control system
CN107172776B (en) * 2017-05-27 2023-09-01 杭州罗莱迪思控制系统有限公司 Device and method for determining lighting effect by audience in night scene intelligent lighting system
EP3592119A1 (en) * 2018-06-08 2020-01-08 ROBE lighting s.r.o. Follow spot control system
CN111901947B (en) * 2020-08-03 2022-12-09 广州彩熠灯光股份有限公司 Method, system, device and medium for controlling stage light beam effect
CN113783993B (en) * 2021-09-10 2022-11-04 广州艾美网络科技有限公司 Stage lighting control method and device and stage lighting system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050077843A1 (en) * 2003-10-11 2005-04-14 Ronnie Benditt Method and apparatus for controlling a performing arts show by an onstage performer
US20060103333A1 (en) * 2004-11-18 2006-05-18 Robert Toms Stage lighting console
US20070174773A1 (en) * 2006-01-26 2007-07-26 International Business Machines Corporation System and method for controlling lighting in a digital video stream
US20080140231A1 (en) * 1999-07-14 2008-06-12 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for authoring and playing back lighting sequences

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2680263B2 (en) * 1994-05-10 1997-11-19 株式会社シアターデザイン Remote-controlled follow spotlight controller
US6255787B1 (en) * 1997-10-23 2001-07-03 High End Systems, Inc. Automated lighting control system utilizing laser and non-laser light sources
JP3677987B2 (en) * 1998-02-27 2005-08-03 松下電工株式会社 Tracking lighting system
US7620915B2 (en) * 2004-02-13 2009-11-17 Ludwig Lester F Electronic document editing employing multiple cursors
WO2007052197A1 (en) * 2005-11-01 2007-05-10 Koninklijke Philips Electronics N.V. Method, system and remote control for controlling the settings of each of a multitude of spotlights
DE102008006444A1 (en) * 2008-01-28 2009-07-30 Ma Lighting Technology Gmbh Method for operating a lighting console and lighting console
US20100238127A1 (en) * 2009-03-23 2010-09-23 Ma Lighting Technology Gmbh System comprising a lighting control console and a simulation computer

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080140231A1 (en) * 1999-07-14 2008-06-12 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for authoring and playing back lighting sequences
US20050077843A1 (en) * 2003-10-11 2005-04-14 Ronnie Benditt Method and apparatus for controlling a performing arts show by an onstage performer
US20060103333A1 (en) * 2004-11-18 2006-05-18 Robert Toms Stage lighting console
US20070174773A1 (en) * 2006-01-26 2007-07-26 International Business Machines Corporation System and method for controlling lighting in a digital video stream

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2535135B (en) * 2014-11-20 2018-05-30 Ambx Uk Ltd Light Control
GB2535135A (en) * 2014-11-20 2016-08-17 Ambx Uk Ltd Light Control
US11076469B2 (en) 2015-06-09 2021-07-27 Liam Feeney Visual tracking system and method
US10405413B2 (en) 2015-06-09 2019-09-03 Liam Feeney Visual tracking system and method
US10575389B2 (en) 2015-06-09 2020-02-25 3D Stage Tracker Limited Visual tracking system and method
US11711880B2 (en) 2015-06-09 2023-07-25 Liam Feeney Visual tracking system and method
WO2016198556A1 (en) * 2015-06-09 2016-12-15 Feeney Liam A visual tracking system and method
US10302286B2 (en) 2015-07-08 2019-05-28 Production Resource Group, Llc Remotely controlled and monitored followspot
US20200025359A1 (en) * 2015-07-08 2020-01-23 Production Resource Group, Llc Remotely Controlled and Monitored Followspot
US9976731B2 (en) 2015-07-08 2018-05-22 Production Resource Group, Llc Remotely controlled and monitored followspot
US10036539B2 (en) 2015-07-08 2018-07-31 Production Resource Group, Llc Remotely controlled and monitored followspot
US10330292B2 (en) 2015-07-08 2019-06-25 Production Resource Group Llc Device for controlling a remotely located luminaire
WO2017008023A1 (en) * 2015-07-08 2017-01-12 Production Resource Group, Llc Remotely controlled and monitored followspot
US10708996B2 (en) 2015-08-20 2020-07-07 Signify Holding B.V. Spatial light effects based on lamp location
WO2017067764A1 (en) * 2015-10-19 2017-04-27 Philips Lighting Holding B.V. Harmonized light effect control across lighting system installations
US10353576B2 (en) 2016-06-12 2019-07-16 Apple Inc. User interface for managing controllable external devices
DK201670601A1 (en) * 2016-06-12 2018-02-12 Apple Inc User interface for managing controllable external devices
US10635303B2 (en) 2016-06-12 2020-04-28 Apple Inc. User interface for managing controllable external devices
DK201670602A1 (en) * 2016-06-12 2018-02-12 Apple Inc User interface for managing controllable external devices
US10564034B2 (en) 2017-02-27 2020-02-18 Zactrack Gmbh Method for calibrating a rotatable and pivotable piece of technical stage equipment
AT519679A1 (en) * 2017-02-27 2018-09-15 Zactrack Gmbh Method for calibrating a rotating and pivoting stage equipment
US10904628B2 (en) 2018-05-07 2021-01-26 Apple Inc. User interfaces for viewing live video feeds and recorded video
US10820058B2 (en) 2018-05-07 2020-10-27 Apple Inc. User interfaces for viewing live video feeds and recorded video
GB2581249B (en) * 2018-12-10 2021-10-20 Electronic Theatre Controls Inc Systems and methods for generating a lighting design
GB2581249A (en) * 2018-12-10 2020-08-12 Electronic Theatre Controls Inc Systems and methods for generating a lighting design
US10904029B2 (en) 2019-05-31 2021-01-26 Apple Inc. User interfaces for managing controllable external devices
US10779085B1 (en) 2019-05-31 2020-09-15 Apple Inc. User interfaces for managing controllable external devices
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
US11785387B2 (en) 2019-05-31 2023-10-10 Apple Inc. User interfaces for managing controllable external devices
US11824898B2 (en) 2019-05-31 2023-11-21 Apple Inc. User interfaces for managing a local network
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11513667B2 (en) 2020-05-11 2022-11-29 Apple Inc. User interface for audio message
GB2621929A (en) * 2022-07-29 2024-02-28 Electronic Theatre Controls Inc Method for creating XYZ focus paths with a user device
GB2622303A (en) * 2022-07-29 2024-03-13 Electronic Theatre Controls Inc Collision detection for venue lighting

Also Published As

Publication number Publication date
GB2500566A (en) 2013-10-02
GB201201585D0 (en) 2012-03-14
GB2535909A (en) 2016-08-31
GB2499123B (en) 2016-08-03
GB201301762D0 (en) 2013-03-20
GB201607976D0 (en) 2016-06-22
GB2535909B (en) 2017-02-08

Similar Documents

Publication Publication Date Title
GB2499123A (en) Lighting control system
US11853635B2 (en) Configuration and operation of display devices including content curation
JP5404811B2 (en) Control system for controlling one or more controllable devices and method enabling such control
JP5825561B2 (en) Interactive lighting control system and method
US9526156B2 (en) System and method for theatrical followspot control interface
US20130249433A1 (en) Lighting controller
CN110162236B (en) Display method and device between virtual sample boards and computer equipment
US9928665B2 (en) Method and system for editing scene in three-dimensional space
US20150138188A1 (en) Method, apparatus and system for image processing
US9747714B2 (en) Method, device and computer software
US20200187334A1 (en) Systems and methods for generating a lighting design
US10732706B2 (en) Provision of virtual reality content
WO2013142024A1 (en) Controlling a device with visible light
CN110582146A (en) follow spot lamp control system
KR102371031B1 (en) Apparatus, system, method and program for video shooting in virtual production
US20160344946A1 (en) Screen System
JP2009290354A (en) Lighting device, and space production system
US20200184222A1 (en) Augmented reality tools for lighting design
CN111064946A (en) Video fusion method, system, device and storage medium based on indoor scene
US20140266766A1 (en) System and method for controlling multiple visual media elements using music input
US10032447B1 (en) System and method for manipulating audio data in view of corresponding visual data
JP2020102687A (en) Information processing apparatus, image processing apparatus, image processing method, and program
GB2399248A (en) Projection of supplementary image data onto a studio set
US20150115837A1 (en) Dimming console
KR101263881B1 (en) System for controlling unmanned broadcasting