GB2500566A - Automated lighting control system allowing three dimensional control and user interface gesture recognition - Google Patents


Info

Publication number
GB2500566A
Authority
GB
United Kingdom
Prior art keywords
automated lighting
lighting fixtures
automated
user
control system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1201585.5A
Other versions
GB201201585D0 (en)
Inventor
Steve Warren
Jaspal Bhullar
Christopher Neil John Crockford
Richard Salzedo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AVOLITES Ltd
Original Assignee
AVOLITES Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AVOLITES Ltd filed Critical AVOLITES Ltd
Priority to GB1201585.5A priority Critical patent/GB2500566A/en
Publication of GB201201585D0 publication Critical patent/GB201201585D0/en
Priority to GB1607976.6A priority patent/GB2535909B/en
Priority to GB1301762.9A priority patent/GB2499123B/en
Publication of GB2500566A publication Critical patent/GB2500566A/en


Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An automated lighting control system for an entertainment lighting system allows the control of automated lighting fixtures 2, 3, 4 with regard to their orientation in three dimensional space, using polar and Cartesian co-ordinate manipulation via a touch sensitive graphical user interface, as well as third party image processing equipment to facilitate automation of various components of the automated lighting control system. A touch sensitive user interface (figure 2) is provided, on which is displayed a two-dimensional representation of an area illuminated by the lighting fixtures, allowing a user to directly manipulate the lighting fixtures.

Description

Automated Lighting Control system allowing three dimensional control and user interface gesture recognition.
BACKGROUND OF THE INVENTION
The present invention relates to a method and apparatus for the control and positioning of automated lighting fixtures, for the control and replay of entertainment lighting systems.
The use of automated lighting fixtures and ancillary control equipment has become prevalent within the entertainment industry. They are used for a variety of purposes, from the provision of key lighting that can be moved from one location upon the stage or performance area to another at a different time within the show or event, through to effects lighting, where the beams of the automated lights are moved in specific sequences to enhance the atmosphere of the event being performed or undertaken.
Control systems exist that allow for the marshalling of the positioning of such beams, such that their automation can be utilised for various uses, such as the illumination of specific characters, actors, pieces of scenery, or the use of the beams of light as enhancements to the visual appearance of the entertainment form being undertaken.
The movement of such beams of light is normally controlled either through the movement of a mirror reflecting the beam of light, driven by a pair of motors facilitating pan and tilt control, or by a pair of motors facilitating the pan and tilt of the entire light emitting unit. By sending each lighting fixture a timely stream of pan and tilt data, the beam of light may be manipulated and moved such that the end position upon which the beam of light falls may be moved dynamically and proportionally to the pan and tilt data being sent to the automated lighting fixture.
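As an illustration of how such pan and tilt data is typically quantised before transmission, the sketch below maps a target angle onto the coarse/fine byte pair used by fixtures with 16-bit positional resolution. This is a minimal sketch; the 540/270 degree ranges and the channel split are assumptions for illustration, not values taken from the patent.

```python
# Minimal sketch, assuming a fixture whose pan and tilt are sent as 16-bit
# values split across coarse/fine DMX channels; the degree ranges and the
# channel split are illustrative assumptions.

def angle_to_16bit(angle_deg: float, range_deg: float) -> tuple:
    """Map an angle within the fixture's mechanical range to (coarse, fine) bytes."""
    angle_deg = max(0.0, min(range_deg, angle_deg))   # clamp to the mechanical range
    value = round(angle_deg / range_deg * 65535)      # full 16-bit resolution
    return value >> 8, value & 0xFF

pan_coarse, pan_fine = angle_to_16bit(270.0, 540.0)    # centre pan
tilt_coarse, tilt_fine = angle_to_16bit(135.0, 270.0)  # centre tilt
print(pan_coarse, pan_fine, tilt_coarse, tilt_fine)    # 128 0 128 0
```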
Serial data protocols exist that perform the delivery of the pan and tilt data to a series of automated lighting fixtures from a controlling computer or console. Such protocols range from 8 bit serial data such as USITT DMX512A, AMX, or PMX, through to higher level protocols that act as a carrier for multiple lower level serial data protocols. Such higher level protocols include proprietary protocols such as Art-Net, ACN, Pathport and Shownet, all of which use Ethernet as a carrier for DMX. Some automated lighting fixtures exist that can accept Ethernet connections directly, but the predominant practice is for automated lighting fixtures to accept DMX512 as the standard communications protocol.
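The sketch below shows how one universe of DMX levels might be wrapped in an Art-Net ArtDmx packet and sent over Ethernet, as described above. It is a minimal illustration under stated assumptions, not a complete implementation of the Art-Net specification; the node IP address and channel values are invented for the example.

```python
import socket
import struct

def artdmx_packet(universe: int, dmx: bytes, sequence: int = 0) -> bytes:
    """Build a single ArtDmx frame carrying up to 512 channels of DMX data."""
    dmx = dmx.ljust(512, b"\x00")[:512]
    return (
        b"Art-Net\x00"                    # packet ID
        + struct.pack("<H", 0x5000)       # OpCode ArtDmx (low byte first)
        + struct.pack(">H", 14)           # protocol version
        + bytes([sequence, 0])            # sequence number, physical input port
        + struct.pack("<H", universe)     # 15-bit port-address, little-endian
        + struct.pack(">H", len(dmx))     # data length, big-endian
        + dmx
    )

# Example: drive the first fixture's pan coarse/fine channels (1 and 2) to centre.
# The node address is an assumed example; 6454 is the standard Art-Net UDP port.
levels = bytearray(512)
levels[0], levels[1] = 128, 0
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(universe=0, dmx=bytes(levels)), ("192.168.1.40", 6454))
```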
The individual control of an automated lighting fixture is dependent upon a unique address being applied to each fixture. The allocation of the unique address by the controlling console is known colloquially as the "patching" of the automated lighting fixture to the controlling console. Control systems provide automated means to assign unique addresses to each automated lighting fixture; however, the operator may individually assign each address should they desire. The concept of patching forms the lowest level of operational control between the controlling console and the automated lighting fixture itself.
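A minimal sketch of what such a patch might look like in software follows: each fixture is given a start address on a universe, and attribute names are resolved to absolute channels through a fixture personality. The fixture names, start addresses, and attribute layout are hypothetical.

```python
# Hypothetical five-channel personality and patch; real fixtures differ.
PERSONALITY = {"pan": 0, "pan_fine": 1, "tilt": 2, "tilt_fine": 3, "dimmer": 4}

patch = {
    "spot_1": {"universe": 0, "start": 1},
    "spot_2": {"universe": 0, "start": 21},
    "spot_3": {"universe": 0, "start": 41},
}

def channel_of(fixture: str, attribute: str) -> tuple:
    """Resolve a fixture attribute to its (universe, absolute DMX channel)."""
    entry = patch[fixture]
    return entry["universe"], entry["start"] + PERSONALITY[attribute]

print(channel_of("spot_2", "tilt"))   # (0, 23)
```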
Once an operator has "patched" the automated lighting fixtures, in such a manner that they may be controlled by the console, the operator may move the beams of the automated lighting fixtures into positions on the performance area. Individual automated lighting fixtures may be assigned to "groups", such that the operator of the console controlling the lighting may be able to apply control conditions to certain selections of automated lighting fixtures at the same time.
Such groups may be automatically assigned by the controlling system at the time of patching, such that the operator may be presented with groups that represent odd numbered fixtures, as well as groups that represent even numbered fixtures. Groups may also be assigned by the operator manually.
The individual functions of the automated lighting fixtures may be applied via a group qualifier, such that a specific group of automated lighting fixtures may have their colours, positions, or projection symbols (known colloquially as "gobos") changed under a group command, as opposed to an individual command. The selection of the automated lighting fixtures through a group, or through individual control, is the decision of the operator of the controlling system.
To record a snapshot view of all of the positions and outputs of the automated lighting fixtures, the control system captures all of the data parameters being sent to all of the automated lighting fixtures and saves this to its internal memory; this process is colloquially known as recording a "cue" or "memory". Such a cue or memory typically represents the positions and beam status of all of the automated lighting fixtures recorded in a static position.
To allow a high level understanding of where the automated lighting fixtures are located in three dimensional space, it is normal practice to define the movement space within which the systems will operate. There will normally be areas within the three dimensional movement range of the automated lighting fixtures that are not desirable; likewise there will be areas of movement that are utilised the majority of the time. These may be defined using colloquial terms such as Upstage Left, Upstage Right, Downstage Left, and Downstage Right, thus the stage or performance area may be defined as a rectangular performance area within these bounds.
To ascertain where the space that will be used for the majority of the time lies, with regard to each automated lighting fixture's pan and tilt ranges, stage positions for each beam of each automated lighting fixture are normally defined.
In a similar manner, areas and items of common interest with regard to the show's production, such as specific stage positions like the drum kit or the lead singer's microphone position, may be highlighted with each beam from each automated lighting fixture.
Such positional information groups that relate to a physical object or position within the performance area may be stored as a series of sets of pan and tilt information for the automated lighting fixtures, within the controlling computer or console's memory.
Such commonly used beam positions may be referred to colloquially as "Preset Focuses", with each beam's focus being preset to a position on the stage accordingly. Preset Focuses may also be applied to other functions of the automated lighting fixtures, such as colours, gobos, or focus and zoom attributes.
In a similar manner to recording all the outputs for a selection of the automated lighting fixtures under the system's control for a fixed position, it is possible to record multiple snapshots of positions of the beams of the automated lighting fixtures, such that said beams may be seen to move in a choreographed manner.
Moving beams of automated lighting fixtures between fixed cues or memories, in a timed sequence, may be recorded by the control system operator as a series of what are colloquially known as "linked" memories, or "chases".
As each automated lighting fixture has its own unique address, detailed in the patch, a chase may be applied to groups of automated lighting fixtures, such that multiple chases may be applied to multiple groups of automated lighting fixtures.
The operator of the control system may activate said automated lighting fixtures, and trigger their individual attributes to output specific beams of light, by recalling the exact attribute value for each function of the automated lighting fixture from memory and instructing each automated lighting fixture accordingly.
Automated lighting fixtures can be controlled by the application of automation algorithms that utilise mathematical functions to control their pan and tilt functions. Such algorithms may apply a mathematical offset to a row of automated lighting fixtures such that each fixture has a specific offset applied to its positional control.
As an example, a sinusoidal waveform may be applied to either or both of the pan and tilt controls of the automated lighting fixture, in order to create a movement waveform of beams of automated lighting fixtures, with each lighting fixture moving slightly after the previous one to achieve an effect of staggered motion, representing the mathematical waveform being applied in sequence to the pan and tilt controls of the automated lighting fixtures.
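A minimal sketch of such a staggered sine "shape" is given below: the same waveform is evaluated for each fixture with a per-fixture phase offset. The amplitude, period, and phase step are illustrative parameters, not values from the patent.

```python
import math

def wave_tilt_values(num_fixtures: int, t: float, period_s: float = 4.0,
                     phase_step: float = math.pi / 4, amplitude: float = 0.5):
    """Return a normalised (0..1) tilt value per fixture at time t, each fixture
    running the same sine wave offset in phase from its neighbour."""
    base = 0.5                                   # centre of the tilt range
    omega = 2 * math.pi / period_s
    return [base + (amplitude / 2) * math.sin(omega * t - i * phase_step)
            for i in range(num_fixtures)]

print([round(v, 2) for v in wave_tilt_values(4, t=1.0)])
```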
The application of a shape or complex mathematical function to the animation of a number of beams from automated lighting fixtures may be colloquially known as the application of a "shape" or "macro function".
In such a manner the operator of the control system may access and control the automated lighting fixtures, by selecting various groups or individual control of the automated lighting fixtures, and then applying a preset focus, or controlling each individual automated lighting fixture's attributes separately, in such a manner that the beams of light from the individual automated lighting fixtures may be directed on to the performance area in a specified and controlled manner.
The positions, colour, and projection attributes of the beams of light, along with focusing information, may be stored in the control system's memory through the form of a static memory, a dynamic chase sequence, or an automated macro control which derives the control of the automated lighting fixtures from a mathematical function, or a macro assignment of numeric values across the range of the automated lighting fixtures themselves.
The operator of the control system controlling the automated lighting fixtures, may select and control each individual automated lighting fixture, or a selection of lighting fixtures, through a manual button press, interaction with a touch sensitive surface, deployment of a joystick, or the initiation of a rotary control encoder.
In such a manner each attribute of each automated lighting fixture may be controlled, such that the desired output of the automated lighting fixtures may be seen by the operator, applied within the entertainment environment.
Such control systems have been in existence for many years; prior art exists in the form of technological instantiations of both the control systems (US7,839,391; US7,495,671) and the automated lighting fixtures (US4,392,187; US4,962,687) that are controlled by such control systems. This invention presents a new and novel control system that comprises a plurality of means and techniques for the positioning and operation of automated lighting fixtures from control systems, through unique man-machine interfacing, and visual representation upon such interfacing.
SUMMARY OF THE INVENTION
According to a first aspect of the present invention there is provided a control system for the control and command of beams of light being emitted from automated lighting fixtures, the lighting control system comprising a screen that is touch sensitive and can support haptic interaction through gesticulation recognition, such that a plurality of automated lighting fixtures may be controlled and accessed through hand movements upon the interface. Furthermore, the user's operational viewing angle may also be switched through manipulation of the touch sensitive user interface, allowing for the manipulation of the plurality of beams of light in the three dimensional real world through gesticulation and interaction with the two dimensional touch sensitive user interface.
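As a sketch of how a touch on the two dimensional interface can be interpreted relative to the view currently employed by the user, the function below maps a screen coordinate to a Cartesian point on the stage in either plan or front elevation view, prior to conversion into per-fixture pan and tilt. The screen size, stage dimensions, and axis conventions are assumptions for illustration.

```python
def touch_to_stage_point(touch_xy, screen_px, stage_dims_m, view="plan"):
    """Interpret a touch according to the active view: plan view maps the screen
    to stage width/depth (the beam lands on the floor), front elevation maps it
    to stage width/height."""
    sx, sy = touch_xy
    w_px, h_px = screen_px
    width_m, depth_m, height_m = stage_dims_m
    x = sx / w_px * width_m
    if view == "plan":
        return (x, sy / h_px * depth_m, 0.0)
    if view == "front":
        return (x, 0.0, (1.0 - sy / h_px) * height_m)   # top of screen = highest point
    raise ValueError("unknown view")

print(touch_to_stage_point((320, 180), (1280, 720), (12.0, 8.0, 6.0), view="plan"))
```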
According to a second aspect of the present invention there is provided a method for controlling a plurality of automated lighting fixtures, such that the user may place the beams of the automated lighting fixtures into a tracking mode, in which the control system automatically calculates the geometry of each automated lighting fixture such that all the beams converge on one point in three dimensional space, and such that this convergence point may be controlled by the user of the control system.
Furthermore, the user may define the edges of the performance stage area with the control system, utilising the convergence points of the plurality of automated lighting fixtures; by doing so, and defining a set of geometric co-ordinates, each of the plurality of automated lighting fixtures may be bounded to operate within the performance space determined by the user of the control system on the touch sensitive user interface. The user may also apply rules and bounds to the plurality of automated lighting fixtures that dictate policy to each of the plurality of automated lighting fixtures should a boundary limit be exceeded.
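A minimal sketch of the convergence geometry described above, assuming the position of each fixture in three dimensional space is known: the pan and tilt needed for every beam to meet at a single user-controlled point are derived with basic trigonometry. The angle conventions and rig positions are illustrative assumptions.

```python
import math

def aim_at_convergence(fixtures_xyz, point_xyz):
    """Return per-fixture (pan, tilt) in degrees so every beam meets at point_xyz.
    Pan is taken as azimuth about the vertical axis and tilt as the angle from
    straight down; both conventions are illustrative."""
    px, py, pz = point_xyz
    angles = {}
    for name, (fx, fy, fz) in fixtures_xyz.items():
        dx, dy, dz = px - fx, py - fy, fz - pz           # dz: vertical drop to the point
        pan = math.degrees(math.atan2(dy, dx))
        tilt = math.degrees(math.atan2(math.hypot(dx, dy), dz))
        angles[name] = (round(pan, 1), round(tilt, 1))
    return angles

rig = {"spot_1": (2.0, 4.0, 6.0), "spot_2": (6.0, 4.0, 6.0), "spot_3": (10.0, 4.0, 6.0)}
print(aim_at_convergence(rig, (6.0, 2.0, 0.0)))   # all three beams meet downstage centre
```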
According to a third aspect of the present invention there is provided a method for controlling a plurality of automated lighting fixtures such that, if the geometry of the location and positioning of each of the plurality of automated lighting fixtures can be established within the three dimensional space that represents the performance area, then the control system may afford to the user the facility to move the physical location of a singularity or plurality of automated lighting fixtures, such that their respective height and location in three dimensional space may be updated according to the physical venue within which the performance area is located. The control system may take external reference data to establish where the performance area may be located; such external data may include GPS co-ordinates etc. established on the user's third party device.
According to a fourth aspect of the present invention there is provided a method for controlling automated lighting fixtures such that the user may group specific pluralities of automated lighting fixtures so that their beams may fall on the performance area in three dimensional space, in specific patterns or onto specific stage objects and actors, and such that the control system will represent these objects through specific indicia upon the user interface, which the user may move around the user interface's two dimensional representation of the three dimensional performance space in such a manner as to represent the updating of the location of the real world object within the real stage environment, through the visualisation upon the touch sensitive user interface.
Furthermore the control system, equipped with the knowledge of where the automated lighting fixtures are located within the three dimensional performance space, may automatically create groupings of pluralities of automated lighting fixtures for the user of the system, which the user may choose to work with.
According to a fifth aspect of the present invention there is provided a method for controlling automated lighting fixtures through the lighting control system such that the user is afforded the ability to define a space within the three dimensional performance space, as represented on the touch sensitive user interface, where no automated lighting fixture may place its beam, thus ensuring that the desired area in the three dimensional performance space is kept dark and unlit by the beams of the plurality of automated lighting fixtures.
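One simple way such a keep-out policy might be enforced is sketched below: every computed landing point is tested against the user-defined rectangles before a position is output, and the dimmer is closed if it falls inside one. The rectangle format and the blackout policy are illustrative assumptions.

```python
def keep_out_policy(landing_xy, keep_out_rects):
    """Return the attribute overrides to apply if the beam's landing point falls
    inside any keep-out rectangle (x0, y0, x1, y1); here the assumed policy is
    simply to close the dimmer so the area stays dark."""
    x, y = landing_xy
    for (x0, y0, x1, y1) in keep_out_rects:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return {"dimmer": 0}
    return {}

print(keep_out_policy((3.0, 7.5), [(0.0, 7.0, 12.0, 8.0)]))   # {'dimmer': 0}
```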
According to a sixth aspect of the present invention there is provided a method for controlling automated lighting fixtures such that the lighting control system may afford to the user, through the touch sensitive user interface, the ability to move a singularity of automated lighting fixtures from one position to the next, and for the control system to calculate for the user the movement of a pre-determined subsequent plurality of automated lighting fixtures, based upon their geometric location in the three dimensional space of the performance area, relative to the change in positional data afforded to the first automated lighting fixture by the user, when working within a grouping of automated lighting fixtures.
According to a seventh aspect of the present invention there is provided a method for controlling automated lighting fixtures such that the lighting control system may calculate the physical height, or distance, of a plurality of automated lighting fixtures from a known point within the performance space, using a mathematical function that can be applied to either or both of the focus and zoom attributes of the automated lighting fixtures. In such a manner, when a projection symbol is seen to be in focus on the floor, or on the user's target surface within the three dimensional performance space, a mathematical function is employed to calculate the height of each of the plurality of automated lighting fixtures that are emitting the projection symbol.
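A sketch of one way such a focus-to-distance relationship might be evaluated in software: a per-fixture calibration table relates the focus value at which the gobo appears sharp to a throw distance, and the distance is interpolated from it. The calibration values are invented for illustration; a real fixture would need its own measured table.

```python
def distance_from_focus(focus_value: int, calibration):
    """Interpolate throw distance (metres) from the focus value at which the
    projection symbol appears sharp, using a per-fixture calibration table of
    (focus DMX value, distance) pairs."""
    calibration = sorted(calibration)
    for (f0, d0), (f1, d1) in zip(calibration, calibration[1:]):
        if f0 <= focus_value <= f1:
            t = (focus_value - f0) / (f1 - f0)
            return d0 + t * (d1 - d0)
    raise ValueError("focus value outside the calibrated range")

table = [(40, 3.0), (90, 6.0), (140, 9.0), (200, 12.0)]   # invented calibration
print(distance_from_focus(115, table))                     # 7.5 (metres)
```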
According to an eighth aspect of the present invention there is provided a method for controlling automated lighting fixtures that may afford interaction with a third party hand held device in the form of a camera or light meter, perhaps as found on a device such as a mobile telephone, such that the data produced by such a third party device may form a feedback loop of timely data regarding the specific state of lighting at the third party device's location within the three dimensional performance space.
In a similar manner the lighting control system may access and control a plurality of automated lighting fixtures, such that each of the plurality of automated lighting fixtures' beams passes over the three dimensional performance area in a prescribed manner, perhaps sweeping the stage in an "S" pattern, from left to right across the performance area and moving up the performance area towards the back. When the beam of a specific automated lighting fixture passes over the top of the third party device, the control system can identify which beam is passing over the third party device at which point in time, thus affording the control system the ability to work out to what extent the beam of a singularity of automated lighting fixtures is passing over the point in the three dimensional performance space where the third party device is located, and to what extent and effect this beam contributes to the overall light output at this point in space.
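The sketch below illustrates the sweep-and-detect idea: one fixture's beam is stepped across a grid of stage positions in an "S" pattern while the handheld device reports its light level, and the cell with the peak reading marks where that beam covers the device. The `fixture.move_to` and `read_lux` interfaces are hypothetical stand-ins for console output and the device's wireless feedback.

```python
import time

def serpentine_sweep(fixture, read_lux, cols=10, rows=6, dwell_s=0.2):
    """Step one fixture's beam across a grid of stage positions in an "S"
    pattern and return the grid cell at which the handheld device reported its
    peak light level, i.e. where this beam covers the device."""
    best_cell, best_lux = None, -1.0
    for row in range(rows):
        col_order = range(cols) if row % 2 == 0 else reversed(range(cols))
        for col in col_order:              # alternate direction on each row
            fixture.move_to(col, row)      # hypothetical console output call
            time.sleep(dwell_s)            # allow the beam to settle
            lux = read_lux()               # hypothetical wireless feedback reading
            if lux > best_lux:
                best_cell, best_lux = (col, row), lux
    return best_cell
```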
Furthermore the control system may receive back from the third party device chrominance and luminance data as each beam of light being emitted from a singularity of automated lighting fixtures may be registered. In such a manner the lighting control system may change the colour temperature controls for any suitably equipped automated lighting fixture, such that the totality of the plurality of automated lighting fixtures can be seen in the performance area to emit the same colour temperature characteristics.
Furthermore the control system may receive back from the third party device chrominance and luminance data with regards to the colour of a third party object, such as a stage costume, stage curtain etc, such that this colour or texture data may be applied to any of the plurality of automated lighting fixtures accordingly.
Furthermore the control system may afford the user the ability to position the third party device in specific locations within the three dimensional performance space, such that the boundaries within which the beams of the plurality of automated lighting fixtures may operate may be defined and established by the location, or a series of locations, of the third party user device. Having established the reference positional data from each of the plurality of automated lighting fixtures, with regard to each of the positions at which the user has placed their third party device, the lighting control system may update the positioning of the entire plurality of automated lighting fixtures for each grouping or focus where the plurality of lighting fixtures' beams are focused on specific locations upon the performance area.
According to a ninth aspect of the present invention there is provided a method for controlling automated lighting fixtures whereby, through the positioning by the user of a third party video capture device in any location from which the three dimensional performance area may be observed by the camera device, a specific actor or motion object on the three dimensional performance area may be followed by the convergence of a plurality of automated lighting fixture beams, in such a manner that the convergence of beams is seen to "track" the performer.
Furthermore the lighting control system may receive a stream of timely data via a radio frequency interface from a third party device such as an inertial measurement unit, accelerometer, laser rangefinder, or electro optical or infra red tracking camera, that enables the control system to understand the movement of the specified actor, performer, or motion object upon the three dimensional performance area, such that a singularity of beams of automated lighting fixtures may be delivered to that actor, performer, or motion object.
Furthermore the lighting control system may afford the user the ability to define which of the plurality of automated lighting fixtures may track which actors, performers, or motion objects upon the three dimensional performance area, thus affording the ability for a plurality of motion objects, actors, or performers to be followed by different and separate convergences of beams from the plurality of automated lighting fixtures being used to illuminate the three dimensional performance space.
Furthermore the lighting control system may, through the usage of the third party video capture device, afford the user the capability to assign various singularities of automated lighting fixtures to follow various motion objects, actors, or performers dependent upon the colour of the clothing, or colour of the finish, of the object, actor or performer moving within the three dimensional performance area.
Furthermore the lighting control system may effect a boundary limit upon the plurality of automated lighting fixtures if a strip of reflective tape is placed upon a surface within the three dimensional performance space, and a beam of one of the plurality of automated lighting fixtures is seen, upon the third party video capture device, to traverse across the reflective tape strip.
According to a tenth aspect of the present invention there is provided a method for controlling automated lighting fixtures whereby, through the positioning by the user of a third party video capture device in any location from which the three dimensional performance area may be observed by the camera device, a specific actor or motion object upon the three dimensional performance area may be identified, and the control system may instruct a plurality of automated lighting fixtures to illuminate the specified actor, performer or motion object in such a manner that the colour balance of the illuminated actor, performer, or motion object is immediately adjusted to suit the video camera device that is providing the video feed for review. In such a manner the control system may geometrically calculate the location of the video camera source, of the actor, performer, or motion object upon the three dimensional performance area, and of each of the plurality of automated lighting fixtures within three dimensional space, and direct operations such that the plurality of automated lighting fixtures illuminates the correct actor, performer, or motion object accordingly.
According to the final aspect of the present invention there is provided a method for controlling automated lighting fixtures whereby the positioning, by the user, of a third party video capture device can, through a closed feedback loop, provide visual means to synchronise the effects of the lighting control system with the movement of the plurality of automated lighting fixtures. By synchronising the individual command structures employed by the user and executed by the lighting control system with the videographic input media, it is possible to gain a visual understanding of the performance within which the plurality of automated lighting fixtures are being employed, as a function of a specific point in the running time of the production. Such a linkage of each button press on the lighting control system by the user with the underlying computations being effected allows for a system that can afford to the user the ability to review the video of the performance and to make offline edits to specific lighting control system commands to the plurality of automated lighting fixtures.
Furthermore it will be possible for the user to identify specific sequences and effects of the plurality of automated lighting fixtures within the performance, and to assign an overview name for these effects and sequences, and thus be able to recall these effects and sequences by a colloquial name for insertion or deletion from any point in the performance.
Furthermore it will be possible for the user to video record a rehearsal of the production or performance with no lighting and thus to overlay the effect of the plurality of automated lighting fixtures, perhaps through techniques such as alpha blending, morphing etc. In such a way the user of the lighting control system will be able to judge the suitability of the performance lighting in an offline mode, utilising the rehearsal video footage as the background for their comparison.
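A minimal sketch of the command-to-video linkage described in this final aspect: every console action is stamped with the show's running time, so that scrubbing the recorded video to a time can restore the list of commands in force at that moment. The class name and storage format are hypothetical, not part of the patent.

```python
import json
import time

class ShowLog:
    """Record console actions against the show's running time so the console
    state at any video timestamp can be reconstructed for offline editing."""

    def __init__(self):
        self.start = time.monotonic()
        self.events = []

    def record(self, action, **params):
        self.events.append({"t": round(time.monotonic() - self.start, 3),
                            "action": action, "params": params})

    def commands_up_to(self, video_time_s):
        """Replay everything issued before the given point in the video."""
        return [e for e in self.events if e["t"] <= video_time_s]

log = ShowLog()
log.record("go_cue", cue=12)
log.record("set_colour", group="odd_spots", colour="congo blue")
print(json.dumps(log.commands_up_to(60.0), indent=2))
```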
DESCRIPTION OF THE DRAWINGS
The present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a stage detailing the location and output of a plurality of automated lighting fixtures.
Figure 2 is a schematic diagram of a touch sensitive graphical user interface, representing the stage as detailed in Figure 1.
Figure 3 is a schematic diagram of a user's hand placed on top of the touch screen graphical user interface as detailed in Figure 2.
Figure 4 is a schematic diagram of a third party device containing an optical sensor, providing a closed feedback loop via radio frequency transmission.
DETAILED DESCRIPTION OF THE DRAWINGS
The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art.
The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The present invention recognises that each event's stage (1) will utilise a different plurality of automated lighting fixtures, as detailed in Figure 1 (2,3,4), and that such a plurality of automated lighting fixtures will be located in three dimensional space with a proportional height (22), perhaps common to some or all of the plurality of automated lighting fixtures, but that there will be a three dimensional spacing (23,24,25) between each of the automated lighting fixtures.
Each of the automated lighting fixtures has the ability to output a beam of light (5,6,7) on to the stage area; however, it is possible to focus the beams of light from each automated lighting fixture onto one central convergence point (8) on the stage, which would in effect move the beams of light from their straight down positions of (5,6,7) through a travel of (9,10,11).
The present invention further recognises that it is possible for this convergence point to move in three dimensional space (12,13,14), across a predefined stage area (15,16,17,18), and that it is possible to bound the movement of the convergence point across the stage within these stage area co-ordinates (15,16,17,18); furthermore it is possible to mark dynamic areas (15,19,20,21) of the stage (1) which are not to be entered by any of the beams of light (2,3,4).
The present invention recognises that the deployment of third party cameras (26,27,28) in any configuration on or off the stage allows the control system to understand the movement of the beams of the automated lighting fixtures (2,3,4); furthermore, if a camera or light meter (329) is placed within a specific area of the stage, for example (316,317,318,319), and has the ability to feed back (352) to the lighting control system (350), then it is possible for the control system to sequentially move the beams of the automated lighting fixtures (2,3,4,302) until they are shining their respective beams on to this point.
The present invention recognises that, using these cameras and third party reflective devices such as tape marks on the stage floor (29), it is possible to guide, and/or limit, the outputs of a plurality of automated lighting fixtures within certain boundaries.
The usage of third party video cameras (26,27,28) allows for a complete comprehension of the effect of each button press on the control system, with the relevant light output seen on the stage, such that if the video cameras (26,27,28) can be fed back to the control system, then the video output can be tied to the control system's user inputs accordingly.
A schematic diagram of a touch sensitive graphical user interface is shown in figure 2. Three automated lighting fixtures (102,103,104), which represent the three automated lighting fixtures seen above the stage (1) in figure 1, are realised on the plan view of the stage (101).
The touch sensitive graphical user interface (100) shows the stage (115,116,117,118) from figure 1 (15,16,17,18) in plan view, although this is not limiting, and the elevation would be dependent upon the user's choice of view to work with. The beams of the automated lighting fixtures (102,103,104) can be seen to be converging on one point (108), which is the plan view representation of the figure 1 beam convergence point (8). The present invention also recognises that the items of interest, or focal points (29,30), from figure 1 are also represented upon the stage (129,130) within figure 2.
A representation of a user's hand (201) is shown above the touch sensitive user interface (200), which details the ability of the user's hand gesticulation to move the landing positions of three beams of light upon a stage (202,203,204) to three new positions (207,208,209), such that the radius (206) of an arc (205) drawn through the beams of light (202,203,204) can be seen to be reduced down to a smaller radius (210) of a smaller arc (211), achieved by the user arching their hand and dragging the beams closer together. This demonstration of the user's hand gestures to control a plurality of automated lighting fixtures can be applied not only to the representation of the beams as seen in three dimensional space as depicted in figure 1, but also to any two dimensional elevation looking at the same stage as depicted in figure 2.
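A minimal sketch of the pinch interaction described above, assuming the interface tracks the beams' landing points in stage coordinates: each point is pulled towards the centroid of the selection by the ratio of the new finger spread to the old one, shrinking the arc radius proportionally. The coordinates are illustrative.

```python
def scale_towards_centroid(landing_points, pinch_ratio):
    """Pull each beam's landing point towards the centroid of the selection by
    pinch_ratio (new finger spread / old finger spread), shrinking the arc."""
    n = len(landing_points)
    cx = sum(x for x, _ in landing_points) / n
    cy = sum(y for _, y in landing_points) / n
    return [(cx + (x - cx) * pinch_ratio, cy + (y - cy) * pinch_ratio)
            for x, y in landing_points]

beams = [(2.0, 6.0), (6.0, 7.0), (10.0, 6.0)]
print(scale_towards_centroid(beams, pinch_ratio=0.5))   # beams pulled halfway in
```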

Claims (19)

  1. An automated lighting control system for the programming and instruction of automated lighting fixtures comprising: a touch sensitive user interface upon which the user may gesticulate the direction and orientation of a singularity or a plurality of beams of light being emitted from one or more automated lighting fixtures, through the manipulation of indicia of either the beams of light, a convergence point for beams of light, or the entities upon the stage area upon which the beams of light are to be focused; wherein the stage area shown upon the touch sensitive surface may be a two dimensional representation, perhaps in either plan or front elevation views, allowing the user to manipulate the plurality of automated lighting fixtures, as well as facilitating the ability to switch views through a recognised touch screen gesture, thus allowing the user to view and interact with the plurality of automated lighting fixtures through the deployment of control position data derived from the simple two dimensional Cartesian co-ordinates, with relation to the view being employed by the user and controlled by the touch screen operation, converted into polar co-ordinate instructions that are then relayed to the automated lighting fixtures.
  2. An automated lighting control system for the programming and instruction of automated lighting fixtures as claimed in claim 1, such that a plurality of automated lighting fixtures may be controlled through the user's hand placement on the touch sensitive user interface. In such a manner specific fingers or thumbs may be utilised to control a start and end grouping or bound to a plurality of automated lighting fixtures, such that the plurality of automated lighting fixtures' beams may be moved together through both group-acquiring fingers moving together. Such a system may facilitate the spreading or fanning out of the beams, or the bringing together, through an increase or decrease in the physical distance between the first and last fingers being utilised. Furthermore the beams of the plurality of automated lighting fixtures may be rotated around a physical point by the rotation of the hand and subsequent selection fingers. The controlling point in three dimensional space, with regards the vertical index, around which each of the beams of the plurality of automated lighting fixtures may or may not converge, may also be controlled by a third finger's motion. Finally the positioning of the specific plurality of automated lighting fixtures' beams, and where they fall on the stage or performance area, may be controlled by the movement of all of the fingers travelling across the touch sensitive surface. In such a manner the user may grab control of a plurality of automated lighting fixtures and, using their hand gestures, manipulate the beams in three dimensional space with reference to each other and the position of the user's hands upon the touch sensitive user interface.
  3. An automated lighting control system for the programming and instruction of automated lighting fixtures as claimed in claim 1, such that a plurality of indicia representative of the real life objects deployed within the performance area may be displayed on a screen and manipulated by the user of the system, such that the system may automatically create a number of groups of pluralities of beams of light that may be associated with the real life object in the performance area. Furthermore, if a plurality of beams of light being emitted from automated lighting fixtures are focused on the real life stage object, and the real life stage object is moved in physical space, then the representative indicia on the automated lighting control system's screen may be moved by the user, in a direction determined by the user to represent the movement of the real life stage object; thus the plurality of beams of light assigned to illuminate the real life stage object will move accordingly to the new position of the real life stage object.
  4. An automated lighting control system as claimed in claim 1 or 2, where the user may define and control the operating extremities of the stage area through end point definitions that represent the corners or boundaries of the performance area, such end point boundaries representing unique mathematical reference points for each automated lighting fixture's pan and tilt control values, such that the location of each automated lighting fixture may be established with reference to the operating environment of the stage or performance area. In such a manner the user may move an indicia upon the control system's visual interface, representative of the cross point in three dimensional space of a plurality of beams of light from automated lighting fixtures, such that the indicia effects the control of the polar co-ordinates of the automated lighting fixtures, such that their beams' convergence is kept constant, but at a cross point controlled by the user of the automated lighting control system.
  5. An automated lighting control system as defined in any preceding claim, that allows the user to define rules relating to the control of a plurality of automated lighting fixtures, such that if the beams of the automated lighting fixtures cross or land on a specific area within the stage or performance area, as defined by pre-determined boundary co-ordinates, the output of specific attributes of those automated lighting fixtures changes during such boundary incursion. In such a manner areas around or on the peripheries of the performance or stage area may be protected from the beams of the plurality of automated lighting fixtures landing with specific attributes applied.
  6. An automated lighting control system as defined in any preceding claim that allows the user to effect a change in physical height between some or all of the plurality of automated lighting fixtures and the boundary positions on the performance or stage area, thus allowing an increase or decrease in the operating heights of the automated lighting fixtures to be calculated and thus applied to all operational cue information to be effected throughout a show or performance. Furthermore, if the location of the user, or of the lighting control system, is known, perhaps through Global Positioning Systems in a third party device or product linked wirelessly to the lighting control system, then the stage or performance boundary area co-ordinates may be applied according to the location; likewise the vertical heights of the plurality of automated lighting fixtures may be predetermined and calculations to the performance's cueing data may be implemented prior to, or during, the user's interaction with the lighting control system.
  7. An automated lighting control system as defined in any preceding claim that allows the user, through the manipulation and selection of a specific section of the stage or performance area, as defined by a performance boundary previously determined by the user and programmed into the automated lighting control system, to effect that all the beams of the plurality of automated lighting fixtures that are illuminating the selected specific section of the performance area may have their beams controlled and set to a specific attribute of the automated lighting fixtures. In such a manner the user may change the colour, or projection symbols, for half or a proportion of the performance area, without specifically knowing which automated lighting fixtures are illuminating it at that point in time.
  8. An automated lighting control system as defined in any preceding claim that allows for the control system to calculate the polar co-ordinate offset for a plurality of automated lighting fixtures, with regard to their specific locations in three dimensional space, based on previously declared physical spacing between automated lighting fixtures within the three dimensional performance space, such that if the user indicates what is required to reposition one automated lighting fixture in three dimensional space, the system can suggest a polar co-ordinate offset to apply to each automated lighting fixture in turn, so that the console can assist the user by prepositioning each of the selected fixtures at an appropriate point in space, and on the representative user interface, based on the first fixture's movement execution to a reference point in that three dimensional space. Such a movement may be temporarily stored within the automated lighting control system, such that the automated lighting fixtures' positions may be known to the control system to be in different three dimensional space locations at different points in time throughout the show's duration.
  9. An automated lighting control system as defined in any preceding claim that allows for the user to determine the height or length of beam from a single automated lighting fixture, by referencing a specific attribute or function of the light, such that the beam of the automated lighting fixture may be focused using a projection symbol, which will reveal the focal length of the beam from the automated lighting fixture itself to the surface upon which the projection symbol is being illuminated. In such a manner the automated lighting control system may gain an understanding of the length of the specific automated lighting fixture's beam of light being emitted, and thus calculate the height of the automated lighting fixture from the projected surface, thereby establishing one dimension of the automated lighting fixture's position in three dimensional space, the other two dimensions being known by the automated lighting console through analysis of the automated lighting fixture's pan and tilt values.
  10. An automated lighting control system as defined in any preceding claim that allows for the user to place a third party device that contains a light meter in the form of a camera and a wireless communications means, such that the third party device may communicate an increase in light level intensity being received through the light meter, or camera, within the device, back to the automated lighting control system, such that the automated lighting control system may employ a sequence of control in which a plurality of automated lighting fixtures may, either individually or as a group, move across the stage or performance area in a predetermined pattern, allowing the third party device to report back to the automated lighting system when the beam of an individual automated lighting fixture is passing over the top of the third party device. In such a manner the automated lighting control system may calculate, for at least two dimensions (pan and tilt), where each of the automated lighting fixtures is located in 3D space. The third party device may also be deployed by the user to specific positions on the stage or performance area to effect a focus of a plurality of automated lighting fixtures upon that stage position, which may be represented on the automated lighting control system by representative indicia, or may permit the user to mark out the boundaries of the stage or performance area through positioning the third party device at the physical edges of the stage or performance area and identifying and reporting back when the beams of the lights are at these temporal and user defined extents.
  11. An automated lighting control system as defined in any preceding claim that allows for the user to place a third party device that contains a light meter in the form of a camera and a wireless communications means, such that the third party device may utilise the camera or light meter function and communicate back to the automated lighting control system the colour temperature of the light being received into the third party device. In a similar manner the third party device may be utilised to identify the illuminated reflective colour of a singularity or plurality of automated lighting fixtures' beams as they fall on the surface where the third party device may be located. In the case where a plurality of automated lighting fixtures are illuminating a single surface, and one automated lighting fixture's colour output is incorrect, perhaps owing to the age of the light source, the third party device may be utilised to talk back to the console, identify the specific automated lighting fixture, and adjust that fixture's beam colour output accordingly until a match is achieved as viewed through the camera or light meter of the third party device. In a similar manner the user may utilise the camera or light meter functionality of the third party device to acquire the colour or image of an object and pass this back to the automated lighting control system, such that the output from a plurality of automated lighting fixtures may be tailored to the colour or projection symbol identified and captured by the third party device.
  12. An automated lighting control system as defined in any preceding claim that allows for the user to deploy a camera system either above the stage or performance area, or at the front of the stage or performance area, or a plurality of camera systems with low light functionality and performance, to allow a visual closed feedback loop to be achieved to the automated lighting console, such that the automated lighting console may control a specific plurality of automated lighting fixtures so that they may follow an object or actor across the stage or performance area, by analysing the motion of the object or actor on the camera's video output, and calculating continuously updated movement commands for each of the specific plurality of automated lighting fixtures assigned to the specific actor. In such a manner, groups of automated lighting fixtures may be seen to appear to track and follow specific actors in a live fashion. The tracking system may be augmented by the automated lighting control system by the deployment of a spectral colour change to calculate which beams from which plurality of automated lighting fixtures are illuminating which objects or actors, and to ensure that the correct automated lighting fixture is illuminating the correct object or actor in the correct colour accordingly.
  13. An automated lighting control system as defined in any preceding claim that allows for the user to deploy a reflective tape on the stage surface, which may be deployed to mark out the limits of the plurality of automated lighting fixtures' travel. Such reflective tape may work well under low level or infra red light frequencies, such that a camera deployed above the stage or performance area may be able to visually distinguish the bounds of the performance area, such that if a singularity of automated lighting fixtures' beams travels past or touches the edge of the boundary delineated by the reflective tape, the automated lighting control system may identify the specific automated lighting fixture and prevent the beam of the automated lighting fixture from crossing the boundary as defined by the reflective tape.
  14. An automated lighting control system as defined in any preceding claim that allows for the user to control the tracking of an object or actor by the beams of a specific plurality of automated lighting fixtures. Such motion of the object or actor within the stage or performance area may be augmented by the deployment of a third party device upon the object that contains an accelerometer or Inertial Measurement Unit (IMU) to augment the data being presented visually via the plurality of camera systems pointing at the stage or performance area. Alternatively an active camera system incorporating laser range finding and object tracking may be deployed at a known location in three dimensional space relative to the plurality of automated lighting fixtures, and pass its object track and range data back to the automated lighting system to facilitate the convergence of a plurality of automated lighting fixtures' beams upon the subject being targeted by the laser ranging and object tracking system.
  15. An automated lighting control system as defined in any preceding claim that allows for the user to take a video camera feed into the automated lighting control system, and to calculate where the camera that is acquiring the feed is located, through the deployment of changes to specific automated lighting fixtures within the plurality of automated lighting fixtures being deployed to illuminate the performance area. In such a manner the user may implement a rule based logic, such that wherever the video camera feed is looking, the automated lighting console will respond by ensuring that the persons or objects illuminated in the field of view of the camera producing the video feed will be illuminated by the specific automated lighting fixtures active within the camera's frustum with a specific type or effect of light as defined by the user.
  16. An automated lighting control system as defined in any preceding claim that allows for the user to attach to the system a plurality of camera systems, such that the cameras' imagery is utilised to capture a snapshot in real time of the show or event's performance from a plurality of angles, such that this video information can then be used to reference specific automated lighting fixtures' actions and the behaviour of other multimedia assets, and their associated performance at the point relevant to the frame of camera video data that is being received by the system. In such a manner the automated lighting control system can be seen to effect a single point of reference for all visual, audio, and mechanical elements of the performance as reviewed by the cameras that are deployed to observe the stage or performance area. Furthermore the system may augment the video footage of the show or event by recording every button press on the console, such that at any chronological point throughout the show or event, the video being relayed into the system is augmented by an exacting tracked list of controlling functions that have been deployed by the user.
  17. A method as claimed in claim 16, wherein the system may be used as an offline visual editing tool, such that the user may return to a specific point in the proceedings of the event or show and be able to switch the system into the exact functional settings that the control system was in at the previously recorded point in the event or show. As such, any specific point in the show or event may be reached through the video recall method, and the automated lighting control system will output to the attached plurality of automated lighting fixtures the correct data to place them in the exact state associated with the chronological position of the archived video. Once the offline scene has been recreated from the automated lighting fixture data at the chronological point of the video, the user may then change specific parameters of the plurality of automated lighting fixtures.
  18. A method as claimed in claim 16, wherein the user may capture the effects of the beams of the plurality of automated lighting fixtures being executed by the automated lighting control system at a point in the chronological history of the video associated with the show or event, recreate the effects of the beams of the plurality of automated lighting fixtures, give this effect a moniker or name, and then apply this look or effect to a new show or event, with the automated lighting control system being able to capture and apply the mathematical function matrix that was applied to the plurality of moving lights, and to hide this complexity through the association of a moniker or name.
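Capturing an effect under a moniker can be sketched as storing the generating parameters in a named library entry and re-evaluating them for a different rig. The circular pan/tilt sweep with per-fixture phase offsets below is an assumed effect model, not the function matrix used by the claimed system.

```python
import math

# Illustrative sketch: store the parameters that generate a beam effect under a
# human-readable name, then replay the named effect on a different set of fixtures.

EFFECT_LIBRARY = {}

def capture_effect(name, radius_deg, period_s, phase_offsets):
    """Record the generating parameters of an effect under a user-chosen name."""
    EFFECT_LIBRARY[name] = {
        "radius_deg": radius_deg,
        "period_s": period_s,
        "phase_offsets": phase_offsets,     # per-fixture phase, in radians
    }

def apply_effect(name, fixtures, t_seconds):
    """Evaluate the named effect for a new set of fixtures at time t."""
    e = EFFECT_LIBRARY[name]
    out = {}
    for i, fixture in enumerate(fixtures):
        # Reuse the stored phase offsets cyclically if the new rig is larger.
        phase = e["phase_offsets"][i % len(e["phase_offsets"])]
        angle = 2 * math.pi * t_seconds / e["period_s"] + phase
        out[fixture] = {
            "pan_offset": e["radius_deg"] * math.cos(angle),
            "tilt_offset": e["radius_deg"] * math.sin(angle),
        }
    return out

if __name__ == "__main__":
    capture_effect("slow_circle", radius_deg=10.0, period_s=8.0,
                   phase_offsets=[0.0, math.pi / 2, math.pi])
    print(apply_effect("slow_circle", ["wash_1", "wash_2"], t_seconds=2.0))
```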
  19. A method as claimed in claim 16, wherein the user may capture the video data of the show or event and overlay it with the live video data, such that the user is able to see the differences between the old and the new lighting output and its effects upon the stage or performance area. In such a manner the user would be able to record a video of the performance or event with no effects lighting present, and then, after the completion of the recording of the event, be able to implement lighting effects and view them on the combined recorded and live video recreation of the event itself.
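The overlay comparison can be illustrated with a simple frame blend and a difference mask. The NumPy arrays below are stand-ins for decoded video frames, and the blend weight and threshold are arbitrary choices for the sketch.

```python
import numpy as np

# Minimal sketch: overlay the archived "no effects" recording with the newly lit
# output so the operator can judge the difference between old and new lighting.

def overlay(recorded_frame, relit_frame, alpha=0.5):
    """Blend two equally sized RGB frames; alpha weights the relit frame."""
    recorded = recorded_frame.astype(np.float32)
    relit = relit_frame.astype(np.float32)
    blended = (1.0 - alpha) * recorded + alpha * relit
    return blended.clip(0, 255).astype(np.uint8)

def difference_mask(recorded_frame, relit_frame, threshold=30):
    """Boolean mask of pixels where the new lighting visibly changed the image."""
    diff = np.abs(recorded_frame.astype(np.int16) - relit_frame.astype(np.int16))
    return diff.max(axis=-1) > threshold

if __name__ == "__main__":
    # Two dummy 720p frames: a dark recording and a brighter relit version.
    recorded = np.full((720, 1280, 3), 40, dtype=np.uint8)
    relit = np.full((720, 1280, 3), 120, dtype=np.uint8)
    print(overlay(recorded, relit).mean(), difference_mask(recorded, relit).mean())
```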
GB1201585.5A 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition Withdrawn GB2500566A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1201585.5A GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition
GB1607976.6A GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system
GB1301762.9A GB2499123B (en) 2012-01-31 2013-01-31 Lighting control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1201585.5A GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition

Publications (2)

Publication Number Publication Date
GB201201585D0 GB201201585D0 (en) 2012-03-14
GB2500566A true GB2500566A (en) 2013-10-02

Family

ID=45876343

Family Applications (3)

Application Number Title Priority Date Filing Date
GB1201585.5A Withdrawn GB2500566A (en) 2012-01-31 2012-01-31 Automated lighting control system allowing three dimensional control and user interface gesture recognition
GB1301762.9A Active GB2499123B (en) 2012-01-31 2013-01-31 Lighting control system
GB1607976.6A Active GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system

Family Applications After (2)

Application Number Title Priority Date Filing Date
GB1301762.9A Active GB2499123B (en) 2012-01-31 2013-01-31 Lighting control system
GB1607976.6A Active GB2535909B (en) 2012-01-31 2013-01-31 Lighting control system

Country Status (1)

Country Link
GB (3) GB2500566A (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016198556A1 (en) 2015-06-09 2016-12-15 Feeney Liam A visual tracking system and method
US10330292B2 (en) 2015-07-08 2019-06-25 Production Resource Group Llc Device for controlling a remotely located luminaire
EP3338517B1 (en) 2015-08-20 2023-06-07 Signify Holding B.V. Spatial light effects based on lamp location
DK179593B1 (en) 2016-06-12 2019-02-25 Apple Inc. User interface for managing controllable external devices
AT519679A1 (en) 2017-02-27 2018-09-15 Zactrack Gmbh Method for calibrating a rotating and pivoting stage equipment
EP3393213B1 (en) 2017-04-03 2022-10-12 ROBE lighting s.r.o. Follow spot control system
US10678220B2 (en) 2017-04-03 2020-06-09 Robe Lighting S.R.O. Follow spot control system
CN107172776B (en) * 2017-05-27 2023-09-01 杭州罗莱迪思控制系统有限公司 Device and method for determining lighting effect by audience in night scene intelligent lighting system
US10904628B2 (en) 2018-05-07 2021-01-26 Apple Inc. User interfaces for viewing live video feeds and recorded video
GB2581249B (en) * 2018-12-10 2021-10-20 Electronic Theatre Controls Inc Systems and methods for generating a lighting design
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
US10904029B2 (en) 2019-05-31 2021-01-26 Apple Inc. User interfaces for managing controllable external devices
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US20240035648A1 (en) * 2022-07-29 2024-02-01 Electronic Theatre Controls, Inc. Method for creating xyz focus paths with a user device
US11805588B1 (en) * 2022-07-29 2023-10-31 Electronic Theatre Controls, Inc. Collision detection for venue lighting

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6255787B1 (en) * 1997-10-23 2001-07-03 High End Systems, Inc. Automated lighting control system utilizing laser and non-laser light sources
US20080140231A1 (en) * 1999-07-14 2008-06-12 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for authoring and playing back lighting sequences
US20050077843A1 (en) * 2003-10-11 2005-04-14 Ronnie Benditt Method and apparatus for controlling a performing arts show by an onstage performer
US7221109B2 (en) * 2004-11-18 2007-05-22 Robert Toms Stage lighting console
EP1946618B9 (en) * 2005-11-01 2013-12-25 Koninklijke Philips N.V. Method, system and remote control for controlling the settings of each of a multitude of spotlights
US20070174773A1 (en) * 2006-01-26 2007-07-26 International Business Machines Corporation System and method for controlling lighting in a digital video stream

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08138871A (en) * 1994-05-10 1996-05-31 Shiataa Design:Kk Remote control for follow spotlight
JPH11251074A (en) * 1998-02-27 1999-09-17 Matsushita Electric Works Ltd Tracking lighting system
US20100060607A1 (en) * 2004-02-13 2010-03-11 Ludwig Lester F User interface mouse with touchpad responsive to gestures and multi-touch
US20090190327A1 (en) * 2008-01-28 2009-07-30 Michael Adenau Method For Operating A Lighting Control Console And Lighting Control Console
US20100238127A1 (en) * 2009-03-23 2010-09-23 Ma Lighting Technology Gmbh System comprising a lighting control console and a simulation computer

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2535135A (en) * 2014-11-20 2016-08-17 Ambx Uk Ltd Light Control
GB2535135B (en) * 2014-11-20 2018-05-30 Ambx Uk Ltd Light Control
WO2017067764A1 (en) 2015-10-19 2017-04-27 Philips Lighting Holding B.V. Harmonized light effect control across lighting system installations
WO2017194351A1 (en) * 2016-05-09 2017-11-16 Philips Lighting Holding B.V. Large area lighting aiming
CN110582146A (en) * 2018-06-08 2019-12-17 罗布照明公司 follow spot lamp control system
CN110582146B (en) * 2018-06-08 2022-03-25 罗布照明公司 Follow spot lamp control system
CN111901947A (en) * 2020-08-03 2020-11-06 广州彩熠灯光股份有限公司 Stage light beam effect control method, system, device and medium
CN113783993A (en) * 2021-09-10 2021-12-10 广州艾美网络科技有限公司 Stage lighting control method and device and stage lighting system
CN113783993B (en) * 2021-09-10 2022-11-04 广州艾美网络科技有限公司 Stage lighting control method and device and stage lighting system

Also Published As

Publication number Publication date
GB201301762D0 (en) 2013-03-20
GB201201585D0 (en) 2012-03-14
GB2535909B (en) 2017-02-08
GB2499123B (en) 2016-08-03
GB201607976D0 (en) 2016-06-22
GB2499123A (en) 2013-08-07
GB2535909A (en) 2016-08-31

Similar Documents

Publication Publication Date Title
GB2500566A (en) Automated lighting control system allowing three dimensional control and user interface gesture recognition
US10306134B2 (en) System and method for controlling an equipment related to image capture
US10317775B2 (en) System and techniques for image capture
Underkoffler et al. Emancipated pixels: real-world graphics in the luminous room
KR101187500B1 (en) Light projection device and illumination device
JP5652705B2 (en) Dimming control device, dimming control method, and dimming control program
US20180247463A1 (en) Information processing apparatus, information processing method, and program
KR20200123049A (en) Virtual reality control system
JP2010522922A (en) System and method for tracking electronic devices
EP3393213B1 (en) Follow spot control system
US20120236158A1 (en) Virtual directors' camera
GB2581249A (en) Systems and methods for generating a lighting design
CN110582146A (en) follow spot lamp control system
KR101242089B1 (en) Interactive stage system apatrtus and simulation method of the same
JP2000132306A (en) Device and method for inputting information and game device
JP5258387B2 (en) Lighting device, space production system
US20200184222A1 (en) Augmented reality tools for lighting design
CN107798723B (en) Target tracking control method and device
CN111064946A (en) Video fusion method, system, device and storage medium based on indoor scene
CN111050128A (en) Video fusion method, system, device and storage medium based on outdoor scene
CN115187108A (en) Distributed color ranking method and system based on virtual stage
JP4546953B2 (en) Wheel motion control input device for animation system
JP5842144B2 (en) Dimming control device, dimming control method, and dimming control program
JP2021105639A5 (en)
US11805588B1 (en) Collision detection for venue lighting

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)