GB2354602A - Digital controlling system for electronic lighting devices - Google Patents


Info

Publication number
GB2354602A
GB2354602A (Application GB9920969A)
Authority
GB
Grant status
Application
Patent type
Prior art keywords
audio information
used
lighting devices
electronic lighting
activity indicators
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9920969A
Other versions
GB9920969D0 (en)
Inventor
Peter Stefan Jones
Original Assignee
Peter Stefan Jones
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63J DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J17/00 Apparatus for performing colour-music
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHTING NOT OTHERWISE PROVIDED FOR
    • H05B37/00 Circuit arrangements for electric light sources in general
    • H05B37/02 Controlling
    • H05B37/0209 Controlling the instant of the ignition or of the extinction
    • H05B37/0227 Controlling the instant of the ignition or of the extinction by detection only of parameters other than ambient light, e.g. by sound detectors, by passive infra-red detectors
    • H05B37/0236 Controlling the instant of the ignition or of the extinction by detection only of parameters other than ambient light, e.g. by sound detectors, by passive infra-red detectors, by detection of audible sound

Abstract

A digital real-time controlling system for electronic lighting devices that analyses audio information to determine key indicators such as timing, texture, feel, timbre and style. The analysed information can be used to control electronic lighting devices and to schedule sets of commands for electronic lighting devices based on analysis of the audio information. The result is a digital real-time electronic lighting device controller that synchronises lighting control to the audio performance sympathetically with a high degree of autonomy, yet still provides a varied and context sensitive lighting show.

Description

DIGITAL CONTROLLING SYSTEM FOR ELECTRONIC LIGHTING DEVICES

Field

This invention relates to a digital controlling system for electronic lighting devices.

Background

Electronic lighting systems have developed greatly in both complexity and performance over the past few years. Most stage, TV or night club lighting systems operate with a number of very versatile, electronically operated lighting fixtures that provide exciting visual effects such as panning, tilting, flashing and rotating. The entirety of individual fixtures - normally combined with a lighting console - comprises a lighting system. Lighting systems are in the main operated via a lighting console, which provides commands to each of the fixtures and can be programmed to achieve certain effects at specific times. Very powerful effects can be achieved via pre-programmed lighting routines, which are normally programmed to complement a musical or sound performance. Lighting Directors and/or console operators use faders and other controls on the console to create a show that uses lighting, smoke and other visual effects during the performance to enhance or highlight parts of the audio performance. It is important that the lighting effects are in time with the audio performance to maximise the effect, and one of the main aspects of a well-created lighting show is the manner in which it relates to an audio performance.

A lighting operator may create a lighting show by creating routines, saving them on the console and overlaying them until the entire lighting show is synchronised. These types of pre-programmed lighting routines can provide lighting effects that are perfectly synchronised to the audio performance when the audio performance is provided by tape, record, computer, CD or other forms of absolute reproduction. This is because the lighting routines are programmed against an accurate time code generated by the console or some external source. In many pre-set performances, a lighting operator will also use impromptu manual controls on the console in conjunction with the pre-programmed routine, to create a more live feel to the pre-set lighting show. Pre-programmed routines work well for predetermined time-accurate performances as above, but by nature cannot cope well with performances where the audio may change style, timing, timbre or feel without any, or with little, warning. This happens more often than is commonly thought: musicians may add or leave out musical phrases, or may extend a song. Even in the nightclub environment, where the music is essentially based on the same repetitive beat, a DJ may scratch or spin a record for audio effect that changes the essential beat of the performance. Part of a DJ's role is to change records according to how he or she feels the audience will react; in this case the timbre or feel of the music is paramount, and the lighting effects can make or break the effect a DJ is trying to achieve.

Commonly a lighting operator will use a combination of pre-programmed routines and manual control of the lighting console to provide a light show that is sympathetic to a live audio performance. This may involve setting a general colour theme for a particular part of an audio performance with individual fixtures set to flash or move in time with the beat, perhaps by setting the fixture to 'sound to light' mode or by setting an automatic flash on the console which is adjusted to be in time with the audio performance. Some consoles offer a 'time bend' facility so that the operator can adjust the timing of a pre-set lighting show to keep it in time with the audio performance. Different pre-set lighting themes are available from the console so that an operator can switch themes in time with the audio performance. Naturally the amount of live effects an operator can add is limited and live effects are prone to timing problems.

Lighting Directors and/or operators are often employed for larger audio performances of all musical types, whereas the smaller performance is more likely to rely on the individual sound-synchronising capabilities of individual lighting fixtures, which are minimal. This is relevant both to temporary or mobile stage performances and to fixed installations such as night-clubs or studios. Even in larger clubs or studios, it is more often the case that the entire lighting system is used to only a fraction of its capabilities, with operators usually changing pre-set themes to change the lighting mood, with a degree of manual override for extra lighting effects such as strobes or lasers.

A lighting show intended to enhance an audio performance needs to be in time and in sympathy with the audio whilst offering a myriad of effects that regularly change. Performers and lighting operators are constantly striving for new special effects and closer integration of sound and light to create more effective overall performances. Current lighting systems do little to help an operator tie the lighting performance to the timing and feel of the audio, preferring to base events on a fixed time basis.

Further to this, current lighting systems do not have the ability to examine an audio performance to enable it to generate a light show that is linked to the timing, feel, and texture of an audio performance and create a sympathetic lighting show.

In addition, lighting consoles are dedicated pieces of equipment, normally expensive and cumbersome; for smaller artists or lighting operators it is impracticable to include a lighting system in a music-making set-up or studio, meaning that they cannot arrange or produce the light show as they may do with music.

SUMMARY OF THE INVENTION

A first aspect of the invention provides a digital real-time controlling system for electronic lighting devices comprising a computer 70, one or more information input devices 3 to receive and convert audio information from audio sources 1, one or more controlling output devices 60 to control electronic lighting devices 62, and a software program 80.

The software program 80 analyses the audio information from the information input devices 3 to determine key indicators including, but not limited to, the timing, timbre, instrument identification, style, amplitude, frequency, rate-of-change, duration, tone, occurrence and feel of an audio performance. The resulting analytical data is used to create triggers in the form of data bus 10, which can be used as a time clock for sequenced events and commands. Additionally, the analytical data can be used to determine the texture or feel of the audio information to create additional commands, event triggers or schedules. The analytical data, or derivatives of it, can also be patched to any controllable feature of any lighting device 62. The result is a digital real-time electronic lighting device controller that synchronises lighting control to the audio performance with a high degree of autonomy, yet still provides a varied and context sensitive lighting show.

Another aspect of this invention is the use of data bus 10, which can be treated with logical, mathematical or other functions and used to command the system to change overall lighting effects in synchronicity with the audio performance.

Yet another aspect of the invention is the creation and recognition of a real-time time code that contains key indicator information relating directly to the audio performance. This aspect of the invention provides a means of remotely operating equipment via a suitable medium, for example the internet or wireless, and further, such time code or data stream may be embedded or included in musical or other recordings or saveable data formats.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other more detailed and specific objects and features of the present invention are more fully disclosed in the following specification, reference being had to the accompanying drawings, in which:

FIGURE 1 is a block diagram of a general embodiment of the system, with a more detailed preferred embodiment of the input and output sections of the system.

FIGURE 2 is a detailed block diagram of a preferred embodiment of the audio analysing and trigger generation section of the system.

FIGURE 3 is a detailed block diagram of a preferred embodiment of the first level of operational control, "performance".

FIGURE 4 is a detailed block diagram of a preferred embodiment of the second level of operational control, "act".

FIGURE 5 is a detailed block diagram of a preferred embodiment of the third level of operational control, "scene".

FIGURE 6 is a detailed block diagram of a preferred embodiment of the fourth level of operational control, "mood".

FIGURE 6b is a more detailed block diagram of a preferred embodiment of one aspect of the fourth level of operational control, "mood".

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Figure 1 shows a preferred embodiment of the system, comprising: a computer 70 on which the analysing and controlling software 80 resides; one or more information input devices 3, which are electronically linked to one or more audio sources 1 via one or more connections 2; and one or more output devices 60, electronically linked via one or more connections 61 to electronic lighting equipment 62, which may be one or more individual lighting equipment devices. The audio sources 1 can be in analogue or digital electronic form, such as are generated by microphones or other acoustic transducers, audio mixing or treating devices, broadcast, computer generated or as generated by any other sound generator devices. Audio sources 1 may be separate sources, such as one source for each instrument or musical element of an audio performance, or composite audio sources of an audio performance, such as stereo left and right, or any combination thereof. Connections 2 can be electrically conductive wires, fibre optic cables, wireless links or any combination thereof. Furthermore, connections 2 may utilise a combination of communications interfaces such as the Internet or modem connections.

Input devices 3 may be computer sound cards, multiple-input DSP cards or other internal or external audio input devices. Input devices 3 may feature one or more audio inputs, and such inputs may be in analogue or digital form via wire, fibre optic cable, wireless links or any combination thereof. The resulting data stream from input devices 3 forms input data bus 4.

The following FIGURES illustrate processes undertaken by, and preferred embodiments of, the software system 80 of the present invention.

As illustrated in FIGURE 2, input data bus 4 is fed to the audio analysing section 5 of the software system 80. The audio analysing section 5 of the preferred embodiment of the present invention is an FFT spectral analyser, but can be any form of analyser including, but not limited to, sampling, wavelet, or pattern identification analysers. Data from the spectral analyser 5 forms spectral data bus 6.
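The patent contains no code, but the spectral analysing stage can be sketched as a function that reports the energy in chosen frequency bands of one audio frame. This is an illustrative reconstruction only: the function name `band_energies` and the band layout are assumptions, and a naive DFT stands in for the FFT the patent names.

```python
import cmath
import math

def band_energies(frame, sample_rate, bands):
    """Energy of one audio frame in each (low_hz, high_hz) band.

    Sketch of the spectral analyser (5); its per-band output would feed
    spectral data bus 6. A naive DFT is used for clarity; a real
    implementation would use an FFT, as the patent states.
    """
    n = len(frame)
    energies = []
    for lo, hi in bands:
        total = 0.0
        for k in range(n // 2 + 1):           # only bins up to Nyquist
            f = k * sample_rate / n
            if lo <= f < hi:
                x = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
                total += abs(x) ** 2
        energies.append(total)
    return energies

# A 500 Hz tone should concentrate its energy in the 300-700 Hz band.
sr = 8000
frame = [math.sin(2 * math.pi * 500 * t / sr) for t in range(256)]
energies = band_energies(frame, sr, [(0, 300), (300, 700), (700, 4000)])
```

Each trigger channel described below would then watch one such band rather than the raw waveform.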

A plurality of trigger channels 7 receive analysed information from spectral data bus 6, whereupon parameters can be changed and functions performed by function & parameter controls 7.1, and triggers generated by trigger generator 7.2. An important function of trigger channels 7 is to monitor a specified frequency band of the audio information from sources 1 and create triggers according to the function & parameter control 7.1 settings. The inventor recognises that the extent to which the function & parameter controls 7.1 monitor and affect the analysed information 6 is of great benefit to the ability to create clear and relevant triggers via trigger generators 7.2. Accordingly, the function & parameter controls 7.1 comprise adjustable functions, parameters and features including, but not limited to: central frequency to monitor; 'Q' or width of the central frequency band; amplitude or level adjustment and compression; minimum and maximum threshold; variable noise gate; and other sound separating or identifying techniques. Furthermore, the function & parameter controls 7.1 may contain sample and lookup functions whereby certain sets of monitored audio information are compared to a library of stored sets for identification and eventual trigger generation. Function & parameter controls 7.1 also provide the facility to receive data directly from a specified input or plurality of inputs of input device 3, thereby by-passing spectral analyser 5; this Direct Injection (DI) mode enables one or more specified audio sources 1 to be fed directly to one or more specified trigger channels 7. Trigger generators 7.2 create a trigger based on the information that has been affected and/or treated according to the function and parameter settings of the function & parameter controls 7.1. Triggers from the trigger generators 7.2 form data bus 10.
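A minimal trigger channel can be sketched as follows. The class name, the single fixed threshold and the simple re-arm gate are assumptions standing in for the patent's much richer function & parameter controls 7.1 (Q, compression, noise gate, sample-and-lookup, etc.); only the band-watching and trigger-firing behaviour comes from the text.

```python
class TriggerChannel:
    """Sketch of one trigger channel (7): monitor one frequency band of
    the analysed spectrum and fire a trigger on an upward threshold
    crossing. The channel re-arms once the level drops back below the
    threshold, so a sustained loud band fires only once."""

    def __init__(self, low_hz, high_hz, threshold):
        self.low_hz, self.high_hz, self.threshold = low_hz, high_hz, threshold
        self._armed = True

    def process(self, freqs, power):
        """freqs/power: parallel sequences from spectral data bus 6."""
        level = sum(p for f, p in zip(freqs, power)
                    if self.low_hz <= f < self.high_hz)
        if self._armed and level >= self.threshold:
            self._armed = False
            return True                    # trigger placed on data bus 10
        if level < self.threshold:
            self._armed = True
        return False

# A hypothetical kick-drum channel watching 40-120 Hz.
kick = TriggerChannel(40, 120, threshold=5.0)
freqs = [60, 500]                          # one in-band bin, one out-of-band
fires = [kick.process(freqs, p)
         for p in ([1, 9], [9, 9], [9, 0], [1, 0], [8, 0])]
```

Note how the second loud frame does not re-fire: the gate only passes rising edges, which is what makes the trigger stream usable as a beat clock downstream.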

FIGURE 3 illustrates a preferred embodiment of the first level of operational control titled "performance" 20, which is the highest level of operation and contains its own set of sub-levels as described later in this description. A performance 20 contains a timing analyser 20.1 and a set of function, parameter & main controls 20.2. An alternative embodiment also contains a texture detector 31.1.

Data bus 10 provides performances 20 with triggers from trigger channels 7, which are analysed by timing analyser 20.1 to provide timing information to timing bus 25.

Timing analyser 20.1 calculates the main timing elements of the audio sources 1, which include but are not limited to Beats Per Minute (BPM) or time signature, by analysing the triggers in data bus 10. An alternative embodiment analyses the information in spectral data bus 6 to calculate BPM. Timing analyser 20.1 also creates other timing information including, but not limited to, multiples or divisions of BPM, BPM count, and pre or post delay timing information which may be used to compensate for data transmission delays or delays in response by lighting devices 62. Timing analyser 20.1 also includes a beat continuation function which uses BPM timing calculations to maintain a BPM simulation even whilst a section of audio from audio sources 1 is analysed as containing no BPM information. This function enables the software system 80 to continue to provide lighting control that maintains synchronicity with the audio sources 1 even whilst some or all of the elements of audio source 1 being used to define BPM are not present for a period.
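The BPM calculation and beat continuation functions can be sketched in a few lines. The patent does not specify an estimator; taking the median inter-trigger interval is one reasonable choice (it tolerates an occasional missed or spurious trigger) and is an assumption here, as are both function names.

```python
import statistics

def estimate_bpm(trigger_times):
    """Estimate BPM from beat-trigger timestamps (seconds) taken off
    data bus 10, via the median inter-trigger interval."""
    intervals = [b - a for a, b in zip(trigger_times, trigger_times[1:])]
    return 60.0 / statistics.median(intervals)

def continue_beats(last_beat, bpm, count):
    """Beat-continuation sketch: keep projecting beat times while a
    section of the audio carries no BPM information, so lighting
    control stays in step with the performance."""
    period = 60.0 / bpm
    return [last_beat + period * i for i in range(1, count + 1)]

bpm = estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0])   # a steady pulse every 0.5 s
projected = continue_beats(2.0, bpm, 3)
```

The projected beats are what would keep timing bus 25 ticking through a breakdown or a cappella section.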

Furthermore, timing analyser 20.1 contains a BPM counter reset function whereby manual and automatic reset options are available to ensure that the BPM counting function counts its Most Significant Beat (MSB) in synchronicity with that of audio sources 1. Automatic MSB reset may be instigated by a number of options including, but not limited to, time code reset indicators, control instructions, or the process whereby zero BPM information preceding an identified beat is utilised.

Furthermore, timing analyser 20.1 provides and/or reads a number of time codes for remote operation or linked operation of systems; such time codes may include SMPTE or MIDI, or a time code generated by the present invention.

Furthermore, such timecodes may be sent, received, transmitted, saved or embedded in all forms of recording medium. Data from timing analyser 20.1 forms timing bus 25.

Function, parameter & main controls 20.2 comprise a set of main, parameter and function controls for overall system operation (for example start and stop), file management tools such as load and save, and configuration of time code and the overall system. In addition to the above, commands for the control bus 26 relating to manual access or overrides, and communications, are generated by function, parameter & main controls 20.2.

Furthermore, function, parameter & main controls 20.2 provide a function whereby a schedule can be generated to instruct the next level of operation titled "act" 31. This function contains a list of available acts 31, which may be instructed to apply, and a time scale - measured in convenient time units such as minutes or bars - against which act change or select instructions can be set. All data from function, parameter & main controls 20.2 forms control bus 26.
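The act schedule described above amounts to a lookup from the bar time scale to the act currently in force. A minimal sketch, with the function name, the `(start_bar, act)` pair layout and the act names all illustrative assumptions:

```python
def act_for_bar(schedule, bar):
    """Return the act that applies at a given bar count.

    'schedule' lists (start_bar, act_name) pairs against the bar time
    scale; the latest entry whose start has been reached wins."""
    current = None
    for start_bar, act in sorted(schedule):
        if bar >= start_bar:
            current = act
    return current

schedule = [(0, "intro act"), (16, "main act"), (48, "finale act")]
picked = [act_for_bar(schedule, b) for b in (0, 15, 16, 60)]
```

Because the time scale is in bars rather than seconds, the same schedule stays correct when the performance's tempo drifts.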

FIGURE 4 illustrates a preferred embodiment of the second level of operational control titled "act" 31. Timing bus 25 and control bus 26 are fed to act selector 30 to provide a means of selection of acts 31. Control bus 26 provides act selection instructions created in performance 20, and timing bus 25 provides timing information which is correlated by act selector 30 with instructions from control bus 26, which can ensure that act selection is effected in time with the BPM or time indicator of audio sources 1.

Acts 31 comprise a plurality of texture detectors 31.1 and one or more scene arrangers 35, which provide a means of arranging the next level of operation titled "scene" 41. With overall act selection being commanded by act selector 30 as previously explained, the acts 31 create scene selection instructions in two ways; both methods may be based on data supplied by timing bus 25 and data bus 10.

The first method operates as follows: scene arranger 35 contains a list of available scenes 41 and a time scale against which certain scenes 41 may be instructed to apply. The period of time over which scenes 41 are selected by scene arranger 35 is measured in units directly related to the BPM or time signature of audio sources 1 as calculated by timing analyser 20.1 and supplied by timing bus 25. The result is a set of scene change instructions, created by scene arranger 35 and sent to scene bus 36, which are in time with the BPM of audio source 1.

The second method that acts 31 use to create scene selection instructions is to autonomously and automatically select scenes according to the settings of texture detectors 31.1, explained as follows. Texture detectors 31.1 comprise function & parameter controls 31.2 and trigger generator 31.3. Function & parameter controls 31.2 comprise a set of definable filters, function, and parameter settings or options that may be used to identify certain aspects of audio source 1. A first aspect of function & parameter controls 31.2 is to identify certain elements including, but not limited to, the following: beats; treble, middle or specific audio band activity; silence; crescendos or other musical nuances; and specific patterns or instrument characteristics. These elements may be identified by analysing the triggers in data bus 10, by analysing the data in spectral bus 6, or by sample and look-up or other methods, and may be used in inverse to find missing elements. A further aspect of function & parameter controls 31.2 is to allow the specification of a detection time period which directly relates to the BPM of audio sources 1 - available from timing bus 25 - in which to identify the aforementioned elements. Yet another aspect is pre and post delay and duration settings for the detection period, said settings again related to the BPM or time signature of audio sources 1 as available from timing bus 25.
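One concrete reading of a texture detector is a count of triggers inside a detection window whose length is expressed in beats of the current BPM, so the window tracks the performance's tempo; used "in inverse", a zero count flags silence or a missing element. The function name and window convention below are assumptions, not the patent's design.

```python
def band_activity(trigger_times, window_start, window_beats, bpm):
    """Texture-detector sketch (31.1): count triggers from data bus 10
    falling inside a detection window of 'window_beats' beats at the
    BPM supplied by timing bus 25."""
    window_len = window_beats * 60.0 / bpm
    return sum(window_start <= t < window_start + window_len
               for t in trigger_times)

triggers = [0.1, 0.6, 1.1, 1.6]          # hits roughly on 120 BPM beats
active = band_activity(triggers, 0.0, 4, 120.0) > 0   # element present
silent = band_activity(triggers, 4.0, 4, 120.0) == 0  # element missing here
```

A scene-change trigger (31.3) could then fire when `active` flips, e.g. when a kick-drum band falls silent at a breakdown.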

Trigger generator 31.3 additionally provides scene change triggers to scene bus 36 based on information supplied to, and having been affected by, the settings of function & parameter controls 31.2. The scene change instructions created by this method may be set to override those created as described in the first method.

Scene change instructions from both methods described above form scene bus 36.

FIGURE 5 illustrates a preferred embodiment of the third level of operation titled "scene" 41. Control bus 26 and scene bus 36 are fed to scene selector 40 to provide a means of selection of scenes 41 according to selection data as generated by act 31 and command data as created in performance 20.

Scenes 41 comprise function & parameter controls 41.1 and mood arranger 41.2, which provides a means of arranging the next level of operation titled "mood" 51. Mood arranger 41.2 provides a means of selecting, from a list of available options, which moods 51 are to be used for a particular scene 41, and sends mood selection instructions to mood bus 45. Function & parameter controls 41.1 provide a means of defining, from a list of options, how a scene 41 will use the chosen moods. Examples of such options are: use a certain mood 51 every time scene 41(n) is applied, i.e. cycle; or use a certain mood 51 only the first time scene 41(n) is applied, i.e. one-shot. Mood selection instructions from scenes 41 form mood bus 45.
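The cycle and one-shot options can be sketched as a small arranger that hands out the next mood each time its scene is applied. This is a hypothetical reading of mood arranger 41.2; the class name, mode strings and the choice to return `None` after a one-shot are all assumptions.

```python
class MoodArranger:
    """Sketch of mood arranger 41.2: pick the next mood for a scene,
    either cycling through the mood list on every application of the
    scene, or one-shot (first application only)."""

    def __init__(self, moods, mode="cycle"):
        self.moods, self.mode, self.count = list(moods), mode, 0

    def next_mood(self):
        if self.mode == "one-shot" and self.count >= 1:
            return None                 # scene keeps its current lighting
        mood = self.moods[self.count % len(self.moods)]
        self.count += 1
        return mood

cyc = MoodArranger(["blue wash", "red wash"], "cycle")
picks = [cyc.next_mood() for _ in range(3)]
shot = MoodArranger(["strobe burst"], "one-shot")
first, second = shot.next_mood(), shot.next_mood()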

FIGURE 6 illustrates a preferred embodiment of the fourth level of operation titled "mood" 51. Control bus 26 and mood bus 45 are fed to mood selector 50 to provide a means of selection of moods 51 according to selection data as generated by scenes 41 and command data as created in performance 20.

Moods 51 comprise function & parameter controls 51.1 and an output arranger 51.2, which provides a means of arranging data to be sent to lighting devices 62 via output bus 55, output devices 60 and connections 61. A preferred embodiment of the present invention uses a plurality of moods 51 for each individual lighting device 62.

FIGURE 6b illustrates a preferred embodiment utilising DMX control of lighting device 62, which may comprise an individual lighting device or a number of devices connected via DMX. Accordingly, a number of DMX channels 51.2(a) are dedicated to each individual lighting device 62, and in the preferred embodiment at least one DMX channel 51.2(a) is available for each controllable aspect of an individual lighting device 62, for example pan, tilt or strike. Within output arranger 51.2, timing and data buses 25 and 10 are made available to be routed through to DMX channels 51.2(a) as illustrated. Connection node 51.2(b) illustrates a preferred embodiment of a routing form that enables any combination of connections. Output arranger 51.2 makes this routed data available to function & parameter controls 51.1, whereupon each data line may be individually treated by a number of functions and/or parameter controls; such controls may have, but are not limited to, the following options. The first aspect of the aforementioned options is function, including: toggle, slow ramp, fast ramp, logical, counter, fader, sound to light, colour change, gobo change, do nothing, flash, home, blackout or user defined.

The second aspect is parameter, including: on for 'n', off for 'n', over 'n' triggers; AND/NAND/OR and other logical functions; count 'n' triggers; on under threshold; on over threshold; trigger from 'n' while logically high and other logical combinations; and go to colour 'n'. The third aspect is cycle, including: cycle, one shot, forward/back, cycle 'n' times and other cycle connotations. All aspects and options are available in a form that enables countless combinations of options, including inverses and cross-correlation between aspects. Data from moods 51 forms output bus 55.
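Each of these options is essentially a small transform from the trigger stream to a DMX channel value. Two illustrative reconstructions follow, for the listed "toggle" function and the "count 'n' triggers" parameter; the 0-255 range is the standard DMX slot value range, but the function names and exact semantics are assumptions.

```python
def toggle(trigger_stream):
    """'Toggle' function sketch: each trigger flips the DMX channel
    between 0 and full (255), e.g. alternating a fixture's shutter
    on the beat."""
    value, out = 0, []
    for fired in trigger_stream:
        if fired:
            value = 255 - value
        out.append(value)
    return out

def count_n(trigger_stream, n):
    """"Count 'n' triggers" sketch: output full only on every n-th
    trigger, e.g. flashing on every second beat."""
    seen, out = 0, []
    for fired in trigger_stream:
        if fired:
            seen += 1
        out.append(255 if fired and seen % n == 0 else 0)
    return out

levels = toggle([True, False, True, True])
halved = count_n([True, True, True, True], 2)
```

Chaining such transforms per channel is what lets one trigger stream drive pan on one fixture and a strobe on another in entirely different ways.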

As illustrated in FIGURE 1, output bus 55 is fed to output devices 60 and on to lighting devices 62 via connections 61.

Output devices 60 may be one or more internal or external serial data cards or devices, whereby data may be in the form of DMX or other lighting control or data formats. Connections 61 can be electrically conductive wires, fibre optic cables, wireless links or any combination thereof. Furthermore, connections 61 may utilise a combination of communications interfaces such as the Internet or modem connections. Lighting devices 62 may be a single electronically controlled lighting device, or one or more networks of electronically controlled lighting devices, or any combination thereof.

Claims (24)

  1. A method for controlling electronic lighting devices that uses data resulting from analysis of audio information as a basis for controlling said electronic lighting devices in accordance with said audio information.
  2. A claim as claimed in claim 1 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in synchronicity with said audio information.
  3. A claim as claimed in claim 1 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in sympathy with said audio information.
  4. A claim as claimed in claim 1 wherein said analysis of said audio information may be used to determine relevant activity indicators that can be based on identification of said audio information.
  5. A claim as claimed in claims 1, 2, 3 or 4 wherein said activity indicators can be used as a basis for controlling electronic lighting devices.
  6. A claim as claimed in claims 1, 2, 3 or 4 wherein said activity indicators can be used as a basis for creating control command sets for electronic lighting devices.
  7. A claim as claimed in claims 1, 2, 3, 4, 5 or 6 wherein said activity indicators can be used as a basis for scheduling said control command sets for electronic lighting devices.
  8. A digital controlling system for electronic lighting devices that uses data resulting from analysis of audio information as a basis for controlling said electronic lighting devices in accordance with said audio information, comprising electrical links and audio input devices to receive said audio information, a computer and software arrangement, and output devices and electrical links to command said electronic lighting devices.
  9. A claim as claimed in claim 8 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in synchronicity with said audio information.
  10. A claim as claimed in claim 8 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in sympathy with said audio information.
  11. A claim as claimed in claim 8 wherein said analysis of said audio information may be used to determine relevant activity indicators that can be based on identification of said audio information.
  12. A claim as claimed in claims 8, 9, 10 or 11 wherein said activity indicators can be used as a basis for controlling electronic lighting devices.
  13. A claim as claimed in claims 8, 9, 10 or 11 wherein said activity indicators can be used as a basis for creating control command sets for electronic lighting devices.
  14. A claim as claimed in claims 8, 9, 10, 11, 12 or 13 wherein said activity indicators can be used as a basis for scheduling said control command sets for electronic lighting devices.
  15. A computer-readable medium containing a computer program that performs a method for controlling electronic lighting devices that uses data resulting from analysis of audio information as a basis for controlling said electronic lighting devices in accordance with said audio information.
  16. A claim as claimed in claim 15 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in synchronicity with said audio information.
  17. A claim as claimed in claim 15 wherein said analysis of said audio information may be used to determine relevant activity indicators that are in sympathy with said audio information.
  18. A claim as claimed in claim 15 wherein said analysis of said audio information may be used to determine relevant activity indicators that can be based on identification of said audio information.
  19. A claim as claimed in claims 15, 16, 17 or 18 wherein said activity indicators can be used as a basis for controlling electronic lighting devices.
  20. A claim as claimed in claims 15, 16, 17 or 18 wherein said activity indicators can be used as a basis for creating control command sets for electronic lighting devices.
  21. A claim as claimed in claims 15, 16, 17, 18, 19 or 20 wherein said activity indicators can be used as a basis for scheduling said control command sets for electronic lighting devices.
  22. A claim as in any preceding claim, wherein said system creates and maintains a lighting show in synchronicity and sympathy with said audio performance without human intervention.
  23. A claim as in any preceding claim, wherein said system creates and maintains a lighting show in synchronicity and sympathy with said audio performance without a lighting console.
  24. A digital controlling system for electronic lighting devices substantially as described herein with reference to Figures 1 to 6 of the accompanying drawings.
GB9920969A 1999-09-07 1999-09-07 Digital controlling system for electronic lighting devices Withdrawn GB2354602A (en)

Publications (2)

Publication Number  Publication Date
GB9920969D0 (en)    1999-11-10
GB2354602A (en)     2001-03-28


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3806873A (en) * 1972-01-12 1974-04-23 W Brady Time perspective audio-video translator
GB2044484A (en) * 1979-02-24 1980-10-15 Cls Electronics Ltd Visual display apparatus
US4440059A (en) * 1981-12-18 1984-04-03 Daniel Lee Egolf Sound responsive lighting device with VCO driven indexing
US5083064A (en) * 1988-09-22 1992-01-21 Jones Sr Charles W Lamp modulating circuitry for incandescent and fluorescent lamps
GB2260041A (en) * 1991-09-30 1993-03-31 Jalco Co Ltd Sound responsive light blinking device
US5646361A (en) * 1995-08-04 1997-07-08 Morrow; Michael Laser emitting visual display for a music system

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7764026B2 (en) 1997-12-17 2010-07-27 Philips Solid-State Lighting Solutions, Inc. Systems and methods for digital entertainment
WO2001099475A1 (en) 2000-06-21 2001-12-27 Color Kinetics Incorporated Method and apparatus for controlling a lighting system in response to an audio input
EP1295515B1 (en) * 2000-06-21 2011-12-28 Philips Solid-State Lighting Solutions, Inc. Method and apparatus for controlling a lighting system in response to an audio input
US7228190B2 (en) 2000-06-21 2007-06-05 Color Kinetics Incorporated Method and apparatus for controlling a lighting system in response to an audio input
EP2364067A3 (en) * 2000-06-21 2011-12-14 Philips Solid-State Lighting Solutions, Inc. Method and apparatus for controlling a lighting system in response to an audio input
GB2391179A (en) * 2002-06-27 2004-02-04 Chao-Lang Wang Light display responsive to audio signals
EP1729615A2 (en) * 2004-03-02 2006-12-13 Color Kinetics Incorporated Entertainment lighting system
EP1729615A4 (en) * 2004-03-02 2013-11-20 Philips Solid State Lighting Entertainment lighting system
WO2007113738A1 (en) * 2006-03-31 2007-10-11 Koninklijke Philips Electronics, N.V. Combined video and audio based ambient lighting control
WO2008053409A1 (en) * 2006-10-31 2008-05-08 Koninklijke Philips Electronics N.V. Control of light in response to an audio signal
US8461443B2 (en) 2006-10-31 2013-06-11 Tp Vision Holding B.V. Control of light in response to an audio signal
CN101669406A (en) * 2007-04-24 2010-03-10 Koninklijke Philips Electronics N.V. Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
US8374880B2 (en) 2007-04-24 2013-02-12 Koninklijke Philips Electronics N.V. System for automatically creating a lighting atmosphere based on a keyword input
WO2008129505A1 (en) * 2007-04-24 2008-10-30 Koninklijke Philips Electronics N.V. Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
CN101669406B (en) 2007-04-24 2014-06-04 皇家飞利浦电子股份有限公司 Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
JP2010525538A (en) * 2007-04-24 2010-07-22 Koninklijke Philips Electronics N.V. Method, system and user interface for automatically creating an atmosphere, in particular a lighting atmosphere, based on a keyword input
WO2009048333A1 (en) * 2007-10-12 2009-04-16 JONQUIÈRE, Jan Light and sound column
NL2000926C2 (en) * 2007-10-12 2009-04-15 Jan Jonquiere Light and sound column.
DE102008038340B4 (en) * 2008-08-19 2010-04-22 Austriamicrosystems Ag Circuit arrangement for driving a light source and method for generating a drive signal for the same
DE102008038340A1 (en) * 2008-08-19 2010-02-25 Austriamicrosystems Ag Circuit arrangement for driving a light source and method for generating a drive signal for the same
GB2479280A (en) * 2008-10-29 2011-10-05 Jaroslav Nusl Method for controlling in particular lighting technology by audio signal and a device for performing this method
WO2010048907A1 (en) * 2008-10-29 2010-05-06 Jaroslav Nusl Method for controlling in particular lighting technology by audio signal and a device for performing this method
BE1019823A3 (en) * 2011-02-16 2013-01-08 Robrecht Karel V Noens Method and apparatus for converting an audio signal into a control signal for an audio visualization system and audio visualization system.
EP2637327A1 (en) * 2012-03-09 2013-09-11 Harman International Industries Ltd. Audio mixing console with lighting control and method of mixing by means of a mixing console
CN105874885A (en) * 2013-12-24 2016-08-17 AG Inc. Lighting device and frame with said lighting device attached thereto
EP3102004A4 (en) * 2013-12-24 2017-11-22 AG Inc. Lighting device and frame with said lighting device attached thereto

Also Published As

Publication number Publication date Type
GB9920969D0 (en) 1999-11-10 application

Similar Documents

Publication Publication Date Title
US7711442B2 (en) Audio signal processor with modular user interface and processing functionality
US5740260A (en) Midi to analog sound processor interface
US5027687A (en) Sound field control device
US20020016643A1 (en) Playback apparatus, playback method, and recording medium
US7528315B2 (en) Rhythm action game apparatus and method
US6816833B1 (en) Audio signal processor with pitch and effect control
US5693903A (en) Apparatus and method for analyzing vocal audio data to provide accompaniment to a vocalist
US20060011052A1 (en) Sound-effect foot pedal for electric/electronic musical instruments
US6795560B2 (en) Digital mixer and digital mixing method
US20040159221A1 (en) System and method for structuring and mixing audio tracks
US20100257994A1 (en) Method and apparatus for producing audio tracks
EP1624728A1 (en) Systems and methods for authorizing lighting sequences
US5291558A (en) Automatic level control of multiple audio signal sources
US7809448B2 (en) Systems and methods for authoring lighting sequences
US20070025568A1 (en) Mixing apparatus and computer program therefor
US5886274A (en) System and method for generating, distributing, storing and performing musical work files
US7319493B2 (en) Apparatus and program for setting video processing parameters
US5138926A (en) Level control system for automatic accompaniment playback
US20020154787A1 (en) Acoustical to optical converter for providing pleasing visual displays
US5357048A (en) MIDI sound designer with randomizer function
US6490359B1 (en) Method and apparatus for using visual images to mix sound
US20090067641A1 (en) User interface for mixing sounds in a media application
Truax Real-time granular synthesis with a digital signal processor
US20050254663A1 (en) Electronic sound screening system and method of accoustically impoving the environment
US20030177892A1 (en) Rendition style determining and/or editing apparatus and method

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)