WO2001099475A1 - Method and apparatus for controlling a lighting system in response to an audio input - Google Patents

Method and apparatus for controlling a lighting system in response to an audio input

Info

Publication number
WO2001099475A1
Authority
WO
WIPO (PCT)
Prior art keywords
act
audio input
lighting program
characteristic
lighting
Prior art date
Application number
PCT/US2001/019782
Other languages
French (fr)
Inventor
Kevin J. Dowling
Scott D. Johnston
Original Assignee
Color Kinetics Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Color Kinetics Incorporated
Priority to AT01948546T: ATE539593T1
Priority to ES01948546T: ES2380075T3
Priority to EP01948546A: EP1295515B1
Priority to JP2002504188A: JP4773673B2
Priority to AU2001270018A: AU2001270018A1
Publication of WO2001099475A1
Priority to HK03106910.1A: HK1054839A1

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63JDEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J17/00Apparatus for performing colour-music
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/12Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/30Driver circuits
    • H05B45/32Pulse-control circuits
    • H05B45/325Pulse-width modulation [PWM]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention relates generally to methods and apparatus for controlling a lighting system, and more particularly to methods and apparatus for controlling a lighting system in response to an audio input.
  • the increased accessibility of music in digital formats has led to the development of computer software to interpret digitally formatted music.
  • the software enables the music to be broadcast using speakers and other audio components that can be coupled to a computer system.
  • One example of such computer software is MP3 player software, which allows music files in MP3 format to be interpreted and played for a user.
  • Some MP3 player software provides the additional feature of an on-screen visual interface whereby the motion of graphics displayed to the user is synchronized with aspects of the music, such as frequency or tempo.
  • One embodiment of the invention is directed to a method for executing a lighting program to control a plurality of light emitting diodes (LEDs).
  • the method comprises acts of: (A) receiving an audio input in digital form; (B) digitally processing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
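  • By way of illustration only, the acts (A)-(D) above can be pictured as a simple control loop. The sketch below is in Python and is not the patented implementation; the helpers read_audio_frame() and send_to_leds() are hypothetical placeholders for the audio input and the LED control output, and the RMS level stands in for whatever characteristic is extracted.
      import math

      def estimate_level(samples):
          # One example "characteristic" of the audio input: an RMS level in [0, 1].
          if not samples:
              return 0.0
          return min(math.sqrt(sum(s * s for s in samples) / len(samples)), 1.0)

      def run_lighting_program(read_audio_frame, send_to_leds, num_leds=12):
          # Acts (A)-(D): read digital audio, derive a characteristic, and fold it
          # into the control signals generated by the executing lighting program.
          while True:
              samples = read_audio_frame()        # (A) audio input in digital form
              if samples is None:
                  break
              level = estimate_level(samples)     # (B) digitally determine a characteristic
              frame = []
              for i in range(num_leds):           # (C) execute the lighting program
                  hue = i / num_leds              # program-defined color position
                  frame.append((hue, level))      # (D) control signal depends on the audio
              send_to_leds(frame)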
  • Another embodiment of the invention is directed to a computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of LEDs.
  • the method comprises acts of: (A) receiving an audio input in digital form; (B) digitally processing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
  • Another embodiment of the invention is directed to an apparatus for controlling a plurality of LEDs. The apparatus comprises at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to digitally process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs.
  • the at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input.
  • Another embodiment of the invention is directed to a computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of LEDs.
  • the processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input.
  • the method comprises acts of: (A) receiving information from the second program relating to the at least one characteristic of the audio input; (B) executing the lighting program to generate control signals to control the plurality of LEDs; and (C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input received from the second program.
  • Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs.
  • the method comprises acts of: (A) receiving an audio input and an input from at least one timer; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
  • Another embodiment of the invention is directed to a computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of LEDs.
  • the method comprises acts of: (A) receiving an audio input and an input from at least one timer; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
  • Another embodiment of the invention is directed to a computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of LEDs.
  • the processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input.
  • the method comprises acts of: (A) receiving information from the second program relating to the at least one characteristic of the audio input and an input from at least one timer; (B) executing the lighting program to generate control signals to control the plurality of LEDs; and (C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
  • Another embodiment of the invention is directed to an apparatus for controlling a plurality of LEDs. The apparatus comprises at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs.
  • the at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input and an input from at least one timer.
  • Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs.
  • the method comprises acts of: (A) receiving an audio input and an input from a graphical user interface; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the graphical user interface.
  • Another embodiment of the invention is directed to a method for execution on a computer.
  • the method comprises acts of: (A) processing, on the computer, information indicative of an audio signal to generate a speaker-compatible signal indicative of the audio signal; (B) determining at least one characteristic of the audio signal; (C) executing, on the computer, a lighting program to generate control signals to control a plurality of LEDs; (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input; and (E) transmitting the speaker-compatible signal to a speaker to generate audible sound indicative of the audio signal.
  • Another embodiment of the invention is directed to a method for authoring a lighting program to control a plurality of LEDs in response to at least one characteristic of an audio input.
  • the method comprises acts of: (A) providing a graphical user interface (GUI) that displays information representative of the plurality of LEDs, a plurality of lighting effects to be assigned thereto, and the at least one characteristic of the audio input; (B) selecting, based on at least one user input provided via the GUI, at least one of the plurality of lighting effects to correspond to at least one of the plurality of LEDs in response to the at least one characteristic of the audio input; and (C) creating a lighting program, based on the at least one user input, for generating control information for the plurality of LEDs.
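  • As a minimal sketch (in Python, using assumed names that are not the patent's data model), the user's GUI selections can be captured as assignments that tie an audio characteristic and a lighting effect to particular LEDs, and act (C) then creates the lighting program from those assignments.
      from dataclasses import dataclass

      @dataclass
      class Assignment:
          characteristic: str   # e.g. "beat", "bass_level", "tempo"
          effect: str           # e.g. "strobe", "color_wash"
          led_ids: tuple        # which LEDs the effect applies to

      def create_lighting_program(user_selections):
          # Act (C): turn the selections made via the GUI into a lighting program.
          return [Assignment(**sel) for sel in user_selections]

      program = create_lighting_program([
          {"characteristic": "beat", "effect": "strobe", "led_ids": (0, 1, 2)},
          {"characteristic": "bass_level", "effect": "color_wash", "led_ids": (3, 4)},
      ])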
  • Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs.
  • the method comprises acts of: (A) receiving an audio input; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) storing information related to the at least one characteristic of the audio input; (D) executing the lighting program, after completion of the act (C), to generate control signals to control the plurality of LEDs; and (E) during execution of the lighting program in the act (D), reading the stored information and generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
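  • One way to picture this two-phase arrangement (illustrative only; the peak-amplitude characteristic, the JSON file, and send_to_leds() are assumptions made for the sketch) is to analyze and store first, then read the stored values back while the lighting program runs:
      import json

      def analyze_and_store(audio_frames, path="analysis.json"):
          # (A)/(B) derive one simple characteristic per frame (peak amplitude)...
          levels = [max((abs(s) for s in frame), default=0.0) for frame in audio_frames]
          with open(path, "w") as f:
              json.dump(levels, f)                # ...and (C) store the information

      def play_back(path, send_to_leds, num_leds=12):
          with open(path) as f:
              levels = json.load(f)               # (E) read the stored information
          for level in levels:                    # (D) execute the lighting program
              send_to_leds([(i / num_leds, level) for i in range(num_leds)])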
  • Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs to create a light show.
  • the method comprises acts of: (A) receiving an audio input having a duration and varying in time during the duration of the audio input; (B) processing the audio input to determine at least one first characteristic of the audio input at a first time during the duration; (C) executing the lighting program in synchronization with the audio input to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), at a time that is prior to the first time during the duration of the audio input, generating at least one of the control signals based at least in part on the at least one first characteristic of the audio input so that the light show anticipates changes in the audio input.
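  • A rough sketch of the anticipation idea, assuming the audio input has already been analyzed into per-frame levels: shifting the analysis results earlier in time makes the control signal at a given moment reflect audio that has not yet played, so the light show leads the music rather than following it.
      def anticipating_levels(levels, lead_frames=5):
          # Shift per-frame levels earlier by lead_frames, so the control signal
          # emitted at frame t reflects the audio at frame t + lead_frames.
          if len(levels) <= lead_frames:
              return list(levels)
          return levels[lead_frames:] + [levels[-1]] * lead_frames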
  • Figure 1 illustrates a system for creating a lighting sequence and executing the lighting sequence on a plurality of lighting units according to one embodiment of the invention
  • Figure 2 presents an exemplary method for creating a lighting effect in accordance with one embodiment of the invention
  • Figure 3 depicts a representative interface for describing an arrangement of lighting units in accordance with another embodiment of the invention.
  • Figure 4 represents an alternate interface for graphically reproducing a lighting sequence
  • Figure 5 portrays a representative interface for creating a lighting sequence in accordance with one embodiment of the invention
  • Figure 6 shows one embodiment of an apparatus for executing a lighting sequence in accordance with another embodiment of the invention.
  • Figure 7 shows a block diagram of an alternate embodiment of the present invention directed to an apparatus for executing a lighting sequence
  • Figure 8 is a diagram showing an apparatus for controlling a lighting network in response to an audio input according to another embodiment of the invention
  • Figure 9 is a diagram showing an apparatus for controlling a lighting network in response to an audio input according to another embodiment of the invention.
  • Figure 10 is a diagram showing an example of a lighting control device in the apparatus of Figure 9, according to one embodiment of the invention.
  • music player software provides a convenient means of translating digitally formatted music for listening, and in some cases also provides a screen-based graphical interface for visually appreciating music.
  • existing programs have limited functionality with respect to the visualization of music.
  • existing music player software does not allow for the visual display of music external to the computer. Such an external display would provide increased music-based feedback, thereby enhancing a user's overall sensory experience.
  • One embodiment of the present invention is directed to a method and apparatus for controlling a lighting network in response to an audio input. This can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation technique.
  • an audio input is digitally processed to analyze the audio input, and at least one aspect of a lighting system is controlled in response to a characteristic of the audio input.
  • timing information is also considered so that the control signals sent to the lighting network for a particular audio input can vary over time, to avoid repetitiveness.
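  • One simple way to picture combining the audio characteristic with timing information (a sketch only; the blending rule below is an arbitrary assumption) is to let a slow time-dependent drift alter how a given audio level maps to color, so the same audio input does not always produce the same light output:
      import math
      import time

      def hue_for(level, t=None):
          # Combine an audio level in [0, 1] with elapsed time so the mapping from
          # audio to color drifts slowly and the show avoids repetitiveness.
          t = time.monotonic() if t is None else t
          drift = (math.sin(t / 30.0) + 1) / 2     # slow oscillation between 0 and 1
          return (level + drift) % 1.0             # hue position on a 0..1 color wheel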
  • the assignee of the present application has previously developed other systems on which users can author lighting programs including one or more lighting sequences, as well as devices for playing back a lighting program to control a lighting system. Many of the features of those systems can be incorporated in the present invention to enable the control of a lighting system in response to an audio input. Therefore, a description will initially be provided of authoring software and playback devices for lighting programs to control a lighting system, before turning to the specific aspects of the present invention relating to performing such control in response to an audio input.
Overview of Systems for Authoring and Playing Back Lighting Programs to Control A Lighting Network
  • Figure 1 illustrates an example of a system for authoring and playing back a lighting program including one or more lighting sequences.
  • the system of Figure 1 includes a processor 10 supporting a software application, having an interface 15, which can be used to create a lighting program 20, which may include one or more lighting sequences.
  • the system further includes a lighting controller 30 which can execute or play back the lighting sequence 20 and in response thereto, control one or more lighting units 40.
  • a sequence in the context of this disclosure refers to two or more lighting effects spaced in time.
  • the software application may be implemented in any of numerous ways, as the invention is not limited to any particular implementation.
  • the software application may be a stand-alone application, such as an executable image of a C++ or Fortran program or other executable code and/or libraries, or may be implemented in conjunction with or accessible by a Web browser, e.g., as a Java applet or one or more HTML web pages, etc.
  • Processor 10 may be any system for processing in response to a signal or data, as the present invention is not limited to any particular type of processor.
  • the processor 10 may comprise microprocessors, microcontrollers, other integrated circuits, computer software, computer hardware, electrical circuits, application-specific integrated circuits, personal computers, chips, and other devices alone or in combination capable of providing processing functions.
  • processor 10 can be any suitable data processing platform, such as a conventional IBM PC workstation operating the Windows operating system, a SUN workstation operating a version of the Unix operating system, such as Solaris, or any other suitable workstation.
  • Controller 30 may communicate with lighting units 40 by radio frequency (RF), ultrasonic, auditory, infrared (IR), optical, microwave, laser, electromagnetic, any type of computer link (e.g., a network) or any other suitable transmission or connection technique.
  • a suitable protocol may be used for transmission between the controller 30 and the lighting units 40, including sending pulse-width modulated signals over a protocol such as DMX, RS-485, RS-232, or any other suitable protocol.
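  • For illustration, a DMX512-style data stream is essentially an array of one byte per channel (0-255); the sketch below assembles such an array for RGB fixtures and leaves the wire-level framing (break, start code, timing) to the output hardware. The channel layout is an assumption, not a statement about any particular fixture.
      def build_dmx_levels(rgb_per_fixture):
          # rgb_per_fixture: list of (r, g, b) floats in [0, 1], mapped to
          # consecutive DMX channels starting at channel 1 (index 0 here).
          channels = bytearray(512)
          for i, (r, g, b) in enumerate(rgb_per_fixture):
              base = i * 3
              channels[base:base + 3] = bytes(
                  int(max(0.0, min(1.0, v)) * 255) for v in (r, g, b))
          return channels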
  • Lighting units 40 may be incandescent, LED, fluorescent, halogen, laser, or any other type of light source. Each lighting unit may be associated with a predetermined assigned address either unique to that lighting unit or overlapping the address of other lighting units to facilitate communication with the controller 30.
  • control signals for driving the lighting units 40 can take the form of pulse-width modulated signals.
  • the lighting units 40 may be driven with a fixed current or voltage that is then turned on or off in accordance with a pulse-width modulated control signal.
  • the lighting units 40 may be driven using analog techniques where the current or voltage level is varied with time, pulse amplitude modulation or any other technique that varies the power through the lighting units in response to a control signal.
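  • As a minimal pulse-width modulation sketch (illustrative, not hardware-accurate), the fixed supply is switched on for a fraction of each period so that the average power delivered to the LED tracks the requested brightness:
      def pwm_on_time(brightness, period_us=1000):
          # Return the on-time in microseconds for one PWM period;
          # brightness is expected in [0.0, 1.0].
          duty = max(0.0, min(1.0, brightness))
          return int(duty * period_us)

      # e.g. 25% brightness with a 1 ms period drives the LED for 250 us per period
      assert pwm_on_time(0.25) == 250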
  • a single component may be capable both of permitting a user to create a lighting program and controlling the lighting units, and the present invention is intended to encompass this and other variations on the system depicted in Figure 1 which can be used to implement the methods described below.
  • the processor 10 can have software loaded thereon to enable it to perform not only the authoring functions described below, but also the playback functions described below as being performed by the controller 30.
  • the functions described below as being performed by the software application alternatively may be provided by a hardware device, such as a chip or card, or any other system capable of performing the functions described herein.
  • a user may select from among a set of predetermined 'stock' effects at step 210.
  • the stock effects function as discrete elements or building blocks useful for assembling a sequence.
  • a user may compose a particular sequence and include that sequence in the stock effects to eliminate the need for creating repeated elements each time the effect is desired.
  • the set of stock effects may include a dimming effect and a brightening effect.
  • a user may compose a pulse effect by specifying the alternation of the dimming and brightening effects, and include the pulse effect in the set of stock effects.
  • stock effects may be created by a user via any programming language, such as Java, C, C++, or any other suitable language. Effects may be added to the set of stock effects by providing the effects as plug-ins, by including the effects in an effects file, or by any other technique suitable for organizing effects in a manner that permits adding, deleting, and altering the set of effects.
  • the user may indicate a time at which the selected effect should begin at step 220. For example, the user may indicate that a brightening effect should start three minutes after a sequence commences. Additionally, the user may select an ending time or duration for the selected effect at step 230. Thus, by indicating that the effect should end five minutes after the sequence commences, or equivalently by indicating that the effect should last for two minutes, a user may set the time parameters of the selected effect. Additional parameters may be specified by the user at step 240, as may be appropriate for the particular effect. For example, a brightening or dimming effect may be further defined by an initial brightness and an ending brightness.
  • the rate of change may be predetermined, e.g., the dimming effect may apply a linear rate of dimming over the assigned timespan, or may be alterable by the user, e.g., may permit slow dimming at the beginning followed by a rapid drop-off, or by any other scheme the user specifies.
  • a pulse effect as described above, might instead be characterized by a maximum brightness, a minimum brightness, and a periodicity, or rate of alternation.
  • the mode of alternation may be alterable by the user, e.g., the changes in brightness may reflect a sine function or alternating linear changes.
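  • For example, a sine-profile pulse effect can be written directly from its parameters (minimum brightness, maximum brightness, and period); the sketch below is one illustrative parameterization, not the only one contemplated above.
      import math

      def pulse_brightness(t, minimum=0.2, maximum=1.0, period=2.0):
          # Brightness at time t (seconds) for a pulse effect with a sine profile.
          phase = math.sin(2 * math.pi * t / period)           # varies between -1 and 1
          return minimum + (maximum - minimum) * (phase + 1) / 2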
  • parameters such as initial color, final color, rate of change, etc. may be specified by the user. It should be appreciated that the particular effects and parameters therefore described above are provided merely for illustrative purposes, and that the present invention is not limited to these effects or parameters, as numerous other lighting effects and parameters can be employed in accordance with the embodiments of the invention described herein.
  • a user may select, at step 250, one or more lighting units to execute the effect selected in step 210.
  • a user may specify a transition between two effects which occur in sequence. For example, when a pulse effect is followed by a dimming effect, the pulse effect may alternate less rapidly, grow gradually dimmer, or vary less between maximum and minimum brightness towards the termination of the effect. Techniques for transitioning between these or other effects may be determined by the user for each transition, e.g., by selecting a transition effect from a set of predetermined transition effects, or by setting transition parameters for the beginning and/or end of one or both effects. In a further embodiment, users may specify multiple lighting effects for the same lighting unit that place effects overlapping in time or in location.
  • overlapping effects may be used in an additive or subtractive manner such that the multiple effects interact with each other.
  • a user could impose a brightening effect on a pulsing effect, with the brightening effect imposing the minimum brightness parameter of the pulse to give the effect of pulsing slowly growing to a steady light.
  • lighting effects can have priorities or cues attached to them which could allow a particular lighting unit to change effect on the receipt of a cue.
  • This cue could be any type of cue, received externally or internally to the system, and includes, but is not limited to, a user-triggered cue such as a manual switch or bump button; a user-defined cue such as a certain keystroke combination or a timing key allowing a user to tap or pace for a certain effect; a cue generated by the system such as an internal clocking mechanism, an internal memory one, or a software-based one; a mechanical cue generated from an analog or digital device attached to the system such as a clock, external light or motion sensor, music synchronization device, sound level detection device, or a manual device such as a switch; a cue received over a transmission medium such as an electrical wire or cable, RF signal or IR signal; a cue that relates to a characteristic of an audio signal; or a cue received from a lighting unit attached to the system.
  • the priority can allow the system to choose a default priority effect that is the effect used by the lighting unit unless a particular cue is received, at which point the system instructs the use of a different effect.
  • This change of effect could be temporary, occurring only while the cue occurs or defined for a specified period, could be permanent in that it does not allow for further receipt of other effects or cues, or could be priority based, waiting for a new cue to return to the original effect or select a new one.
  • the system could select effects based on the state of a cue and the importance of a desired effect. For instance, if a sound sensor sensed sudden noise, it could trigger a high priority alarm lighting effect overriding all the effects otherwise present or awaiting execution.
  • the priority could also be state dependent where a cue selects an alternative effect or is ignored depending on the current state of the system.
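  • A small sketch of priority-based cue handling, under an assumed model in which each lighting unit runs a default effect until a higher-priority cue is asserted (for instance, the sudden-noise alarm example above):
      def select_effect(active_cues, cue_priorities, default=("steady", 0)):
          # active_cues: set of cue names currently asserted.
          # cue_priorities: dict mapping cue name -> (effect name, priority).
          best = default
          for cue in active_cues:
              effect, priority = cue_priorities.get(cue, default)
              if priority > best[1]:
                  best = (effect, priority)
          return best[0]

      cue_priorities = {"sudden_noise": ("alarm_flash", 10), "door_open": ("fade_up", 3)}
      assert select_effect({"door_open", "sudden_noise"}, cue_priorities) == "alarm_flash"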
  • priorities or cues for various lighting effects are not limited to the particular types of cues and priorities discussed above, as numerous other types are possible.
  • the outcome of one effect may be programmed to depend upon a second effect.
  • an effect assigned to a first lighting unit may be a random color effect, and an effect assigned to a second lighting unit may be designated to match the color of the random color effect.
  • one lighting unit may be programmed to execute an effect, such as a flashing effect, whenever a second lighting unit meets a certain condition, such as being turned off.
  • an effect which is initiated upon a certain condition of a first effect matches the color of a second effect and the rate of a third effect, can be created by this scheme.
  • a lighting sequence or effect may be programmed to start upon receipt of a cue or trigger signal, a sequence or effect may take precedence if a cue or trigger signal is received, a sequence or effect may be designated to repeat or continue until a cue or trigger signal is received, etc.
  • a user may instead designate that effect or sequence to begin when a certain stimulus is received.
  • a user may designate two or more effects for overlapping or concurrent time periods and assign the effects different priorities or conditions to determine which effect is executed upon playback. In yet another embodiment, a user may link a parameter for an effect to an external input (e.g., any of the types of inputs described above, including analog, digital or manual inputs) such that the color, speed, or other attribute of an effect may depend on a signal from an external device, measuring, for example, volume, brightness, temperature, pitch, inclination, wave length, or any other appropriate condition.
  • the selection of a lighting sequence, the selection of an effect, or the selection of a parameter may be determined or influenced by input from an external source, such as a user, chronometer, device, audio source, or sensor.
  • the types of external stimuli, cues and triggers described above, as well as the changes in a lighting effect or parameter influenced thereby, are provided merely for illustrative purposes, as numerous other variations are possible.
  • a menu may be provided to define inputs and the consequences thereof.
  • a palette of predetermined inputs may be provided to a user.
  • Each input such as a specified transducer or the output of another effect, may be selected and placed within an authored lighting sequence as a trigger for a new effect, or as a trigger to a variation in an existing effect.
  • Known inputs may include, for example, thermistors, clocks, keyboards, numeric keypads, Musical Instrument Digital Interface ("MIDI") inputs, DMX control signals, TTL or CMOS logical signals, other visual or audio signals, or any other protocol, standard, or other signaling or control technique, whether analog, digital, manual, or any other form.
  • the palette may also include a custom input, represented as, for example, an icon in a palette, or an option in a dropdown menu.
  • the custom input may allow a user to define the characteristics of an input signal (e.g., its voltage, current, duration, and/or form (i.e., sinusoid, pulse, step, modulation)) that will operate as a control or trigger in a sequence.
  • a theatrical lighting sequence may include programmed lighting sequences and special effects in the order in which they occur, but requiring input at specified points before the next sequence or portion thereof is executed.
  • scene changes may take place not automatically as a function of timing alone, but at the cue of a director, producer, stage hand, or other participant.
  • effects which need to be timed with an action on the stage such as brightening when an actor lights a candle or flips a switch, dramatic flashes of lightning, etc., can be indicated precisely by a director, producer, stage hand, or other participant - even an actor - thereby reducing the difficulty and risk of relying on preprogrammed timing alone.
  • input from sensors can also be used to modify lighting sequences.
  • a light sensor may be used to modify the intensity of the lights, for example, to maintain a constant lighting level regardless of the amount of sunlight entering a room, or to make sure a lighting effect is prominent despite the presence of other sources of light.
  • a motion sensor or other detector may be used as a trigger to start or alter a lighting sequence.
  • a user may program a lighting sequence for advertising or display purposes to change when a person approaches a sales counter or display.
  • Temperature sensors may also be used to provide input.
  • the color of light in a freezer may be programmed to be dependent on temperature, e.g., providing blue light to indicate cold temperature, changing gradually to red as the temperature rises, until a critical temperature is reached, whereupon a flashing or other warning effect may begin.
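  • The freezer example above can be pictured as a simple mapping from temperature to color (the threshold values below are made up for illustration): blue when cold, blending toward red as the temperature rises, with a flashing warning effect above a critical temperature.
      def freezer_color(temp_c, cold=-20.0, critical=4.0):
          if temp_c >= critical:
              return "FLASH_RED"                        # warning effect takes over
          f = max(0.0, min(1.0, (temp_c - cold) / (critical - cold)))
          return (int(255 * f), 0, int(255 * (1 - f)))  # (R, G, B): blue at cold, red near critical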
  • an alarm system may be used to provide a signal that triggers a lighting sequence or effect for providing a warning, distress signal, or other indication.
  • An interactive lighting sequence may be created, e.g., wherein the executed effect varies according to a person's position, movements, or other actions.
  • a user may provide information representative of the number and types of lighting units and the spatial relationships between them.
  • an interface 300 may be provided as depicted in Figure 3, such as a grid or other two-dimensional array, that permits the user to arrange icons or other representative elements to represent the arrangement of the lighting units being used.
  • the interface 300 provides to a user a selection of standard types of lighting units 310, e.g., cove lights, lamps, spotlights, etc., such as by providing a selection of types of lighting units in a menu, on a palette, on a toolbar, etc.
  • the user may then select and arrange the lighting units on the interface, e.g., within layout space 320 in an arrangement which approximates the physical arrangement of the actual lighting units.
  • the lighting units may be organized into different groups, e.g., to facilitate manipulation of a large number of lighting units.
  • Lighting units may be organized into groups based on spatial relationships, functional relationships, types of lighting units, or any other scheme desired by the user. Spatial arrangements can be helpful for entering and carrying out lighting effects easily. For example, if a group of lights are arranged in a row and this information is provided to the system, the system can then implement effects such as a rainbow or a sequential flash without need for a user to specify a separate and individual program for each lighting unit. All the above types of implementation or effects could be used on a group of units as well as on single lighting units.
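  • As an illustration of a group-level effect (a sketch under assumed data structures), once lighting units are grouped as an ordered row, a single sequential-flash command can be expanded into per-unit on/off times without programming each unit separately:
      def sequential_flash(group, start_time=0.0, interval=0.25, duration=0.1):
          # group: ordered list of lighting-unit ids arranged in a row.
          # Returns (unit_id, on_time, off_time) tuples, one per unit.
          return [(unit,
                   start_time + i * interval,
                   start_time + i * interval + duration)
                  for i, unit in enumerate(group)]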
  • the use of groups can also allow a user to enter a single command or cue to control a predetermined selection of lighting units.
  • a lighting sequence can be tested or executed on a lighting system to experience the effects created by the user.
  • the interface 300 may be capable of reproducing a lighting sequence created by the user, for example, by recreating the programmed effects as though the icons on the interface were the lighting units to be controlled.
  • the icon representing that lighting unit may start black and gradually lighten to gray.
  • color changes, flashing, and other effects can be visually represented on the interface.
  • This function may permit a user to present a wholly or partially created lighting sequence on a monitor or other video terminal, pause playback, and modify the lighting sequence before resuming playback, to provide a highly interactive method for show creation.
  • the system could allow fast-forwarding, reversing, rewinding, or other functions to allow editing of any portion of the lighting sequence. In a still further embodiment, the system could use additional interface features like those known in the art. This can include, but is not limited to, non-linear editing such as that used in Adobe software, or such devices or controls as scrolls, drag bars, or other devices and controls.
  • An alternate interface 400 for reproducing a lighting sequence is presented in Figure 4. Interface 400 includes representations of lighting elements 410 and playback controls 420. It should be appreciated that the present invention is not limited to the above-described techniques for visualizing a lighting sequence, as numerous other techniques are possible.
  • An interface capable of representing the lighting sequence may also be used during authoring or entry of the lighting sequence.
  • a grid may be used, such as interface 500 of Figure 5, wherein the lighting units are represented along one axis and time is represented along a second axis.
  • the portion of the grid defined by that lighting unit, the start time, and the ending time may appear black at one end of the grid portion and gradually lighten to gray at the other end of the grid portion. In this way, the effect can be visually represented to the user on the interface as the lighting sequence is being created.
  • effects that are difficult to represent with a static representation can be represented kinetically on the interface, e.g., by flashing or randomly changing the color of the defined grid portion.
  • An example of an interface 500 representing a sequence for an assortment of three lighting units is shown in Figure 5.
  • Time chart 510 visually depicts the output of each of the three lights at each moment in time according to the temporal axis 515. At a glance, the user can readily determine what effect is assigned to any lighting unit at any point in time, simplifying the coordination of effects across multiple lighting units and allowing rapid review of the lighting sequence.
  • Figure 5 depicts a palette 520 which includes the stock effects from which a user may select lighting effects, although other techniques for providing the set of stock effects, such as by a menu, toolbar, etc., may be employed in the systems and methods described herein.
  • In palette 520 there are provided icons for stock effects, including a fixed color effect 552, a cross fade between two colors effect 554, a random color effect 558, a color wash effect 560, a chasing rainbow effect 565, a strobe effect 564, and a sparkle effect 568. This list is by no means exhaustive, and other types of effects can be included.
  • the user may select an effect from the palette and select a region of the grid corresponding to the appropriate lighting unit or units and the desired time interval for the effect.
  • Additional parameters may be set by any suitable technique, such as by entering numerical values, selecting options from a palette, menu, or toolbar, drawing a vector, or any other technique known in the art, such as the parameter entry field 525.
  • Other interfaces and techniques for entry of lighting sequences suitable for performing some or all of the various functions described herein may be used and are intended to be encompassed by the scope of this disclosure. Examples of functions and interfaces suitable for use with the invention may be found in "A Digital Video Primer," June, 2000, by the Adobe Dynamic Media Group, Adobe Systems, Inc., incorporated herein by reference.
  • the methods described above can be readily adapted for controlling devices other than lighting units.
  • fog machines, sound effects, wind machines, curtains, bubble machines, projectors, stage practicals, stage elevators, pyrotechnical devices, backdrops, and any other features capable of being controlled by a computer may be controlled by a sequence as described herein.
  • multiple events can be automated and timed.
  • the user may program the lights to begin to brighten as the curtain goes up, followed by the sound of a gunshot as the fog rolls over the stage.
  • a program can be used to turn on lights and sound an alarm at 7:00 and turn on a coffee maker fifteen minutes later.
  • Holiday lighting arrays e.g., on trees or houses, can be synchronized with the motion of mechanical figurines or musical recordings.
  • An exhibit or amusement ride can coordinate precipitation, wind, sound, and lights in a simulated thunderstorm.
  • a greenhouse, livestock barn, or other setting for growing living entities can synchronize ambient lighting with automated feeding and watering devices.
  • Any combination of electromechanical devices can be timed and/or coordinated by the systems and methods described herein. Such devices may be represented on an interface for creating the sequence as additional lines on a grid, e.g., one line for each separate component being controlled, or by any other suitable means. Effects of these other devices can also be visually represented to the user.
  • a coffee maker could be represented by a small representation of a coffee maker that appears to brew coffee on the interface as the action occurs at the device, or the interface can show a bar slowly changing color as feed is dispensed in a livestock barn.
  • Other types of static or dynamic effects are also possible.
  • the lighting units are capable of motion, e.g., by sliding, pivoting, rotating, tilting, etc.
  • the user may include instructions for the motion or movement of lighting units. This function may be accomplished by any means.
  • the desired movement may be effected by selecting a motion effect from a set of motion effects, as described for lighting effects above.
  • a lighting unit capable of rotating on its base may be selected, and a rainbow wash effect may be programmed to occur simultaneously with a rotating motion effect.
  • lighting units may be mounted on movable platforms or supports which can be controlled independently of the lights, e.g., by providing an additional line on a grid interface as described above.
  • Motion effects may also have parameters, such as speed and amount (e.g., an angle, a distance, etc.), that can be specified by the user.
  • Such light/motion combinations may be useful in a wide variety of situations, such as light shows, planetarium presentations, moving spotlights, and any other scenario in which programmable moving lights may be desirable.
  • instructions for controlling objects placed between a lighting unit and an object being illuminated can be provided by a user according to the systems and methods described herein. In this manner, an even wider array of lighting effects may be designed and preprogrammed for later execution.
  • One embodiment of the present invention is directed to a computer system configured to design or create a lighting sequence according to the systems and methods described herein, e.g., by executing (e.g., on the processor 10 in Fig. 1) a computer program in a computer language, either interpreted or compiled, e.g., Fortran, C, Java, C++, etc.
  • Another embodiment of the invention is directed to a disk, CD, or other computer-readable storage medium that encodes a computer program that, when executed, is capable of performing some or all of the functions described above which enable a user to create or design a lighting sequence which can be used to control a plurality of lighting units.
  • a lighting sequence may be recorded on a storage medium, such as a compact disk, floppy disk, hard drive, magnetic tape, volatile or non- volatile solid state memory device, or any other computer-readable storage medium.
  • the lighting sequence may be stored in a format that records the effects and their parameters as created by a user, in a format converted from that format into a format which represents the final data stream, e.g., suitable for directly controlling lighting units or other devices, or in any other suitable format.
  • the format in which a lighting sequence is created in any of the manners described above may not be compatible for directly controlling a lighting network, such that some format conversion may be required between the format used for creating the lighting sequence, and a format for controlling a plurality of lighting units.
  • the lighting sequence can be recorded on a storage medium either in the format in which it was created, in a format suitable for controlling a lighting network (such that the conversion will take place before storing the lighting sequence), or any other suitable format.
  • formats that can be used for controlling a plurality of lighting units include data streams in data formats such as DMX, RS-485, RS-232, etc.
  • lighting sequences may be linked to each other, e.g., such that at the conclusion of one sequence, another sequence is executed, or a master sequence may be created for coordinating the execution of a plurality of subsequences, e.g., based on external signals, conditions, time, randomly, etc.
  • the same system that is used to author a lighting sequence can also be used to play it back and thereby control a plurality of lighting units 40.
  • For example, when a lighting program is authored on a general purpose computer (e.g., one including a display that comprises the interface 15 and a processor that serves as the processor 10 shown in Fig. 1), that same general purpose computer can play back the lighting program and thereby perform the functions of the lighting controller 30 shown in Fig. 1.
  • the general purpose computer can be coupled to the plurality of lights 40 in any suitable manner, examples of which are discussed above.
  • it may be desirable to author a lighting program on one device (e.g., a general purpose computer), but play it back on a different device.
  • a retail store may desire to author a lighting program that can then be played back at multiple retail locations. While it is possible to interconnect multiple locations to the device on which the lighting program was authored (e.g., over the Internet), it may be desirable in some circumstances to have each of the retail locations be capable of controlling playback of the lighting program individually.
  • lighting displays are mobile, such that it is not assured that in every location wherein it is desired to set up a lighting display that there will be access to the Internet or some other communication medium for connecting to the device on which the program is authored.
  • one embodiment of the present invention is directed to a system in which lighting programs are authored on one device as described above, and then transferred to a different device which plays back the lighting program and controls a lighting display.
  • the separate playback device can be a general purpose computer, with software loaded thereon to enable it to playback the lighting program.
  • the transfer of the lighting program from the device on which it is authored to the device on which it is played back can be accomplished in any of numerous ways, such as by connection over a communication medium (e.g., via email over the Internet), or by loading the lighting program onto a portable computer readable medium (e.g., a disk, flash memory or CD) and physically transporting the medium between the two devices.
  • the device used to playback a lighting program need not have all of the functionality and capability of the device used in authoring the program (e.g., it need not include a video monitor, a robust user interface, etc.). Furthermore, Applicants have appreciated that in many instances, it would be desirable to provide a relatively small and inexpensive device to perform the playback function, so that the device can be portable and such that if there are multiple instances of lighting systems on which a program is to be played back, separate devices can be used to control the playback on each of the lighting systems, to increase flexibility.
  • one embodiment of the present invention is directed to a device, for playing back a lighting program, that includes less hardware and is less expensive than a more complex system that permits authoring of the lighting program.
  • the device need not include a lot of the functionality found in a general purpose computer, such as a full size display, a full alphanumeric keyboard, an operating system that enables processing of multiple applications simultaneously, etc.
  • the playback device can take any of numerous forms, as the present invention is not limited to any particular implementation.
  • the playback device 31 may employ any suitable loader interface 610 for receiving a lighting program 20, e.g., an interface for reading a lighting program 20 from a storage medium such as a compact disk, diskette, magnetic tape, smart card, or other device, or an interface for receiving a transmission from another system, such as a serial port, USB (universal serial bus) port, parallel port, IR receiver, or other connection for receiving a lighting program 20.
  • the lighting program 20 may be transmitted over networks (e.g., the Internet).
  • the components on the playback device 31 can be powered in any of numerous ways, including through the provision of a power source (e.g., a battery) within the playback device, or through the provision of an interface for receiving a power cord compatible with a standard electrical outlet.
  • the playback device 31 is provided with neither an onboard power source nor an interface for a standard electrical outlet.
  • the interfaces for connecting the playback device 31 to both a device that authors a lighting program (e.g., a general purpose computer with software loaded thereon to perform the above-described functions) and for connecting with one or more lighting units 40 provide an interface that enables not only the transfer of data or other communication signals, but also sufficient electrical current to power the components within the playback device 31, thereby eliminating the need for a separate power interface.
  • the present invention is not limited to the use of any particular type of interface.
  • One example of a suitable interface that provides both communication and power is a USB port.
  • the playback device 31 may begin execution of a lighting sequence 20 upon loading of the lighting sequence 20 into the device 31; upon receiving a command or signal from a user interface, another device, or a sensor; at a specified time; or upon any other suitable condition.
  • the condition for initiation may be included in the lighting sequence 20, or may be determined by the configuration of the playback device 31.
  • the playback device 31 may begin execution of a lighting sequence 20 at a starting point other than the beginning of the lighting sequence 20. For example, playback device 31 may, upon receiving a request from the user, execute a lighting sequence 20 starting from a point three minutes from the beginning of the sequence, or at any other specified point, e.g., from the fifth effect, etc.
  • the playback device 31 may, upon receiving a signal from a user, a device or sensor, pause the playback, and, upon receiving a suitable signal, resume playback from the point of pausing.
  • the playback device 31 may continue to execute the lighting sequence 20 until the sequence terminates, or it may repeatedly replay the sequence until a command or signal is received from a user, device or sensor, until a specified time, or until any other suitable condition.
  • the playback device 31 may include a storage device 620, such as a memory unit, database, or other suitable module (e.g., a removable Flash memory), for storing lighting information. In accordance with one embodiment of the present invention, the storage device 620 is formed as a non-volatile memory device, such that once information is stored thereon, the information is maintained even when no power is provided to the playback device 31.
  • the lighting information may take any of many forms.
  • the storage device 620 may store a plurality of effects and instructions for converting those effects into a data format or protocol, such as DMX, RS-485, or RS- 232, suitable for controlling a plurality of lighting units 40.
  • the storage device 620 may be preconfigured for a set of stock effects, may receive effects and instructions in the form of an authored lighting sequence 20, or the storage device 620 may include a preconfigured set of stock effects which can be supplemented by additional effects provided in an authored lighting sequence 20. Preconfiguring the storage device 620 with a set of stock effects permits a reduction in the memory required to store a lighting sequence 20, because the lighting sequence 20 may omit conversion instructions for effects preconfigured into the playback device 31. In embodiments wherein the lighting sequence 20 includes stock effects designed by the author, suitable instructions may be included in lighting sequence 20 and stored in storage device 620, e.g., upon loading or execution of the lighting sequence 20.
  • the information stored within the storage device 620 need not be stored in the form of lighting effects and instructions for converting those effects into a data format suitable for controlling a plurality of light units, as such a conversion can be performed prior to storing the information in the storage device 620.
  • a lighting program may be transformed and stored on a storage medium (e.g., storage device 620) in a format which represents the final data stream suitable for directly controlling lighting units or other devices.
  • During playback, the lighting units 40 will go through a number of different states, in that the changing of an effect, or a parameter thereof, for any of the lighting units will result in a different state for the lighting units taken as a whole.
  • a playback rate can be established, and the program can be stored in the storage medium with a frame corresponding to each update period established by the playback rate. A frame has sufficient information to establish a full state of the lighting units 40 controlled by the program.
  • the storage medium stores the lighting program in a format so that there is a frame corresponding to each of the states of the lighting units.
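  • A sketch of what frame-based playback implies (send_to_units() is a hypothetical output routine): because every frame fully specifies the state of all lighting units, the playback device simply emits one frame per update period at the established playback rate, with no interpolation between frames.
      import time

      def play_frames(frames, send_to_units, playback_rate_hz=40):
          period = 1.0 / playback_rate_hz
          for frame in frames:
              send_to_units(frame)      # each frame is a complete state; no interpolation
              time.sleep(period)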
  • This is to be contrasted with other types of lighting unit playback devices, which do not store such complete frames, but rather store information that enables the playback device to interpolate and thereby generate the frames necessary to place the lighting units in each of the plurality of states to be achieved.
  • the embodiment of the present invention that stores a specific frame for each of the plurality of states is advantageous, in that it provides more flexibility in programming the lighting program.
  • other embodiments of the present invention are not limited in this respect, and they can transfer data to and store it within the storage medium in different formats.
  • the playback device 31 may include an external interface 650 whereby the playback device 31 can receive external signals useful for impacting (e.g., modifying) the execution or output of one or more stored lighting sequences 20.
  • the external interface 650 may include a user interface, which may in turn include switches, buttons, dials, sliders, a console, a keyboard, a speech recognition system, or any other device, such as a sensor, whereby a command or signal can be provided to the playback device 31 to otherwise influence the execution or output of the lighting sequence 20.
  • the external devices may be coupled to the playback device 31 via any suitable technique, including a direct wire connection or via RF or some other type of wireless connection.
  • the playback device 31 is provided with a processor 651 that receives the output of the storage device 620, and can act thereon to influence the played back output of the lighting sequence 20 stored within the storage device 620.
  • the external interface 650 is directly coupled to the processor 651, such that the processor can examine any external signals and commands and make decisions based thereon to influence the played back output of the lighting sequence 20.
  • an external command, cue or signal can also influence the execution order of a lighting sequence, by causing an alteration in the execution order of a lighting sequence, for example, by branching to places out-of-line in a particular lighting sequence or by branching out of the lighting sequence altogether.
  • commands, cues or signals received by the external interface 650 can be provided directly to the processor 651, which can then alter the playback sequence of a particular lighting sequence, go to the execution of stock effects, switch between lighting sequences, or take any other type of action relating to the execution order of lighting sequences from the storage device 620.
  • the playback device 31 further includes chronometers to provide timing references to the processor 651.
  • two such chronometers are employed, a first being a local time module 660, which functions as a counter for measuring time from a predetermined starting point, for example, when the playback device 31 is turned on or a point in time when the counter is reset.
  • a date time module 665 is provided which calculates the current date and time.
  • an output from each of the modules 660, 665 is provided to the processor 651, which enables the processor 651 to include timing based information in making decisions impacting any of numerous aspects discussed above relating to the playback output and order of lighting sequences from the storage device 620, including but not limited to the rate at which a lighting sequence is being played back, the intensity or any other parameter relating to a lighting sequence being played back, switching between lighting sequences based upon a particular timing event, etc.
  • each of the timing modules 660, 665 can receive communications from an external source, for example, to reset the timing modules, to load a value therein, etc.
  • dedicated inputs to the timing modules 660, 665 need not be employed, as the modules can alternatively receive communications from external sources via other paths, e.g., from the external interface 650, from the loader 610, from an output of the processor 651, etc., as the embodiment of the present invention that employs such timing modules is not limited to any particular implementation.
  • While the timing modules 660, 665 provide the advantages described above, it should be appreciated that they are optional, as some embodiments of the present invention need not employ any timing modules at all.
  • external signals received, via external interface 650 can be provided directly to the processor 651, which can then take any of the various actions described above based on the external signals, e.g., altering the rate at which lighting sequences are played back, branching within or between lighting sequences, altering brightness or other parameters of lighting sequences being played back, etc.
  • a cue table 630 is also provided to compare or interpret external signals received via the external interface 650, and to provide information related thereto to the processor 651.
  • the cue table 630 may contain information relating to various inputs or conditions received by the external interface 650, as designated by the author of a lighting sequence 20, to affect the execution or output of the lighting sequence.
  • the cue table can include a list of if/then statements, other types of boolean expressions, or any other types of functions to interpret actions to be taken during execution of the lighting program based upon the information received from various inputs or conditions.
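The if/then character of such a cue table might be sketched as follows; the class and method names (CueTable, interpret, and the selectNextSequence stand-in used in the usage note) are illustrative assumptions rather than the patent's API.

```java
// Hypothetical cue table: each entry pairs a condition on an external signal
// with an action that alters execution of the lighting program.
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntPredicate;

class CueTable {
    private static class Cue {
        final IntPredicate condition;
        final Runnable action;
        Cue(IntPredicate condition, Runnable action) {
            this.condition = condition;
            this.action = action;
        }
    }

    private final List<Cue> cues = new ArrayList<>();

    void add(IntPredicate condition, Runnable action) {
        cues.add(new Cue(condition, action));
    }

    // Interpret an incoming external signal value: the first matching cue
    // decides what the processor should do (branch, switch sequences, etc.).
    void interpret(int signalValue) {
        for (Cue cue : cues) {
            if (cue.condition.test(signalValue)) {
                cue.action.run();
                return;
            }
        }
    }
}
```

A cue could then be registered as, say, `table.add(v -> v > 200, () -> selectNextSequence())`, where `selectNextSequence()` stands in for whatever sequence-switching action the processor 651 exposes.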
  • the playback device 31 may alter the execution or output of the lighting sequence 20 as indicated by the program, based upon information that is stored within the cue table 630 and provided to the processor 651.
  • the signals received by the external interface 650 can be provided either directly to the processor 651 or can be interpreted via the cue table 630.
  • in another embodiment of the invention, the signals received by the external interface 650 are not provided directly to the processor 651, so that they are always interpreted via the cue table 630.
  • the cue table 630 can be eliminated.
  • the playback device 31 may respond to external signals in ways that are not determined by the contents and instructions of the lighting sequence 20.
  • the external interface 650 may include a dial, slider, or other feature by which a user may alter the rate of progression of the lighting sequence 20, e.g., by changing the speed of the local time counter 660, or by altering the interpretation of this counter by the playback device 31.
  • the external interface 650 may include a feature by which a user may adjust the intensity, color, or other characteristic of the output. In certain embodiments, a lighting sequence 20 may include instructions to receive a parameter for an effect from a feature or other user interface on the external interface 650, permitting user control over only specific effects during playback, rather than over all of the effects output to the system of lighting units as a whole.
  • the playback device 31 may also include a transient memory 640.
  • the transient memory 640 may store temporary information, such as the current state of each lighting unit under its control, which may be useful as a reference for the execution of the lighting sequence 20. For example, as described above, some effects may use the output of another effect to define a parameter; such effects may retrieve the output of the other effect as it is stored in the transient memory 640. It should be appreciated that the embodiment of the present invention that employs a transient memory is not limited to using it in this manner, as numerous other uses may be possible (e.g., as a scratch pad memory for the processor 651). Furthermore, various embodiments of the present invention can be implemented without using any transient memory at all.
  • the playback device 31 may send the data created by the execution of a lighting sequence 20 to the lighting units 40 in any of numerous ways, as the present invention is not limited to any particular technique.
  • the playback device 31 transmits such data to the lighting units 40 via a network output port 680, which can be any of numerous types of interfaces capable of communicating with the lighting units 40.
  • the network output 680 can be an interface for connection to the lighting units via wires or cables, via an IR, RF or other wireless transmission, over a computer network, any other suitable method of data transfer, or via any combination of techniques capable of controlling the lighting units 40 and/or any associated other devices. In the embodiment shown, the information read from the storage device 620 is passed through an output buffer 670 that is then coupled to the network output port 680.
  • the present invention is not limited in this respect, as no output buffer need be used in other embodiments.
  • the storage device 620 can be loaded with only a single lighting sequence 20 at any particular time, such that the playback device 31 is programmed to only play one particular lighting sequence 20.
  • execution of the single lighting sequence 20 can begin immediately upon the playback device 31 receiving power, and the lighting sequence 20 can be programmed to execute a set number of times (e.g., once or multiple times), or it can be programmed to continuously loop through multiple executions.
  • the playback device 31 is arranged to enable multiple lighting sequences 20 to be stored within the storage device 620.
  • some user interface is provided to enable a user to select which of the multiple lighting sequences 20 is to be played back at any particular time.
  • the present invention is not limited to the use of any particular type of user interface in this regard, as numerous techniques can be employed.
  • a simple button or switch can be employed that, when toggled, switches between the multiple lighting sequences 20 stored within the storage device 620.
  • the playback device 31 may not communicate directly with the lighting units, but may instead communicate with one or more subcontrollers which, in turn, control the lighting units or another level of subcontrollers, etc.
  • the use of subcontrollers permits distributive allocation of computational requirements.
  • An example of such a system which uses this sort of distributional scheme is disclosed in U.S. Patent No. 5,769,527 to Taylor, described therein as a "master/slave" control system.
  • Communication between the various levels may be unidirectional, wherein the playback device 31 provides instructions or subroutines to be executed by the subcontrollers, or bidirectional, where subcontrollers relay information back to the controller 30, for example, to provide information useful for effects which rely on the output of other effects as described above, for synchronization, or for other purposes.
  • the playback device 31 architecture permits effects to be based on external environmental conditions or other input.
  • An effect is a predetermined output involving one or more lighting units. For example, fixed color, color wash, and rainbow wash are all types of effects.
  • An effect may be further defined by one or more parameters, which specify, for example, lights to control, colors to use, speed of the effect, or other aspects of an effect.
  • the environment refers to any external information that may be used as an input to modify or control an effect or the playback of one or more lighting sequences, such as the current time or external inputs such as switches, buttons, or other transducers capable of generating control signals, or events generated by other software or effects.
  • an effect may contain one or more states, so that the effect can retain information over the course of time. A combination of the state, the environment, and the parameters may be used to fully define the output of an effect at any moment in time, and over the passage of time.
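A minimal sketch of this idea in Java is shown below; the Effect interface and the ColorWash class are illustrative assumptions rather than the patent's classes, but they show how parameters (channels, period), retained state (phase), and the environment (here, time) together determine an effect's output.

```java
// Illustrative effect: output depends on parameters, retained state, and the
// environment supplied on each invocation.
interface Effect {
    // Returns channel values for the lights this effect controls,
    // or null if it does not wish to change them this period.
    int[] update(long timeMs);
}

class ColorWash implements Effect {
    private final int[] channels;   // parameter: which channels to drive
    private final int periodMs;     // parameter: wash period
    private int phase;              // state retained between invocations

    ColorWash(int[] channels, int periodMs) {
        this.channels = channels;
        this.periodMs = periodMs;
    }

    @Override
    public int[] update(long timeMs) {
        // environment (time) plus state and parameters determine the output;
        // a simple ramp stands in for a real color wash here
        phase = (int) ((timeMs % periodMs) * 255 / periodMs);
        int[] out = new int[channels.length];
        for (int i = 0; i < out.length; i++) {
            out[i] = phase;
        }
        return out;
    }
}
```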
  • the playback device 31 may implement effect priorities. For example, different effects may be assigned to the same lights. By utilizing a priority scheme, differing weights can be assigned to effects assigned to the same lights. For example, in one embodiment only the highest priority effect will determine the light output. When multiple effects control a light at the same priority, the final output may be an average or other combination of the effect outputs.
  • An alternate embodiment of the present invention is directed to a playback device 1000, as shown in Fig. 7, that differs from the playback device 31 described above in that it does not include a loader 610 for loading lighting programs into the storage device 620. In accordance with this illustrative embodiment of the present invention, the playback device 1000 is not loadable with customized lighting programs by the user, but rather can be provided with a storage device 620 having one or more pre-installed lighting programs already loaded thereon, such that the lighting programs stored in the playback device 1000 are not modifiable by the user.
  • the playback device 1000 does not include a cue table 630, timing modules 660 or 665, or a transient memory 640.
  • any or all of these features can alternatively be provided, in much the same manner as described above in connection with the playback device 31 of Fig. 6.
  • the storage device 620 stores multiple lighting programs, in much the same manner as discussed above in connection with some embodiments of the playback device 31 in Fig. 6.
  • a first external interface 1002 is provided to receive an externally generated signal to select which lighting program stored within the storage device 620 is to be played back by the playback device 1000.
  • the first external interface 1002 is compatible with any of numerous types of user interfaces to enable selection of a particular lighting program to be played back.
  • a push button, toggle switch or other type of device can be used that when activated by the user, causes the processor 651 to select a next lighting program for playback, so that by repeatedly toggling the input device, a user can step through all of the lighting programs stored in the storage device 620 to select a desired program for execution.
  • the playback device 1000 further includes a second external interface 1004 that is compatible with another user interface to enable the user to vary a parameter of a lighting program being played back by the playback device 1000.
  • the parameter being varied can apply to all of the lighting effects in a lighting program (e.g., can influence the playback speed or intensity of an entire lighting program being played back) or can relate to only a subset (including only a single effect) of the lighting effects. Any of numerous types of lighting effect or parameter changes can be accomplished, as described above in connection with other embodiments of the present invention.
  • the user interface compatible with the second external interface 1004 can take any of numerous forms, as this embodiment of the present invention is not limited to the use of any particular type of interface.
  • the user interface may be capable of generating a plurality of different signals, which can be used to vary a parameter of the lighting program being played back, such as the playback speed, intensity of illumination, color of a particular portion of a lighting program (including adjustments in hue, saturation and/or intensity) or any other parameter.
  • the second external interface may provide a variable digital signal to the processor 651 depending on the setting or position of the user interface.
  • the user interface may supply an analog signal to the second external interface 1004, which can then convert the analog signal to a digital signal for communication to the processor 651.
  • While the embodiment of the present invention shown in Fig. 7 employs first and second external interfaces to perform the functions of selecting a particular lighting program to be played back and varying a lighting effect or parameter thereof, it should be appreciated that the present invention is not limited in this respect, and that other arrangements are possible, such as employing a single user interface to perform both of these functions.
  • a cue table 630 can be provided to interpret the information received from the first and second external interfaces 1002, 1004, rather than providing their outputs directly to the processor 651.
  • a lighting sequence as described above may be implemented using one or more subroutines, such as a Java program fragment.
  • Such subroutines may be compiled in an intermediate format, such as by using an available Java compiler to compile the program as byte codes.
  • In such a byte code format, the fragment may be called a sequence.
  • a sequence may be interpreted or executed by the playback device 31.
  • the sequence is not a stand-alone program, and adheres to a defined format, such as an instantiation of an object from a class, that the playback device 31 may use to generate effects.
  • the playback device 31 interprets the sequence, executing portions based on time or input stimuli.
  • a building block for producing a show is an effect object.
  • the effect object includes instructions for producing one specific effect, such as color wash, cross fade, or fixed color, based on initial parameters (such as which lights to control, start color, wash period, etc.) and inputs (such as time, environmental conditions, or results from other effect objects).
  • the sequence contains all of the information to generate every effect object for the show.
  • the playback device 31 instantiates all of the effect objects one time when the show is started, then periodically sequentially activates each one. Based on the state of the entire system, each effect object can programmatically decide if and how to change the lights it is controlling.
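A minimal conductor loop along these lines, assuming the Effect interface sketched earlier, might look like the following; the Conductor class and its field names are assumptions for illustration only.

```java
// Illustrative conductor loop: effect objects are built once when the show
// starts and then invoked sequentially on every update period.
import java.util.List;

class Conductor {
    private final List<Effect> effects;   // instantiated once from the sequence
    private final int updatePeriodMs;

    Conductor(List<Effect> effects, int updatePeriodMs) {
        this.effects = effects;
        this.updatePeriodMs = updatePeriodMs;
    }

    void run(long durationMs) throws InterruptedException {
        long start = System.currentTimeMillis();
        long elapsed;
        while ((elapsed = System.currentTimeMillis() - start) < durationMs) {
            for (Effect effect : effects) {
                int[] data = effect.update(elapsed);
                if (data != null) {
                    // a real conductor would write these values into the
                    // outgoing frame (see the channel-priority sketch below)
                }
            }
            Thread.sleep(updatePeriodMs);
        }
    }
}
```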
  • the run-time environment software running on the playback device 31 may be referred to as a conductor.
  • the conductor may be responsible for downloading sequences, building and maintaining a list of effect object instances, managing the interface to external inputs and outputs (including DMX), managing the time clock, and periodically invoking each effect object.
  • the conductor also maintains a memory (e.g., transient memory 640) that objects can use to communicate with each other.
  • a channel may be a single data byte at a particular location in the DMX universe.
  • a frame may be all of the channels in the universe. The number of channels in the universe is specified when the class is instantiated.
  • When an effect object sets the data for a particular channel, it may also assign that data a priority.
  • the priorities can be interpreted in any of numerous ways. For example, if the priority is greater than the priority of the last data set for that channel, then the new data may supersede the old data; if the priority is lesser, then the old value may be retained; and if the priorities are equal, then the new data value may be added to a running total and a counter for that channel may be incremented. When the frame is sent, the sum of the data values for each channel may be divided by the channel counter to produce an average value for the highest priority data. Of course, other ways of responding to established priorities are possible.
  • the channel priorities may all be reset to zero.
  • the to-be-sent data may be retained, so that if no new data is written for a given channel it maintains its last value; the data may also be copied to a buffer in case any effect objects are interested.
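The per-channel priority scheme just described might be implemented roughly as follows; PriorityFrame and its field names are assumptions, and the behavior mirrors the description above (higher priority supersedes, equal priorities are averaged when the frame is sent, priorities then reset, and unchanged channels keep their last value).

```java
// Sketch of the per-channel priority and averaging scheme.
class PriorityFrame {
    private final int[] sums;
    private final int[] counts;
    private final int[] priorities;
    private final int[] lastSent;

    PriorityFrame(int channels) {
        sums = new int[channels];
        counts = new int[channels];
        priorities = new int[channels];
        lastSent = new int[channels];
    }

    void set(int channel, int value, int priority) {
        if (priority > priorities[channel]) {          // new data supersedes
            priorities[channel] = priority;
            sums[channel] = value;
            counts[channel] = 1;
        } else if (priority == priorities[channel]) {  // equal: accumulate for averaging
            sums[channel] += value;
            counts[channel]++;
        }                                              // lower priority: ignored
    }

    // Produce the frame to transmit, then reset priorities; channels with no
    // new data keep their last value.
    int[] send() {
        for (int c = 0; c < sums.length; c++) {
            if (counts[c] > 0) {
                lastSent[c] = sums[c] / counts[c];
            }
            priorities[c] = 0;
            sums[c] = 0;
            counts[c] = 0;
        }
        return lastSent.clone();
    }
}
```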
  • the conductor is the run-time component of the playback device 31 that unites the various data and input elements. The conductor may download sequences, manage the user interface, manage the time clock and other external inputs, and sequence through the active effect objects.
  • the technique for downloading the sequence file into the conductor can vary depending on the hardware and transport mechanism.
  • the sequence object and various required classes may be loaded into memory, along with a reference to the sequence object.
  • more than one sequence object may be loaded into the conductor, but only one sequence may be active at a time.
  • the conductor can activate a sequence based on external inputs, such as the user interface or the time of day.
  • the playback device 31 can be implemented in any of numerous ways.
  • a single processor 651 is shown in the embodiment of Figure 6 to perform each of the functions described above, it should be appreciated that the present invention is not limited in this respect, and that the various functions described above as being performed by the processor 651 can be distributed among two or more processors or controllers, such that in one embodiment there is a dedicated controller to carry out each of the functions of the processor 651 described above.
  • FIG 8 illustrates a computer system 2009 for implementing this embodiment of the present invention.
  • the audio input can be provided in any of numerous ways. In the embodiment shown in Figure 8, the audio input is provided as audio data 2005 provided on a computer-readable medium 2007 accessible to the computer system 2009.
  • the computer-readable medium 2007 can take any of numerous forms, as the present invention is not limited to the use of any particular computer-readable medium. Examples of suitable computer-readable media include compact discs, floppy discs, hard discs, magnetic tapes, and volatile and non-volatile memory devices.
  • the audio data 2005 may be stored in any format suitable for the storage of digital data.
  • One popular format is the MPEG Layer III data compression algorithm, which is often used for transmitting files over the Internet, and is widely known as MP3.
  • the files stored in the MP3 format are typically processed by an MP3 decoder for playback.
  • MP3 is merely one of numerous types of formats suitable for the storage of digital data, with other examples including MIDI, MOD, CDA, WMA, AS and WAV. It should be appreciated that these are merely examples of suitable formats, and that there are other standards and formats that can be used, including formats that do not adhere to any particular standard.
  • the MP3 format compresses the data, it should be appreciated that other formats may not. It should further be appreciated that the present invention is not limited to use with data stored in any particular format.
  • the audio signal 2003 may be a digital signal, input to the computer system 2009 via a digital interface such as a USB, serial or parallel port or any other suitable interface, or may be an analog signal, input to the computer system 2009 via an audio jack or any other suitable interface. In accordance with one embodiment of the present invention, when the audio signal 2003 is provided in analog form, it can be converted (via an analog-to-digital converter not shown) within the computer system 2009, so that the audio signal can be processed digitally, which provides a number of advantages as discussed below.
  • the computer 2009 includes an audio decoder 2011 that accepts as an input either audio data 2005 which is stored on a computer readable medium 2007 coupled to the computer 2009, or an external audio signal 2003.
  • the audio decoder 2011 generates as an output information reflective of one or more characteristics of the audio signal that is input to the audio decoder (i.e., either the audio signal defined by the audio data 2005 or the external audio signal 2003).
  • the information characteristic of the audio input signal can take any of numerous forms, as the present invention is not limited to any particular technique for analyzing an audio signal. In accordance with one embodiment of the present invention, digital signal processing techniques are used to analyze the audio signal. It should be appreciated that there are many different types of computations that can be performed using digital signal processing techniques, and the present invention is not limited to any particular technique for analyzing the audio signal.
  • Examples of information characteristic of an audio signal include information relating to a frequency content and an intensity of the audio signal.
  • the audio decoder 2011 may generate time domain information for the audio input signal, representing the intensity of the audio signal over time.
  • the time domain information may be outputted as an array, wherein each array element is an integer representing the intensity of the audio signal for a given point in time, or in any other suitable format.
  • Audio decoder 2011 may further generate frequency domain information by performing a Laplace transform (examples of which include a Fourier transform and a fast Fourier transform (FFT)) of time domain information for the audio signal.
  • a fast Fourier transform is performed, but the present invention is not limited in this respect and can employ any suitable technique for analysis in the frequency domain.
  • the frequency domain information may be outputted as an array, wherein each array element is an integer representing the amplitude of the audio signal for a given frequency band.
  • Audio decoder 2011 may further generate frequency domain information by performing a fast Fourier transform (FFT) of time domain information for the audio signal.
  • the frequency domain information may be outputted as an array, wherein each array element can be an integer representing the amplitude of the signal for a given frequency band during a corresponding time frame.
  • the frequency domain information is the FFT of the corresponding time domain information for a particular time frame.
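To make the time-domain and frequency-domain outputs concrete, the sketch below computes an intensity value per frame of samples and a coarse magnitude spectrum. A naive DFT is used purely for clarity; an actual decoder would use an FFT, and the class and method names are assumptions rather than the patent's implementation.

```java
// Illustrative analysis of one frame of audio samples.
class AudioAnalyzer {
    // Average absolute amplitude of the frame: one integer per point in time.
    static int intensity(short[] frame) {
        if (frame.length == 0) {
            return 0;
        }
        long sum = 0;
        for (short s : frame) {
            sum += Math.abs(s);
        }
        return (int) (sum / frame.length);
    }

    // Magnitude of each of `bands` frequency bins for this frame (naive DFT).
    static int[] spectrum(short[] frame, int bands) {
        int n = frame.length;
        int[] out = new int[bands];
        for (int k = 0; k < bands; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += frame[t] * Math.cos(angle);
                im -= frame[t] * Math.sin(angle);
            }
            out[k] = (int) (Math.hypot(re, im) / n);
        }
        return out;
    }
}
```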
  • each channel for a single audio signal is analyzed separately by the audio decoder 2011, such that separate information is generated by analyzing the characteristics of the different channels. For example, using the example described above, wherein the information concerning an audio signal includes frequency domain information and time domain information, in one embodiment of the present invention the audio decoder 2011 generates separate frequency domain information and time domain information for each separate channel for a single input audio signal (e.g., audio data 2005 or external audio signal 2003).
  • the audio decoder 2011 can be implemented in any of numerous ways, as the present invention is not limited to any particular implementation technique.
  • the audio decoder 2011 can be implemented in dedicated hardware, or can be implemented in software executed on a processor (not shown) within the computer system 2009.
  • the audio decoder 2011 can be provided as an executable program written in any suitable computer programming language (e.g., Fortran, C, Java, C++, etc.).
  • the software for implementing the audio decoder 2011 can be stored on any computer readable medium accessible to the computer system 2009, including the computer readable medium 2007 that stores the audio data 2005, or any other computer readable media.
  • the software for implementing the audio decoder 2011 can, for example, be any one of a number of commercially available software programs that perform the above-described functions.
  • Examples of such commercially available software programs include MP3 players such as Winamp™, available from Nullsoft, Inc.
  • MP3 players include application programming interfaces (APIs) that enable third party add-on plug-in software components to interface with the MP3 player, and to take advantage of the functionality provided thereby, including the above-described information that the audio decoder 2011 provides concerning the characteristics of an audio input.
  • one embodiment of the present invention is directed to software, for execution on a computer system 2009, that acts as a plug-in to a commercially available MP3 player to provide the mapping functions described below to control a lighting network in response to an input audio signal (e.g., stored audio data 2005 or an external audio signal 2003).
  • the mapper 2015 performs a function that is similar in many respects to the playback function performed by the processor 651 and the storage device 620 (see e.g., Figures 6-7) in the embodiments discussed above.
  • the mapper 2015 can be provided with a lighting program (e.g., stored in a mapping table 2015t) that can include one or more variables to receive input values at execution time.
  • the mapper 2015 can receive the output of the audio decoder 2011, so that information concerning the characteristics of the input audio signal can be provided to the mapper 2015 to provide the input values for variables in the lighting program executed by the mapper 2015.
  • the mapper 2015 can execute lighting programs that each include only a single entry defining the manner in which control signals, to be passed to the lighting network, will be generated.
  • Each such lighting program for the mapper 2015 may be programmed using a number of if/then statements or Boolean logic to interpret the numerous varied permutations of inputs from the audio decoder 2011 relating to characteristics of the audio input signal, and may generate control signals to the lighting network accordingly.
  • the control signals transmitted to the lighting network will result in a changing light show as the input audio signal is played, as the characteristics of the audio signal will change over time, resulting in changing inputs to the mapper 2015 and, consequently, changing control signals sent to the lighting network.
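A single-entry mapping function of this kind might look like the following sketch, where the decoder's intensity and band outputs drive red, green and blue control values through simple if/then logic; the SimpleMapper name and the particular band-to-color assignments are illustrative assumptions, not the patent's mapping.

```java
// Hedged sketch of a single-entry mapping function from audio characteristics
// to RGB control values for one lighting unit.
class SimpleMapper {
    int[] map(int intensity, int[] bands) {
        int red = 0, green = 0, blue = 0;
        if (bands.length >= 3) {
            red = clamp(bands[0]);                    // low band drives red
            green = clamp(bands[bands.length / 2]);   // mid band drives green
            blue = clamp(bands[bands.length - 1]);    // high band drives blue
        }
        if (intensity < 10) {                         // near silence: dim everything
            red /= 4;
            green /= 4;
            blue /= 4;
        }
        return new int[] { red, green, blue };
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }
}
```

As the decoder's outputs change over the course of the audio signal, successive calls to map produce the changing control values described above.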
  • the mapping table 2015t can include lighting programs that include a plurality of lighting sequences, in much the same manner as the embodiments described above (e.g., in connection with Figures 6-7).
  • the mapper 2015 will step through various lighting sequences as the input audio signal is played back, which can result in a more varied light show, as not only will the inputs from the audio decoder 2011 change as the input audio signal is played back, but the mapping function executed by the mapper 2015 can also be programmed to change over time.
  • the embodiment of Figure 8 can be programmed (i.e., in the mapping table 2015t) with lighting programs that can achieve any of the lighting effects discussed above, including those described in connection with the systems in Figures 1-7.
  • the computer system 2009 includes a timer 2021 that provides an input to the mapper 2015.
  • the timer can be used in a manner similar to the timing modules 660, 665 discussed above in connection with the embodiment of Figure 6, but is an optional feature that need not be employed in all embodiments of the present invention. In accordance with one embodiment of the present invention, the timer 2021 is used to provide variation over time in the mapping function executed by the mapper 2015, to achieve resulting variation in the control signals sent to the lighting network during the playback of one or more audio input signals and thereby avoid redundancy in the lighting show produced in response to the audio signals.
  • This changing of the mapping function can be accomplished in any of numerous ways.
  • a variable can be provided that receives an input value from the timer 2021, such that the timer information can be taken into account in the mapping logic.
  • the mapper 2015 can use inputs received from the timer 2021 to index into the mapping table 2015t to select a different lighting program, or a different line within a particular lighting program, to change the mapping function.
  • the timer 2021 can include date and time information, such that the mapping function can change as a result of the date and/or time, or can include local time information so that the mapping function can be changed as a result of the amount of time that a particular lighting show has been executed in response to audio signal inputs.
  • an external interface 2045 is provided to receive additional user inputs that can be input to the mapper 2015 to impact the control signals sent to the lighting network. It should be appreciated that this is an optional feature, and need not be provided in every embodiment of the present invention.
  • the external interface 2045 can be of any of numerous types, including all of those discussed above in connection with the embodiments of Figures 1-7, and can control the lighting show produced by the mapper 2015 in any of the numerous ways discussed above.
  • one or more additional external inputs can provide an additional variable to the mapping function performed by the mapper 2015 to impact the control signals sent to the lighting network.
  • the external input received by the external interface 2045 can also be used to change between lighting programs provided by the mapping table 2015t.
  • the external interface 2045 is a graphical user interface (GUI) that can be displayed on a display of the computer system 2009 to facilitate a user in selecting a particular mapping function to be provided by the mapping table 2015t.
  • This aspect of the present invention can be implemented in any of numerous ways, and is not limited to any particular implementation technique.
  • a graphical user interface can be provided that lists various types of mapping functions that are considered to be particularly suitable for particular music types.
  • the user may then select a particular mapping function (e.g., from the mapping table 2015t) suited to the type of music to be played.
  • the particular mapping function employed can be selected based upon information provided with the audio signal that provides an indication of the type of music included therein.
  • some pieces of music can include a tag or other information in the music, or associated therewith, that identifies the type of music. In accordance with one embodiment of the present invention, such information can be used to select a mapping function that fits the style of music in much the same manner as described above.
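Selection of a mapping function from such a tag might be sketched as below, reusing the SimpleMapper class from the earlier sketch; the genre strings, the MappingSelector name, and the idea of a fallback mapper are assumptions for illustration rather than features described in the patent.

```java
// Hypothetical genre-based selection of a mapping function.
import java.util.HashMap;
import java.util.Map;

class MappingSelector {
    private final Map<String, SimpleMapper> table = new HashMap<>();
    private final SimpleMapper fallback = new SimpleMapper();

    void register(String genre, SimpleMapper mapper) {
        table.put(genre.toLowerCase(), mapper);
    }

    // Returns the mapper registered for the tag, or a default if the tag is
    // missing or unrecognized.
    SimpleMapper select(String genreTag) {
        if (genreTag == null) {
            return fallback;
        }
        return table.getOrDefault(genreTag.toLowerCase(), fallback);
    }
}
```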
  • changes in the mapping performed by the mapper 2015 can be accomplished in numerous ways, e.g., by including a variable in a single mapping function that can result in changes of the mapping output, or by switching between different mapping functions in the mapping table 2015t.
  • the changes in the mapping performed by the mapper 2015 can be accomplished in response to any of numerous stimuli, including input provided from an external input (e.g., from a user selecting a different mapping function), in response to timing information from the timer 2021, in response to some characteristic of an input audio signal (e.g., provided to the mapper 2015 by the audio decoder 2011), in response to a detection by the audio decoder that a particular audio signal (e.g., a song) has terminated and a new one is beginning, etc.
  • the computer system 2009 does not include a cue table 630 or a transient memory 640 as described in connection with the embodiment of Figure 6.
  • the cue table 630 can be provided between the external interface 2045 and the mapper 2015, and/or between the audio decoder 2011 and the mapper 2015 to assist in analyzing the inputs provided by the external interface 2045 and/or the characteristics of the input audio signal provided by the audio decoder 2011.
  • these features are optional, and need not be employed in all embodiments of the present invention.
  • the manner in which the characteristics of the input audio signal are analyzed by the mapper 2015 to impact the control signals sent to the lighting network to control the lighting show can be performed in any of numerous ways, as the present invention is not limited to any particular type of analysis.
  • the mapper 2015 can look for particular activity levels within a particular frequency band, can detect a beat of the music based upon pulses within particular frequency bands or overall activity of the input signal, can look for an interaction between two or more different frequency bands, can analyze intensity levels characteristic of a volume at which the audio signal is being played, etc.
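As one concrete example of such analysis, beat detection from pulses in a low-frequency band could be sketched as follows; the smoothing factor and threshold are arbitrary assumed values, not figures taken from the patent.

```java
// Illustrative beat detector: a beat is flagged when the low-frequency band
// rises well above its recent running average.
class BeatDetector {
    private double average = 0;
    private static final double SMOOTHING = 0.95;  // assumed smoothing factor
    private static final double THRESHOLD = 1.5;   // assumed beat threshold

    boolean isBeat(int lowBandAmplitude) {
        boolean beat = average > 0 && lowBandAmplitude > THRESHOLD * average;
        // update the running average after the comparison
        average = SMOOTHING * average + (1 - SMOOTHING) * lowBandAmplitude;
        return beat;
    }
}
```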
  • the external interface 2045 can also enable external inputs (e.g., inputs from a user) to change any of numerous variables within the mapping function to impact the lighting show produced.
  • the mapper 2015 can be implemented in any of numerous ways, including with dedicated hardware, or with software executed on a processor (not shown) within the computer system 2009. When implemented in software, the software can be stored on any computer readable medium accessible to the computer system 2009, including a computer readable medium 2007 that stores the audio data 2005.
  • the software that implements the mapper 2015 can be implemented as an executable program written in any number of computer programming languages, such as those discussed above.
  • the software can be implemented on a same processor that also executes software to implement the audio decoder 2011, or the computer system 2009 can be provided with separate processors to perform these functions.
  • one embodiment of the present invention is directed to the provision of a software plug-in that is compatible with commercially available MP3 players to enable the control of a lighting network in response to an audio signal being played by the MP3 player.
  • one embodiment of the present invention is directed to a computer readable medium encoded with a program that, when executed by a processor on a computer system such as 2009, interacts with an audio decoder 2011 of an MP3 player executing on the computer system 2009, and implements the functions of the mapper 2015 to generate the control signals necessary to control a lighting network as described above.
  • the lighting units 40 ( Figure 1) of the lighting network may be any type of light source, including incandescent, LED, fluorescent, halogen, laser, etc. Each lighting unit may be associated with a predetermined assigned address as discussed above.
  • the computer system 2009 may send control signals to the lighting network in any of numerous ways, as the present invention is not limited to any particular technique.
  • the computer system 2009 includes an output buffer 2019 and a network output port 2020 to facilitate transmission of control signals from the mapper 2015 to the lighting network.
  • the network output port 2020 can be any of numerous types of interfaces capable of communicating with the lighting network, including the numerous types of interfaces discussed above in connection with the output ports 680 described in connection with Figures 6-7.
  • the information outputted by the mapper 2015 is passed through an output buffer 2019 that is then coupled to the network output 2020.
  • the present invention is not limited in this respect, as no output buffer need be used.
  • the lighting programs in the mapping table 2015t, and the output from the mapper 2015, may not be in a format capable of directly controlling a lighting network, such that in one embodiment of the present invention a format conversion is performed.
  • formats for controlling a plurality of lighting units include data streams and data formats such as DMX, RS-485, RS-232, etc.
  • Any format conversion can be performed by the mapper 2015, or a separate converter can be employed.
  • the converter can be implemented in any of numerous ways, including in dedicated hardware or in software executing on a processor within the computer system 2009.
  • the computer system 2009 not only generates control signals to control a lighting network, but also drives one or more speakers to generate an audible sound from the audio input signal, with the audible sound being synchronized to the light show produced by the lighting network.
  • the computer system 2009 includes an audio player 2022 that reads audio data 2005 stored on the computer readable medium 2007, performs any processing necessary depending upon the format in which the audio data 2005 is stored (e.g., decompresses the data if stored in a compressed format) and passes the information to a speaker driver 2024 which can then drive one or more speakers to produce an audible sound.
  • the one or more speakers described above may include any device for generating audible output including, for example, headphones and loudspeakers.
  • the speaker driver 2024 can be implemented in any of numerous ways, as the present invention is not limited to any particular implementation technique.
  • the speaker drivers 2024 can be implemented on a sound card provided within the computer system 2009.
  • the audio player 2022 also can be implemented in any of numerous ways.
  • commercially available MP3 players include software that, when executed on a processor within the computer system 2009, perform the functions of the audio player 2022.
  • the external audio signal 2003 can be provided in either digital form, or in analog form.
  • the external audio signal may pass through an analog-to-digital converter (not shown) within the computer system 2009 prior to being passed to the audio decoder 2011. This conversion can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation.
  • the external audio signal can be provided to a sound card within the computer system 2009, which can perform the analog-to-digital conversion.
  • some synchronization may be performed to ensure that the lighting show produced on the lighting network is synchronized with the audible playing of the audio signal. This can be accomplished within the computer system 2009 in any of numerous ways. For example, when the audio player 2022 and audio decoder 2011 are provided as part of a commercially available MP3 player, the MP3 player will automatically perform this synchronization.
  • the analyzing of an audio input signal is performed essentially simultaneously with a playing of the audio signal to generate an audible sound.
  • the analysis of the audio input signal is performed prior to playing the audio signal to generate an audible sound. This can provide for some flexibility in performing the mapping of the audio input signal to control signals for the lighting network, as the mapping function can consider not only the characteristics of the audible signal that corresponds with the instant in time for the control signals being generated, but can also look ahead in the audio signal to anticipate changes that will occur, and thereby institute lighting effects in advance of a change in the audible playback of the audio signal.
  • the audio input signal can be analyzed prior to it being played to generate an audible output, and the results of that analysis (e.g., from the audio decoder 2011) can be stored in memory (e.g., in a transient memory such as 640 in Figure 6) or in the mapping table 2015t, for future reference by the mapper 2015 when the audio signal is audibly played.
  • the function performed by the mapper 2015 can look not only to characteristics of the music that correspond to the point in time with the audio signal being played, but can also look ahead (or alternatively behind) in the audio signal to anticipate changes therein.
  • rather than storing the outputs that are characteristic of the audio signal, another option is to perform the mapping at the time when the audio input signal is first analyzed, and to store the entire control signal sequence in memory (e.g., in the mapping table 2015t). Thereafter, when the audio signal is audibly played, the mapper 2015 need not do any analysis in real time, but rather can simply read out the previously defined control signals, which can, for example, be stored at a particular sample rate to then be played back when the audio signal is played to generate an audible signal. While the embodiment of the present invention directed to performing an analysis of the audio signal prior to playing it back provides the advantages described above, it should be appreciated that this is not a requirement of all embodiments of the present invention.
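The precomputation option can be illustrated with a small sketch in which the per-frame analysis (or the finished control values) is stored up front and then read back, possibly ahead of the current playback position, to anticipate upcoming changes; the PrecomputedAnalysis name and data layout are assumptions.

```java
// Illustrative precomputed analysis with look-ahead.
class PrecomputedAnalysis {
    private final int[] intensityPerFrame;   // one entry per update period
    private final int updatePeriodMs;

    PrecomputedAnalysis(int[] intensityPerFrame, int updatePeriodMs) {
        this.intensityPerFrame = intensityPerFrame;
        this.updatePeriodMs = Math.max(1, updatePeriodMs);
    }

    // Read the stored value at the current position plus an optional
    // look-ahead, clamped to the ends of the analysis.
    int intensityAt(long elapsedMs, int lookAheadFrames) {
        int index = (int) (elapsedMs / updatePeriodMs) + lookAheadFrames;
        index = Math.max(0, Math.min(index, intensityPerFrame.length - 1));
        return intensityPerFrame[index];
    }
}
```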
  • the lighting programs (e.g., entries in the mapping table 2015t) can be authored using an authoring system in much the same manner as described above in connection with the generation of lighting programs for the embodiments of Figures 1-7.
  • a graphical user interface can be provided to assist a user in generating the lighting programs.
  • the authoring can be performed on the same computer system 2009 that is used to playback the lighting program and generate the control signals to the lighting network, or the lighting programs can be authored on a different system, and then transferred, via a computer readable medium, to the mapping table 2015t in the computer system 2009.
  • the device used to control the lighting network 2001 need not have all of the functionality and capability of a computer system, for example it need not include a video monitor, keyboard, or other robust user interface. Furthermore, Applicants have appreciated that in many instances, it is desirable to provide a relatively small and inexpensive device to perform the lighting control function in response to an audio input, so that the device can be portable.
  • one embodiment of the present invention is directed to a lighting control device that includes all of the functionality described above in connection with Figure 8, but is implemented on a computer system that is dedicated to performing the functions described above, and is not a general purpose computer.
  • FIG 9 discloses a lighting control device 2027 for controlling lighting units 40 of a lighting network 2001 in response to audio input data or an input audio signal.
  • the lighting control device performs all of the functions of the embodiment illustrated in Figure 8, but is not implemented on a general purpose computer. Rather, the lighting control device is a device dedicated to performing only those functions described above, and need not include much of the functionality found in a general purpose computer, such as a full size display, a full alphanumeric keyboard, an operating system that enables processing of multiple applications simultaneously, etc.
  • the lighting control device can take any of numerous forms, as the present invention is not limited to any particular implementation.
  • Figure 10 illustrates a lighting control device 2030 that includes only a subset of the functionality provided in the embodiment of the invention shown in Figure 8.
  • the embodiment of the invention shown in Figure 10 does not include an audio player for generating an audio signal internally, and is not adapted to be coupled to a computer readable medium including audio data.
  • the lighting control device 2030 is adapted to receive an external audio signal 2003 from any suitable source, and to then process the audio signal, in much the same manner as the embodiment of Figure 8, to generate control signals for a lighting network to produce a lighting show based on the external audio input.
  • the lighting control device 2030 includes an audio decoder 2011 and a mapper 2015 (with its associated table 2015t) that each performs the functions described above in terms of analyzing an external audio input signal and generating commands for a lighting network based thereon, and further includes a network output port 2020 compatible with the lighting network.
  • the lighting control device 2030 may optionally include a timer 2021, output buffer 2019 and/or a cue table (not shown) that can perform the same functions described above in connection with the embodiment of Figure 8.
  • the lighting control device 2030 includes an external interface 2045 for receiving an external input 2046, which can take any of numerous forms as discussed above in connection with the embodiment of Figure 8.
  • the external interface 2045 is adapted to be a simple interface that is relatively inexpensive and compact.
  • the external interface can be used to perform any of numerous functions, such as to switch between lighting programs (e.g., entries in the mapping table 2015t), to vary lighting effects or parameters thereof, or any of the other functions discussed above in connection with the embodiments of Figures 1-9.
  • the external interface can take any of numerous forms, including switches, buttons, dials, sliders, a console, a keyboard, a speech recognition system or any other device, such as a sensor (e.g., responsive to light, motion or temperature) whereby a command or signal can be provided to the lighting control device 2030.
  • An external device may be coupled to the external interface 2045 via any suitable technique, including a direct wire connection, or via RF or some other type of wireless connection.
  • the lighting control device 2030 may receive the external audio signal using any suitable interface, such as a serial port, USB port, parallel port, IR receiver, a standard stereo audio jack, or any other suitable interface.
  • the components on the lighting control device 2030 can be powered in any of numerous ways, including through the provision of a power source (e.g., a battery) within the lighting control device 2030, or through the provision of an interface for receiving a power cord compatible with a standard electrical outlet.
  • the lighting control device 2030 is provided with neither an onboard power source nor an interface for a standard electrical outlet.
  • the interface for connecting the lighting control device 2030 to a lighting network 2001 enables not only the transfer of data or other communication signals, but also sufficient electrical current to power the components within the lighting control device 2030.
  • the need for a separate power interface may be thereby eliminated.
  • the present invention is not limited to the use of any particular type of interface.
  • One example of a suitable interface that provides both communication and power is a USB port.
  • the lighting control device 2030 may begin processing of the external audio signal 2003 and/or initiate the sending of control signals to the lighting network to initiate a lighting show either in response to a signal received at the external input 2046, or immediately upon receipt of the external audio signal 2003.
  • the lighting control device 2030 may initiate a lighting show at a specified time, or upon any suitable condition.
  • the lighting control device 2030 may continue to send control information to the lighting network until it no longer receives any external audio signal 2003, until a signal is received at the external input 2046, until the occurrence of a specified condition, until a particular point in time, or until any other suitable event. In one embodiment of the present invention, the lighting control device 2030 includes a storage device to store the mapping table 2015t.
  • the storage device can be a memory unit, database, or other suitable module (e.g., a removable Flash memory) for storing one or more lighting programs in the mapping table 2015t.
  • the storage device is formed as a non-volatile memory device, such that once information is stored thereon, the information is maintained, even when no power is provided to the lighting control device 2030.
  • any single component or collection of multiple components of the above-described embodiments that perform the functions described above can be generically considered as one or more controllers that control the above-discussed functions.
  • the one or more controllers can be implemented in numerous ways, such as with dedicated hardware, or using a processor that is programmed to perform the functions recited above.
  • one implementation of the present invention comprises at least one computer readable medium (e.g., a computer memory, a floppy disk, a compact disk, a tape, etc.) encoded with a computer program that, when executed on a processor, performs the above-discussed functions of the present invention.
  • the computer readable medium can be transportable such that the program stored thereon can be loaded onto any device having a processor to implement the aspects of the present invention discussed above.
  • the reference to a computer program that, when executed, performs the above-discussed functions is not limited to an application program, but rather is used herein in the generic sense to reference any type of computer code (e.g., software or microcode) that can be employed to program a processor to implement the above-discussed aspects of the present invention.
  • any reference to an LED is intended to encompass any light emitting semiconductor device.
  • any reference to a light or illumination unit generating a "color" refers to the generation of any frequency of radiation, including not only frequencies within the visible spectrum, but also frequencies in the infrared, ultraviolet and other areas of the electromagnetic spectrum.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

Methods and apparatus for executing a lighting program to control a plurality of light emitting diodes (LEDs) in response to at least one characteristic of an audio input. In one embodiment, the audio input is digitally processed to determine the at least one characteristic. In other embodiments, control signals for the LEDs are generated in response to a timer and/or input from a user interface, as well as in response to the at least one characteristic of the audio input. In another embodiment, the control signals for the LEDs are generated by the same computer that processes the audio input to transmit signals to speakers to audibly play the audio input. In a further embodiment, a GUI is provided to assist in authoring the lighting program. In another embodiment, the audio signal is processed before being played back. In a further embodiment, the lighting program anticipates changes in the audio input.

Description

METHOD AND APPARATUS FOR CONTROLLING A LIGHTING SYSTEM IN RESPONSE TO AN AUDIO INPUT
Field of the Invention
The present invention relates generally to methods and apparatus for controlling a lighting system, and more particularly to methods and apparatus for controlling a lighting system in response to an audio input.
Background of the Invention
The increased accessibility of music in digital formats has led to the development of computer software to interpret digitally formatted music. The software enables the music to be broadcast using speakers and other audio components that can be coupled to a computer system. One example of such computer software is the MP3 players which allow music files in MP3 format to be interpreted and played by a user. Some MP3 player software provides the additional feature of an on-screen visual interface whereby the motion of graphics displayed to the user is synchronized with aspects of the music, such as frequency or tempo.
While such software has the benefit of providing a visual means for the appreciation of music, it does not allow for any type of visual display via a device peripheral to the computer system. It is an object of the present invention to provide methods and apparatus for controlling a lighting display in response to an audio input.
Summary of the Invention
One embodiment of the invention is directed to a method for executing a lighting program to control a plurality of light emitting diodes (LEDs). The method comprises acts of: (A) receiving an audio input in digital form; (B) digitally processing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input. Another embodiment of the invention is directed to a computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of LEDs. The method comprises acts of: (A) receiving an audio input in digital form; (B) digitally processing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
Another embodiment of the invention is directed to an apparatus for executing a lighting program to control a plurality of LEDs. The apparatus comprises at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to digitally process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs. The at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input.
Another embodiment of the invention is directed to a computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of LEDs. The processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input. The method comprises acts of: (A) receiving information from the second program relating to the at least one characteristic of the audio input; (B) executing the lighting program to generate control signals to control the plurality of LEDs; and (C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input received from the first program.
Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs. The method comprises acts of: (A) receiving an audio input and an input from at least one timer; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
Another embodiment of the invention is directed to a computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of LEDs. The method comprises acts of: (A) receiving an audio input and an input from at least one timer; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
Another embodiment of the invention is directed to a computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of LEDs. The processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input. The method comprises acts of: (A) receiving information from the second program relating to the at least one characteristic of the audio input and an input from at least one timer; (B) executing the lighting program to generate control signals to control the plurality of LEDs; and (C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
Another embodiment of the invention is directed to an apparatus for executing a lighting program to control a plurality of LEDs. The apparatus comprises at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs. The at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input and an input from at least one timer.
Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs. The method comprises acts of: (A) receiving an audio input and an input from a graphical user interface; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the graphical user interface.
Another embodiment of the invention is directed to a method for execution on a computer. The method comprises acts of: (A) processing, on the computer, information indicative of an audio signal to generate a speaker-compatible signal indicative of the audio signal; (B) determining at least one characteristic of the audio signal; (C) executing, on the computer, a lighting program to generate control signals to control a plurality of LEDs; (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input; and (E) transmitting the speaker-compatible signal to a speaker to generate audible sound indicative of the audio signal.
Another embodiment of the invention is directed to a method for authoring a lighting program to control a plurality of LEDs in response to at least one characteristic of an audio input. The method comprises acts of: (A) providing a graphical user interface (GUI) that displays information representative of the plurality of LEDs, a plurality of lighting effects to be assigned thereto, and the at least one characteristic of the audio input; (B) selecting, based on at least one user input provided via the GUI, at least one of the plurality of lighting effects to correspond to at least one of the plurality of LEDs in response to the at least one characteristic of the audio input; and (C) creating a lighting program, based on the at least one user input, for generating control information for the plurality of LEDs.
Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs. The method comprises acts of: (A) receiving an audio input; (B) analyzing the audio input to determine at least one characteristic of the audio input; (C) storing information related to the at least one characteristic of the audio input; (D) executing the lighting program, after completion of the act (C), to generate control signals to control the plurality of LEDs; and (E) during execution of the lighting program in the act (D), reading the stored information and generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
Another embodiment of the invention is directed to a method for executing a lighting program to control a plurality of LEDs to create a light show. The method comprises acts of: (A) receiving an audio input having a duration and varying in time during the duration of the audio input; (B) processing the audio input to determine at least one first characteristic of the audio input at a first time during the duration; (C) executing the lighting program in synchronization with the audio input to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C) at a time that is prior to the first time during the duration of the audio input, generating at least one of the control signals based at least in part on the at least one first characteristic of the audio input so that the light show anticipates changes in the audio input.
Brief Description of the Drawings
Figure 1 illustrates a system for creating a lighting sequence and executing the lighting sequence on a plurality of lighting units according to one embodiment of the invention;
Figure 2 presents an exemplary method for creating a lighting effect in accordance with one embodiment of the invention;
Figure 3 depicts a representative interface for describing an arrangement of lighting units in accordance with another embodiment of the invention;
Figure 4 represents an alternate interface for graphically reproducing a lighting sequence;
Figure 5 portrays a representative interface for creating a lighting sequence in accordance with one embodiment of the invention;
Figure 6 shows one embodiment of an apparatus for executing a lighting sequence in accordance with another embodiment of the invention;
Figure 7 shows a block diagram of an alternate embodiment of the present invention directed to an apparatus for executing a lighting sequence;
Figure 8 is a diagram showing an apparatus for controlling a lighting network in response to an audio input according to another embodiment of the invention;
Figure 9 is a diagram showing an apparatus for controlling a lighting network in response to an audio input according to another embodiment of the invention; and
Figure 10 is a diagram showing an example of a lighting control device in the apparatus of Figure 9, according to one embodiment of the invention.
Detailed Description
As mentioned above, while music player software provides a convenient means of translating digitally formatted music for listening, and in some cases also provides a screen-based graphical interface for visually appreciating music, existing programs have limited functionality with respect to the visualization of music. For example, existing music player software does not allow for the visual display of music external to the computer. Such an external display would provide increased music-based feedback, thereby enhancing a user's overall sensory experience.
One embodiment of the present invention is directed to a method and apparatus for controlling a lighting network in response to an audio input. This can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation technique. In accordance with one illustrative embodiment, an audio input is digitally processed to analyze the audio input, and at least one aspect of a lighting system is controlled in response to a characteristic of the audio input. In another embodiment of the present invention, timing information is also considered so that the control signals sent to the lighting network for a particular audio input can vary over time, to avoid repetitiveness.
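By way of a non-limiting illustration only, the following C++ sketch outlines one possible realization of the approach just described: a block of digitally formatted audio samples is processed to derive a characteristic (here, a root-mean-square level chosen purely for illustration), and that characteristic, together with a timer input, is mapped to lighting control values. The function names, the choice of characteristic, and the mapping are assumptions made for this sketch and are not requirements of the invention.

    // Illustrative sketch only: maps the RMS level of a block of digital audio
    // samples to an LED intensity value, and derives a hue offset from a timer
    // input so that the same audio input need not always produce the same output.
    #include <algorithm>
    #include <cmath>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // Hypothetical characteristic: root-mean-square level of one sample block.
    double rmsLevel(const std::vector<double>& samples) {
        double sum = 0.0;
        for (double s : samples) sum += s * s;
        return samples.empty() ? 0.0 : std::sqrt(sum / samples.size());
    }

    int main() {
        std::vector<double> block = {0.1, -0.4, 0.6, -0.2, 0.3};   // stand-in audio data
        double level = rmsLevel(block);                             // determine a characteristic
        int timerTicks = 42;                                        // stand-in timer input

        // Derive control values from the characteristic and the timer input.
        std::uint8_t intensity = static_cast<std::uint8_t>(std::min(level, 1.0) * 255);
        std::uint8_t hueOffset = static_cast<std::uint8_t>(timerTicks % 256);

        std::cout << "intensity=" << int(intensity)
                  << " hueOffset=" << int(hueOffset) << "\n";
    }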
The assignee of the present application has previously developed other systems on which users can author lighting programs including one or more lighting sequences, as well as devices for playing back a lighting program to control a lighting system. Many of the features of those systems can be incorporated in the present invention to enable the control of a lighting system in response to an audio input. Therefore, a description will initially be provided of authoring software and playback devices for lighting programs to control a lighting system, before turning to the specific aspects of the present invention relating to performing such control in response to an audio input.
Overview of Systems for Authoring and Playing Back Lighting Programs to Control a Lighting Network
Figure 1 illustrates an example of a system for authoring and playing back a lighting program including one or more lighting sequences. The system of Figure 1 includes a processor 10 supporting a software application, having an interface 15, which can be used to create a lighting program 20, which may include one or more lighting sequences. The system further includes a lighting controller 30 which can execute or play back the lighting sequence 20 and in response thereto, control one or more lighting units 40. The term "sequence" in the context of this disclosure refers to two or more lighting effects spaced in time.
The software application may be implemented in any of numerous ways, as the invention is not limited to any particular implementation. For example, the software application may be a stand-alone application, such as an executable image of a C++ or Fortran program or other executable code and/or libraries, or may be implemented in conjunction with or accessible by a Web browser, e.g., as a Java applet or one or more HTML web pages, etc. Processor 10 may be any system for processing in response to a signal or data, as the present invention is not limited to any particular type of processor. For example, the processor 10 may comprise microprocessors, microcontrollers, other integrated circuits, computer software, computer hardware, electrical circuits, application-specific integrated circuits, personal computers, chips, and other devices alone or in combination capable of providing processing functions. For example, processor 10 can be any suitable data processing platform, such as a conventional IBM PC workstation operating the Windows operating system, a SUN workstation operating a version of the Unix operating system, such as Solaris, or any other suitable workstation. Controller 30 may communicate with lighting units 40 by radio frequency (RF), ultrasonic, auditory, infrared (IR), optical, microwave, laser, electromagnetic, any type of computer link (e.g., a network) or any other suitable transmission or connection technique. A suitable protocol may be used for transmission between the controller 30 and the lighting units 40, including sending pulse-width modulated signals over a protocol such as DMX, RS-485, RS-232, or any other suitable protocol. Lighting units 40 may be incandescent, LED, fluorescent, halogen, laser, or any other type of light source. Each lighting unit may be associated with a predetermined assigned address either unique to that lighting unit or overlapping the address of other lighting units to facilitate communication with the controller 30.
It should be appreciated from the foregoing that, in one embodiment of the present invention, control signals for driving the lighting units 40 can take the form of pulse-width modulated signals. Thus, the lighting units 40 may be driven with a fixed current or voltage that is then turned on or off in accordance with a pulse-width modulated control signal. Alternatively, the lighting units 40 may be driven using analog techniques where the current or voltage level is varied with time, pulse amplitude modulation, or any other technique that varies the power through the lighting units in response to a control signal.
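As a hedged illustration of the pulse-width modulation approach just described, the following C++ fragment maps an assumed 8-bit intensity value to the fraction of a fixed period during which an LED drive would be switched on; the 256-tick period and 8-bit resolution are arbitrary choices made for this sketch.

    // Sketch of software pulse-width modulation: an 8-bit intensity value sets
    // the fraction of each period during which the LED drive is switched on.
    #include <cstdint>
    #include <iostream>

    // Returns true if the output should be on at the given tick of a 256-tick period.
    bool pwmOutput(std::uint8_t intensity, unsigned tick) {
        return (tick % 256u) < intensity;   // duty cycle = intensity / 256
    }

    int main() {
        std::uint8_t intensity = 64;        // 25% brightness (assumed 8-bit resolution)
        unsigned onTicks = 0;
        for (unsigned tick = 0; tick < 256; ++tick)
            if (pwmOutput(intensity, tick)) ++onTicks;
        std::cout << "on for " << onTicks << " of 256 ticks\n";   // prints 64
    }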
In certain embodiments, a single component may be capable both of permitting a user to create a lighting program and controlling the lighting units, and the present invention is intended to encompass this and other variations on the system depicted in Figure 1 which can be used to implement the methods described below. For example, the processor 10 can have software loaded thereon to enable it to perform not only the authoring functions described below, but also the playback functions described below as being performed by the controller 30. In certain embodiments, the functions described below as being performed by the software application alternatively may be provided by a hardware device, such as a chip or card, or any other system capable of performing the functions described herein.
An illustrative method 200 for creating a lighting sequence is described making reference to Figure 2. According to the method, a user may select from among a set of predetermined 'stock' effects at step 210. The stock effects function as discrete elements or building blocks useful for assembling a sequence. Additionally, a user may compose a particular sequence and include that sequence in the stock effects to eliminate the need for creating repeated elements each time the effect is desired. For example, the set of stock effects may include a dimming effect and a brightening effect. A user may compose a pulse effect by specifying the alternation of the dimming and brightening effects, and include the pulse effect in the set of stock effects. Thus, each time a pulse effect is thereafter desired, the stock effect can be utilized without the need for repeatedly selecting dimming and brightening effects to achieve the same goal. In certain embodiments, stock effects may be created by a user via any programming language, such as Java, C, C++, or any other suitable language. Effects may be added to the set of stock effects by providing the effects as plug-ins, by including the effects in an effects file, or by any other technique suitable for organizing effects in a manner that permits adding, deleting, and altering the set of effects.
The user may indicate a time at which the selected effect should begin at step 220. For example, the user may indicate that a brightening effect should start three minutes after a sequence commences. Additionally, the user may select an ending time or duration for the selected effect at step 230. Thus, by indicating that the effect should end five minutes after the sequence commences, or equivalently by indicating that the effect should last for two minutes, a user may set the time parameters of the selected effect. Additional parameters may be specified by the user at step 240, as may be appropriate for the particular effect. For example, a brightening or dimming effect may be further defined by an initial brightness and an ending brightness. The rate of change may be predetermined, e.g., the dimming effect may apply a linear rate of dimming over the assigned timespan, or may be alterable by the user, e.g., may permit slow dimming at the beginning followed by a rapid drop-off, or by any other scheme the user specifies. Similarly, a pulse effect, as described above, might instead be characterized by a maximum brightness, a minimum brightness, and a periodicity, or rate of alternation. Additionally, the mode of alternation may be alterable by the user, e.g., the changes in brightness may reflect a sine function or alternating linear changes. In embodiments wherein color-changing lights are employed, parameters such as initial color, final color, rate of change, etc. may be specified by the user. It should be appreciated that the particular effects and parameters therefore described above are provided merely for illustrative purposes, and that the present invention is not limited to these effects or parameters, as numerous other lighting effects and parameters can be employed in accordance with the embodiments of the invention described herein.
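The following C++ sketch illustrates, under assumed names and a linear rate of change, how a stock brightening or dimming effect with the parameters discussed above (start time, duration, initial brightness, and ending brightness) might be represented and evaluated; it is one possible representation offered purely for illustration, not a required one.

    // Sketch of a parameterized fade effect evaluated at an arbitrary time.
    #include <algorithm>
    #include <iostream>

    struct FadeEffect {
        double startTime;        // seconds after the sequence commences
        double duration;         // seconds the effect lasts
        double startBrightness;  // 0.0 .. 1.0
        double endBrightness;    // 0.0 .. 1.0
    };

    // Linear interpolation between the initial and ending brightness (assumed rate).
    double evaluate(const FadeEffect& e, double t) {
        double f = (t - e.startTime) / e.duration;
        f = std::clamp(f, 0.0, 1.0);
        return e.startBrightness + f * (e.endBrightness - e.startBrightness);
    }

    int main() {
        FadeEffect brighten{180.0, 120.0, 0.0, 1.0};    // starts at 3 min, lasts 2 min
        std::cout << evaluate(brighten, 240.0) << "\n"; // halfway through: 0.5
    }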
Finally, the user may select, at step 250, one or more lighting units to execute the effect selected in step 210. In certain embodiments, a user may specify a transition between two effects which occur in sequence. For example, when a pulse effect is followed by a dimming effect, the pulse effect may alternate less rapidly, grow gradually dimmer, or vary less between maximum and minimum brightness towards the termination of the effect. Techniques for transitioning between these or other effects may be determined by the user for each transition, e.g., by selecting a transition effect from a set of predetermined transition effects, or by setting transition parameters for the beginning and/or end of one or both effects. In a further embodiment, users may specify multiple lighting effects for the same lighting unit that overlap in time or in location. These overlapping effects may be used in an additive or subtractive manner such that the multiple effects interact with each other. For example, a user could impose a brightening effect on a pulsing effect, with the brightening effect raising the minimum brightness parameter of the pulse to give the effect of pulsing slowly growing to a steady light.
In one embodiment of the invention, lighting effects can have priorities or cues attached to them which could allow a particular lighting unit to change effect on the receipt of a cue. This cue could be any type of cue, received externally or internally to the system, and includes, but is not limited to, a user-triggered cue such as a manual switch or bump button; a user-defined cue such as a certain keystroke combination or a timing key allowing a user to tap or pace for a certain effect; a cue generated by the system such as an internal clocking mechanism, an internal memory cue, or a software-based cue; a mechanical cue generated from an analog or digital device attached to the system such as a clock, external light or motion sensor, music synchronization device, sound level detection device, or a manual device such as a switch; a cue received over a transmission medium such as an electrical wire or cable, RF signal or IR signal; a cue that relates to a characteristic of an audio signal; or a cue received from a lighting unit attached to the system. The priority can allow the system to choose a default priority effect that is the effect used by the lighting unit unless a particular cue is received, at which point the system instructs the use of a different effect. This change of effect could be temporary, occurring only while the cue occurs or defined for a specified period, could be permanent in that it does not allow for further receipt of other effects or cues, or could be priority based, waiting for a new cue to return to the original effect or select a new one. Alternatively, the system could select effects based on the state of a cue and the importance of a desired effect. For instance, if a sound sensor sensed sudden noise, it could trigger a high priority alarm lighting effect overriding all the effects otherwise present or awaiting execution. The priority could also be state dependent where a cue selects an alternative effect or is ignored depending on the current state of the system. Again, it should be appreciated that the embodiments of the present invention that employ priorities or cues for various lighting effects are not limited to the particular types of cues and priorities discussed above, as numerous other types are possible.
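One possible, purely illustrative realization of such a priority scheme is sketched below in C++: each candidate effect carries a priority and a flag indicating whether its triggering cue is currently present, and the highest-priority effect whose cue is active is selected, falling back to a default effect otherwise. The effect names and priority values are assumptions made for the sketch.

    // Sketch of cue-driven effect selection with priorities: the default effect
    // runs unless a cue selects a higher-priority effect (e.g., an alarm).
    #include <iostream>
    #include <string>
    #include <vector>

    struct EffectEntry {
        std::string name;
        int priority;        // higher value wins
        bool cueActive;      // whether this effect's triggering cue is present
    };

    std::string selectEffect(const std::vector<EffectEntry>& table) {
        const EffectEntry* chosen = nullptr;
        for (const auto& e : table)
            if (e.cueActive && (!chosen || e.priority > chosen->priority))
                chosen = &e;
        return chosen ? chosen->name : "default";
    }

    int main() {
        std::vector<EffectEntry> table = {
            {"pulse", 1, true},            // ordinary effect, cue always satisfied
            {"alarm-flash", 10, false},    // would override everything if its cue fires
        };
        std::cout << selectEffect(table) << "\n";   // "pulse"
        table[1].cueActive = true;                  // sound sensor detects sudden noise
        std::cout << selectEffect(table) << "\n";   // "alarm-flash"
    }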
In certain embodiments, the outcome of one effect may be programmed to depend upon a second effect. For example, an effect assigned to a first lighting unit may be a random color effect, and an effect assigned to a second lighting unit may be designated to match the color of the random color effect. Alternatively, one lighting unit may be programmed to execute an effect, such as a flashing effect, whenever a second lighting unit meets a certain condition, such as being turned off. Even more complex arrangements, such as an effect which is initiated upon a certain condition of a first effect, matches the color of a second effect and the rate of a third effect, can be created by this scheme. It should be appreciated that the above-described examples of combinations of effects or parameters being dependent upon other effects or parameters are provided merely for illustrative purposes, as the present invention is not limited to these specific examples, as numerous other dependencies and combinations are possible. In still other embodiments, the systems and methods described herein permit the playback of a lighting sequence to be influenced by external inputs during performance such as any of the examples of cues described above. For example, a lighting sequence or effect may be programmed to start upon receipt of a cue or trigger signal, a sequence or effect may take precedence if a cue or trigger signal is received, a sequence or effect may be designated to repeat or continue until a cue or trigger signal is received, etc. Thus, instead of assigning a discrete start time to an effect or sequence, a user may instead designate that effect or sequence to begin when a certain stimulus is received. Furthermore, during creation, a user may designate two or more effects for overlapping or concurrent time periods and assign the effects different priorities or conditions to determine which effect is executed upon playback. In yet another embodiment, a user may link a parameter for an effect to an external input (e.g., any of the types of inputs described above, including analog, digital or manual inputs) such that the color, speed, or other attribute of an effect may depend on a signal from an external device, measuring, for example, volume, brightness, temperature, pitch, inclination, wavelength, or any other appropriate condition. Thus, the selection of a lighting sequence, the selection of an effect, or the selection of a parameter may be determined or influenced by input from an external source, such as a user, chronometer, device, audio source, or sensor. Of course, the types of external stimuli, cues and triggers described above, as well as the changes in a lighting effect or parameter influenced thereby, are provided merely for illustrative purposes, as numerous other variations are possible.
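By way of illustration only, the following C++ sketch shows one effect's output feeding another and a parameter tracking an external input: a first unit's random color is scaled by an assumed sound-level input, and a second unit is designated to match the first unit's color. All names and values are illustrative assumptions, not a required implementation.

    // Sketch of one effect's output feeding another: unit B copies the color
    // chosen by unit A's random-color effect, and A's brightness tracks an
    // external input (e.g., a measured sound level). All names are illustrative.
    #include <cstdint>
    #include <iostream>
    #include <random>

    struct Color { std::uint8_t r, g, b; };

    int main() {
        std::mt19937 rng(12345);
        std::uniform_int_distribution<int> channel(0, 255);

        double soundLevel = 0.7;   // stand-in external input, 0.0 .. 1.0

        // Unit A: random color, brightness scaled by the external input.
        Color a{static_cast<std::uint8_t>(channel(rng) * soundLevel),
                static_cast<std::uint8_t>(channel(rng) * soundLevel),
                static_cast<std::uint8_t>(channel(rng) * soundLevel)};

        // Unit B: designated to match unit A's color.
        Color b = a;

        std::cout << "A=(" << int(a.r) << "," << int(a.g) << "," << int(a.b) << ") "
                  << "B=(" << int(b.r) << "," << int(b.g) << "," << int(b.b) << ")\n";
    }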
In event-driven embodiments, such as those using external inputs and those using outputs of other effects as inputs, a menu may be provided to define inputs and the consequences thereof. For example, a palette of predetermined inputs may be provided to a user. Each input, such as a specified transducer or the output of another effect, may be selected and placed within an authored lighting sequence as a trigger for a new effect, or as a trigger to a variation in an existing effect. Known inputs may include, for example, thermistors, clocks, keyboards, numeric keypads, Musical Instrument Digital Interface ("MIDI") inputs, DMX control signals, TTL or CMOS logical signals, other visual or audio signals, or any other protocol, standard, or other signaling or control technique, whether analog, digital, manual, or any other form. The palette may also include a custom input, represented as, for example, an icon in a palette, or an option in a dropdown menu. The custom input may allow a user to define the characteristics of an input signal (e.g., its voltage, current, duration, and/or form (i.e., sinusoid, pulse, step, modulation)) that will operate as a control or trigger in a sequence.
For instance, a theatrical lighting sequence may include programmed lighting sequences and special effects in the order in which they occur, but requiring input at specified points before the next sequence or portion thereof is executed. In this way, scene changes may take place not automatically as a function of timing alone, but at the cue of a director, producer, stage hand, or other participant. Similarly, effects which need to be timed with an action on the stage, such as brightening when an actor lights a candle or flips a switch, dramatic flashes of lightning, etc., can be indicated precisely by a director, producer, stage hand, or other participant - even an actor - thereby reducing the difficulty and risk of relying on preprogrammed timing alone. As should be appreciated from the foregoing, input from sensors can also be used to modify lighting sequences. For example, a light sensor may be used to modify the intensity of the lights, for example, to maintain a constant lighting level regardless of the amount of sunlight entering a room, or to make sure a lighting effect is prominent despite the presence of other sources of light. A motion sensor or other detector may be used as a trigger to start or alter a lighting sequence. For example, a user may program a lighting sequence for advertising or display purposes to change when a person approaches a sales counter or display. Temperature sensors may also be used to provide input. For example, the color of light in a freezer may be programmed to be dependent on temperature, e.g., providing blue light to indicate cold temperature, changing gradually to red as the temperature rises, until a critical temperature is reached, whereupon a flashing or other warning effect may begin. Similarly, an alarm system may be used to provide a signal that triggers a lighting sequence or effect for providing a warning, distress signal, or other indication. An interactive lighting sequence may be created, e.g., wherein the executed effect varies according to a person's position, movements, or other actions. It should be appreciated that the types of sensors described herein, and their modifying effect on a light sequence, are provided merely for illustrative purposes, as numerous other types of sensors can be employed, and numerous other lighting effects or parameters can be modified in response to inputs from these or other types of sensors. In certain embodiments, a user may provide information representative of the number and types of lighting units and the spatial relationships between them. For example, an interface 300 may be provided as depicted in Figure 3, such as a grid or other two-dimensional array, that permits the user to arrange icons or other representative elements to represent the arrangement of the lighting units being used. In one embodiment, depicted in Figure 3, the interface 300 provides to a user a selection of standard types of lighting units 310, e.g., cove lights, lamps, spotlights, etc., such as by providing a selection of types of lighting units in a menu, on a palette, on a toolbar, etc. The user may then select and arrange the lighting units on the interface, e.g., within layout space 320 in an arrangement which approximates the physical arrangement of the actual lighting units. 
It should be appreciated that numerous different types of user interfaces can be employed, and that the embodiments of the present invention described herein are not limited to the use of any particular user interface, or any specific technique for representing the number and types of lighting units and their spatial relationship. In certain embodiments, the lighting units may be organized into different groups, e.g., to facilitate manipulation of a large number of lighting units. Lighting units may be organized into groups based on spatial relationships, functional relationships, types of lighting units, or any other scheme desired by the user. Spatial arrangements can be helpful for entering and carrying out lighting effects easily. For example, if a group of lights is arranged in a row and this information is provided to the system, the system can then implement effects such as a rainbow or a sequential flash without need for a user to specify a separate and individual program for each lighting unit. All the above types of implementation or effects could be used on a group of units as well as on single lighting units. The use of groups can also allow a user to enter a single command or cue to control a predetermined selection of lighting units. A lighting sequence can be tested or executed on a lighting system to experience the effects created by the user. Additionally, the interface 300 may be capable of reproducing a lighting sequence created by the user, for example, by recreating the programmed effects as though the icons on the interface were the lighting units to be controlled. Thus, if a lighting sequence specified that a certain lighting unit gradually brightens to a medium intensity, upon playback, the icon representing that lighting unit may start black and gradually lighten to gray. Similarly, color changes, flashing, and other effects can be visually represented on the interface. This function may permit a user to present a wholly or partially created lighting sequence on a monitor or other video terminal, pause playback, and modify the lighting sequence before resuming playback, to provide a highly interactive method for show creation. In a further embodiment, the system could allow fast-forwarding, reversing, rewinding, or other functions to allow editing of any portion of the lighting sequence. In a still further embodiment, the system could use additional interface features like those known in the art. This can include, but is not limited to, non-linear editing such as that used in Adobe software, or such devices or controls as scrolls, drag bars, or other devices and controls. An alternate interface 400 for reproducing a lighting sequence is presented in Figure 4. Interface 400 includes representations of lighting elements 410 and playback controls 420. It should be appreciated that the present invention is not limited to the above-described techniques for visualizing a lighting sequence, as numerous other techniques are possible.
An interface capable of representing the lighting sequence may also be used during authoring or entry of the lighting sequence. For example, a grid may be used in which lighting units are represented along one axis and time is represented along a second axis. Thus, when a user specifies that a certain lighting unit gradually brightens to a medium intensity, the portion of the grid defined by that lighting unit, the start time, and the ending time may appear black at one end of the grid portion and gradually lighten to gray at the other end of the grid portion. In this way, the effect can be visually represented to the user on the interface as the lighting sequence is being created. In certain embodiments, effects that are difficult to represent with a static representation, such as flashing, random color changes, etc., can be represented kinetically on the interface, e.g., by flashing or randomly changing the color of the defined grid portion. An example of an interface 500 representing a sequence for an assortment of three lighting units is shown in Figure 5. Time chart 510 visually depicts the output of each of the three lights at each moment in time according to the temporal axis 515. At a glance, the user can readily determine what effect is assigned to any lighting unit at any point in time, simplifying the coordination of effects across multiple lighting units and allowing rapid review of the lighting sequence.
Additionally, Figure 5 depicts a palette 520 which includes the stock effects from which a user may select lighting effects, although other techniques for providing the set of stock effects, such as by a menu, toolbar, etc., may be employed in the systems and methods described herein. In palette 520 there are provided icons for stock effects, including a fixed color effect 552, a cross fade between two color effects 554, a random color effect 558, a color wash effect 560, a chasing rainbow effect 565, a strobe effect 564, and a sparkle effect 568. This list is by no means exhaustive and other types of effects can be included. To assign an effect to a lighting unit, the user may select an effect from the palette and select a region of the grid corresponding to the appropriate lighting unit or units and the desired time interval for the effect. Additional parameters may be set by any suitable technique, such as by entering numerical values, selecting options from a palette, menu, or toolbar, drawing a vector, or any other technique known in the art, such as the parameter entry field 525. Other interfaces and techniques for entry of lighting sequences suitable for performing some or all of the various functions described herein may be used and are intended to be encompassed by the scope of this disclosure. Examples of functions and interfaces suitable for use with the invention may be found in "A Digital Video Primer," June, 2000, by the Adobe Dynamic Media Group, Adobe Systems, Inc., incorporated herein by reference.
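A minimal C++ sketch of the grid underlying such an interface, assuming a simple list of effect assignments keyed by lighting unit and time interval, is provided below; the data layout and effect names are illustrative assumptions only.

    // Sketch of the grid underlying an authoring interface such as that of
    // Figure 5: each assignment maps an effect name to a lighting unit and a
    // time interval, and the sequence can be queried for the active effect.
    #include <iostream>
    #include <string>
    #include <vector>

    struct Assignment {
        int unit;             // which lighting unit (row of the grid)
        double start, end;    // seconds (columns of the grid)
        std::string effect;
    };

    std::string effectAt(const std::vector<Assignment>& grid, int unit, double t) {
        for (const auto& a : grid)
            if (a.unit == unit && t >= a.start && t < a.end) return a.effect;
        return "off";
    }

    int main() {
        std::vector<Assignment> grid = {
            {0, 0.0, 60.0, "color wash"},
            {1, 30.0, 90.0, "strobe"},
        };
        std::cout << effectAt(grid, 1, 45.0) << "\n";   // "strobe"
        std::cout << effectAt(grid, 0, 75.0) << "\n";   // "off"
    }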
The methods described above can be readily adapted for controlling devices other than lighting units. For example, in a theatrical setting, fog machines, sound effects, wind machines, curtains, bubble machines, projectors, stage practicals, stage elevators, pyrotechnical devices, backdrops, and any other features capable of being controlled by a computer may be controlled by a sequence as described herein. In this way, multiple events can be automated and timed. For example, the user may program the lights to begin to brighten as the curtain goes up, followed by the sound of a gunshot as the fog rolls over the stage. In a home, for example, a program can be used to turn on lights and sound an alarm at 7:00 and turn on a coffee maker fifteen minutes later. Holiday lighting arrays, e.g., on trees or houses, can be synchronized with the motion of mechanical figurines or musical recordings. An exhibit or amusement ride can coordinate precipitation, wind, sound, and lights in a simulated thunderstorm. A greenhouse, livestock barn, or other setting for growing living entities can synchronize ambient lighting with automated feeding and watering devices. Any combination of electromechanical devices can be timed and/or coordinated by the systems and methods described herein. Such devices may be represented on an interface for creating the sequence as additional lines on a grid, e.g., one line for each separate component being controlled, or by any other suitable means. Effects of these other devices can also be visually represented to the user. For instance, continued use of a smoke machine could slowly haze out other grids, a coffee maker could be represented by a small representation of a coffee maker that appears to brew coffee on the interface as the action occurs at the device, or the interface can show a bar slowly changing color as feed is dispensed in a livestock barn. Other types of static or dynamic effects are also possible. In certain embodiments, wherein the lighting units are capable of motion, e.g., by sliding, pivoting, rotating, tilting, etc., the user may include instructions for the motion or movement of lighting units. This function may be accomplished by any means. For example, if the lighting unit includes a motor or other system capable of causing movement, the desired movement may be effected by selecting a motion effect from a set of motion effects, as described for lighting effects above. Thus, for example, a lighting unit capable of rotating on its base may be selected, and a rainbow wash effect may be programmed to occur simultaneously with a rotating motion effect. In other embodiments, lighting units may be mounted on movable platforms or supports which can be controlled independently of the lights, e.g., by providing an additional line on a grid interface as described above. Motion effects may also have parameters, such as speed and amount (e.g., an angle, a distance, etc.), that can be specified by the user. Such light/motion combinations may be useful in a wide variety of situations, such as light shows, planetarium presentations, moving spotlights, and any other scenario in which programmable moving lights may be desirable.
Similarly, instructions for controlling objects placed between a lighting unit and an object being illuminated, such as gobos, stencils, filters, lenses, irises and other objects through which light may pass, can be provided by a user according to the systems and methods described herein. In this manner, an even wider array of lighting effects may be designed and preprogrammed for later execution.
One embodiment of the present invention is directed to a computer system configured to design or create a lighting sequence according to the systems and methods described herein, e.g., by executing (e.g., on the processor 10 in Fig. 1) a computer program in a computer language, either interpreted or compiled, e.g., Fortran, C, Java, C++, etc. Another embodiment of the invention is directed to a disk, CD, or other computer-readable storage medium that encodes a computer program that, when executed, is capable of performing some or all of the functions described above which enable a user to create or design a lighting sequence which can be used to control a plurality of lighting units.
A lighting sequence may be recorded on a storage medium, such as a compact disk, floppy disk, hard drive, magnetic tape, volatile or non-volatile solid state memory device, or any other computer-readable storage medium. The lighting sequence may be stored in a format that records the effects and their parameters as created by a user, in a format converted from that format into a format which represents the final data stream, e.g., suitable for directly controlling lighting units or other devices, or in any other suitable format. In this respect, it should be appreciated that the format in which a lighting sequence is created in any of the manners described above may not be compatible for directly controlling a lighting network, such that some format conversion may be required between the format used for creating the lighting sequence, and a format for controlling a plurality of lighting units. When such a conversion is desired, it can be performed at various different times, as the embodiments of the present invention described herein are not limited to any particular conversion time or technique. Thus, the lighting sequence can be recorded on a storage medium either in the format in which it was created, in a format suitable for controlling a lighting network (such that the conversion will take place before storing the lighting sequence), or any other suitable format. Examples of formats that can be used for controlling a plurality of lighting units include data streams in data formats such as DMX, RS-485, RS-232, etc.
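As a hedged illustration of such a conversion, the following C++ sketch renders an authored brightening effect into a final per-frame stream of channel bytes of the general kind carried by protocols such as DMX; the frame rate, channel layout, and array-based representation are assumptions made for the sketch and do not reproduce any particular protocol's framing.

    // Sketch of converting an authored sequence into a final per-frame stream of
    // channel bytes; the three-channels-per-unit layout and frame rate are assumed.
    #include <cstdint>
    #include <iostream>
    #include <vector>

    int main() {
        const int units = 4, channelsPerUnit = 3;          // e.g., R, G, B per unit
        const double frameRate = 30.0, seconds = 2.0;
        const int frames = static_cast<int>(frameRate * seconds);

        // Each frame holds one byte per channel for every unit.
        std::vector<std::vector<std::uint8_t>> stream(
            frames, std::vector<std::uint8_t>(units * channelsPerUnit, 0));

        // Example conversion: a linear brightening on unit 0's first channel.
        for (int f = 0; f < frames; ++f)
            stream[f][0] = static_cast<std::uint8_t>(255.0 * f / (frames - 1));

        std::cout << stream.size() << " frames of "
                  << stream[0].size() << " channel bytes each\n";
    }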
It should be appreciated that lighting sequences may be linked to each other, e.g., such that at the conclusion of one sequence, another sequence is executed, or a master sequence may be created for coordinating the execution of a plurality of subsequences, e.g., based on external signals, conditions, time, randomly, etc.
Playback Devices
In one embodiment of the present invention, the same system that is used to author a lighting sequence can also be used to play it back and thereby control a plurality of lighting units 40. For example, when the lighting program is authored on a general purpose computer (e.g., including a display that comprises the interface 15 and a processor that serves as the processor 10 shown in Fig. 1), that same general purpose computer can play back the lighting program, and thereby perform the functions of the lighting controller 30 shown in Fig. 1. In this respect, the general purpose computer can be coupled to the plurality of lights 40 in any suitable manner, examples of which are discussed above.
It should be appreciated that in many instances, it may be desirable to author a lighting program on one device (e.g., a general purpose computer), but play it back on a different device. For example, a retail store may desire to author a lighting program that can then be played back at multiple retail locations. While it is possible to interconnect multiple locations to the device on which the lighting program was authored (e.g., over the Internet), it may be desirable in some circumstances to have each of the retail locations be capable of controlling playback of the lighting program individually. Furthermore, there may also be situations where lighting displays are mobile, such that it is not assured that every location in which it is desired to set up a lighting display will have access to the Internet or some other communication medium for connecting to the device on which the program is authored. In addition, it should be appreciated that it may be desirable for an organization to have only a single device with the capability of authoring a lighting program (i.e., having a display, relevant software, etc.), on which numerous different lighting programs can be authored. If playback of the lighting program were limited to the device on which it was authored, then only one of potentially numerous programs authored on a particular device could be played back at a time, which would severely restrict the usefulness of the system. In view of the foregoing, one embodiment of the present invention is directed to a system in which lighting programs are authored on one device as described above, and then transferred to a different device which plays back the lighting program and controls a lighting display. In accordance with one illustrative embodiment of the invention, the separate playback device can be a general purpose computer, with software loaded thereon to enable it to play back the lighting program. The transfer of the lighting program from the device on which it is authored to the device on which it is played back can be accomplished in any of numerous ways, such as by connection over a communication medium (e.g., via email over the Internet), or by loading the lighting program onto a portable computer readable medium (e.g., a disk, flash memory or CD) and physically transporting the medium between the two devices. In accordance with an alternate embodiment of the invention, Applicants have appreciated that the device used to play back a lighting program need not have all of the functionality and capability of the device used in authoring the program (e.g., it need not include a video monitor, a robust user interface, etc.). Furthermore, Applicants have appreciated that in many instances, it would be desirable to provide a relatively small and inexpensive device to perform the playback function, so that the device can be portable and such that if there are multiple instances of lighting systems on which a program is to be played back, separate devices can be used to control the playback on each of the lighting systems, to increase flexibility.
In view of the foregoing, one embodiment of the present invention is directed to a device, for playing back a lighting program, that includes less hardware and is less expensive than a more complex system that permits authoring of the lighting program. For example, the device need not include a lot of the functionality found in a general purpose computer, such as a full size display, a full alphanumeric keyboard, an operating system that enables processing of multiple applications simultaneously, etc. The playback device can take any of numerous forms, as the present invention is not limited to any particular implementation.
One illustrative implementation of a playback device 31 is shown in Figure 6. The playback device 31 may employ any suitable loader interface 610 for receiving a lighting program 20, e.g., an interface for reading a lighting program 20 from a storage medium such as a compact disk, diskette, magnetic tape, smart card, or other device, or an interface for receiving a transmission from another system, such as a serial port, USB (universal serial bus) port, parallel port, IR receiver, or other connection for receiving a lighting program 20. In certain embodiments, the lighting program 20 may be transmitted over networks (e.g., the Internet).
The components on the playback device 31 can be powered in any of numerous ways, including through the provision of a power source (e.g., a battery) within the playback device, or through the provision of an interface for receiving a power cord compatible with a standard electrical outlet. However, in accordance with one illustrative embodiment of the present invention, the playback device 31 is provided with neither an onboard power source nor an interface for a standard electrical outlet. Thus, in accordance with one illustrative embodiment of the invention, the interfaces for connecting the playback device 31 to both a device that authors a lighting program (e.g., a general purpose computer with software loaded thereon to perform the above-described functions) and for connecting with one or more lighting units 40 provide an interface that enables not only the transfer of data or other communication signals, but also sufficient electrical current to power the components within the playback device 31, thereby eliminating the need for a separate power interface. The present invention is not limited to the use of any particular type of interface. One example of a suitable interface that provides both communication and power is a USB port.
The playback device 31 may begin execution of a lighting sequence 20 upon the loading of the lighting sequence 20 into the device 31; upon receiving a command or signal from a user interface, another device, or a sensor; at a specified time; or upon any other suitable condition. The condition for initiation may be included in the lighting sequence 20, or may be determined by the configuration of the playback device 31. Additionally, in certain embodiments, the playback device 31 may begin execution of a lighting sequence 20 at a starting point other than the beginning of the lighting sequence 20. For example, playback device 31 may, upon receiving a request from the user, execute a lighting sequence 20 starting from a point three minutes from the beginning of the sequence, or at any other specified point, e.g., from the fifth effect, etc. In one embodiment, the playback device 31 may, upon receiving a signal from a user, a device or sensor, pause the playback, and, upon receiving a suitable signal, resume playback from the point of pausing. The playback device 31 may continue to execute the lighting sequence 20 until the sequence terminates, or it may repeatedly replay the sequence until a command or signal is received from a user, device or sensor, until a specified time, or until any other suitable condition.
The playback device 31 may include a storage device 620, such as a memory unit, database, or other suitable module (e.g., a removable Flash memory), for storing lighting information. In accordance with one embodiment of the present invention, the storage device 620 is formed as a non-volatile memory device, such that once information is stored thereon, the information is maintained, even when no power is provided to the playback device 31. The lighting information may take any of many forms. For example, the storage device 620 may store a plurality of effects and instructions for converting those effects into a data format or protocol, such as DMX, RS-485, or RS-232, suitable for controlling a plurality of lighting units 40. The storage device 620 may be preconfigured for a set of stock effects, may receive effects and instructions in the form of an authored lighting sequence 20, or the storage device 620 may include a preconfigured set of stock effects which can be supplemented by additional effects provided in an authored lighting sequence 20. Preconfiguring the storage device 620 with a set of stock effects permits a reduction in the memory required to store a lighting sequence 20, because the lighting sequence 20 may omit conversion instructions for effects preconfigured into the playback device 31. In embodiments wherein the lighting sequence 20 includes stock effects designed by the author, suitable instructions may be included in lighting sequence 20 and stored in storage device 620, e.g., upon loading or execution of the lighting sequence 20. It should be appreciated that the information stored within the storage device 620 need not be stored in the form of lighting effects and instructions for converting those effects into a data format suitable for controlling a plurality of light units, as such a conversion can be performed prior to storing the information in the storage device 620.
As mentioned above, in one embodiment of the present invention, a lighting program may be transformed and stored on a storage medium (e.g., storage device 620) in a format which represents the final data stream suitable for directly controlling lighting units or other devices. It should be appreciated that during the execution of a lighting program, the lighting units 40 will go through a number of different states, in that the changing of an effect, or a parameter thereof, for any of the lighting units will result in a different state for the lighting units taken as a whole. When a lighting program is authored, a playback rate can be established, and the program can be stored in the storage medium with a frame corresponding to each update period established by the playback rate. A frame has sufficient information to establish a full state of the lighting units 40 controlled by the program. Thus, in accordance with one embodiment of the present invention, the storage medium stores the lighting program in a format so that there is a frame corresponding to each of the states of the lighting units. This is to be contrasted with other types of lighting unit playback devices, which do not store such complete frames, but rather, store information that enables the playback device to interpolate and thereby generate the frames necessary to place the lighting units in each of the plurality of states to be achieved. The embodiment of the present invention that stores a specific frame for each of the plurality of states is advantageous, in that it provides more flexibility in programming the lighting program. However, it should be appreciated that other embodiments of the present invention are not limited in this respect, and they can transfer data to and store it within the storage medium in different formats.
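The following C++ sketch illustrates, under assumed values, the frame-per-state playback described above: because each state is stored as a complete frame at the authored update rate, playback reduces to indexing the stored frames by elapsed time, with no interpolation required. The update rate and frame contents are illustrative assumptions.

    // Sketch of frame-based playback: every state of the lighting units is stored
    // as a complete frame, so playback is a lookup by elapsed time at the authored
    // update rate, with no interpolation.
    #include <cstddef>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    int main() {
        const double updateRate = 25.0;                      // frames per second (assumed)
        std::vector<std::vector<std::uint8_t>> frames(250,   // 10 seconds of stored frames
            std::vector<std::uint8_t>(12, 0));               // 12 channel bytes per frame

        double elapsed = 4.2;                                 // seconds since playback began
        std::size_t index = static_cast<std::size_t>(elapsed * updateRate);
        if (index >= frames.size()) index = frames.size() - 1;

        std::cout << "sending frame " << index << " of " << frames.size() << "\n";
    }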
In one embodiment, the playback device 31 may include an external interface 650 whereby the playback device 31 can receive external signals useful for impacting (e.g., modifying) the execution or output of one or more stored lighting sequences 20. For example, the external interface 650 may include a user interface, which may in turn include switches, buttons, dials, sliders, a console, a keyboard, a speech recognition system, or any other device, such as a sensor, whereby a command or signal can be provided to the playback device 31 to otherwise influence the execution or output of the lighting sequence 20. The external devices may be coupled to the playback device 31 via any suitable technique, including a direct wire connection or via RF or some other type of wireless connection. The manner in which an external command or signal can influence execution or output of the lighting sequence 20 can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation. In the illustrative embodiment shown in Figure 6, the playback device 31 is provided with a processor 651 that receives the output of the storage device 620, and can act thereon to influence the played back output of the lighting sequence 20 stored within the storage device 620. In the embodiment shown, the external interface 650 is directly coupled to the processor 651, such that the processor can examine any external signals and commands and make decisions based thereon to influence the played back output of the lighting sequence 20. As mentioned elsewhere herein, there are numerous types of external commands, cues and signals that can be provided and also numerous ways in which they can influence the execution of a lighting sequence, such that the present invention is not limited to any particular commands, cues or signals, nor any particular manner of influencing the playback of a lighting sequence. In addition to influencing the played back output of a lighting sequence 20, an external command, cue or signal can also influence the execution order of a lighting sequence, by causing an alteration in the execution order of a lighting sequence, for example, by branching to places out-of-line in a particular lighting sequence or by branching out of the lighting sequence altogether. Thus, as shown in Figure 6, commands, cues or signals received by the external interface 650 can be provided directly to the processor 651, which can then alter the playback sequence of a particular lighting sequence, go to the execution of stock effects, switch between lighting sequences, or take any other type of action relating to the execution order of lighting sequences from the storage device 620. In the embodiment shown in Figure 6, the playback device 31 further includes chronometers to provide timing references to the processor 651. In the embodiment shown, two such chronometers are employed, a first being a local time module 660, which functions as a counter for measuring time from a predetermined starting point, for example, when the playback device 31 is turned on or a point in time when the counter is reset. In addition, a date time module 665 is provided which calculates the current date and time. 
In the embodiment shown, an output from each of the modules 660, 665 is provided to the processor 651, which enables the processor 651 to include timing based information in making decisions impacting any of numerous aspects discussed above relating to the playback output and order of lighting sequences from the storage device 620, including but not limited to the rate at which a lighting sequence is being played back, the intensity or any other parameter relating to a lighting sequence being played back, switching between lighting sequences based upon a particular timing event, etc. In the embodiment shown in Figure 6, each of the timing modules 660, 665 can receive communications from an external source, for example, to reset the timing modules, to load a value therein, etc. It should be appreciated that a dedicated input port for the timing modules 660, 665 need not be employed, as they can alternatively receive communications from external sources via other paths, e.g., from the external interface 650, from the loader 610, from an output of the processor 651, etc., as the embodiment of the present invention that employs such timing modules is not limited to any particular implementation. In addition, while the timing modules, 660, 665 provide the advantages described above, it should be appreciated that they are optional, as some embodiments of the present invention need not employ any timing modules at all.
As discussed above, in one embodiment of the present invention, external signals received, via external interface 650, can be provided directly to the processor 651, which can then take any of the various actions described above based on the external signals, e.g., altering the rate at which lighting sequences are played back, branching within or between lighting sequences, altering brightness or other parameters of lighting sequences being played back, etc. In the embodiment of the invention shown in Figure 6, a cue table 630 is also provided to compare or interpret external signals received via the external interface 650, and to provide information related thereto to the processor 651. The cue table 630 may contain information relating to various inputs or conditions received by the external interface 650, as designated by the author of a lighting sequence 20, to affect the execution or output of the lighting sequence. The cue table can include a list of if/then statements, other types of boolean expressions, or any other types of functions to interpret actions to be taken during execution of the lighting program based upon the information received from various inputs or conditions. Thus, if the playback device 31 compares an input to the cue table 630 and determines that a condition has been satisfied or a designated signal has been received, the playback device 31 may alter the execution or output of the lighting sequence 20 as indicated by the program, based upon information that is stored within the cue table 630 and provided to the processor 651. In the embodiment shown in Figure 6, the signals received by the external interface 650 can be provided either directly to the processor 651 or can be interpreted via the cue table 630. It should be appreciated that other configurations are possible, as the present invention is not limited to the particular implementation shown in Figure 6. For example, the signals received by the external interface 650 can, in another embodiment of the invention, not be sourced directly to the processor 651, such that they can always be interpreted via the cue table 630. Alternatively, in another embodiment of the invention, the cue table 630 can be eliminated. In certain embodiments, the playback device 31 may respond to external signals in ways that are not determined by the contents and instructions of the lighting sequence 20. For example, the external interface 650 may include a dial, slider, or other feature by which a user may alter the rate of progression of the lighting sequence 20, e.g., by changing the speed of the local time counter 660, or by altering the interpretation of this counter by the playback device 31. Similarly, the external interface 650 may include a feature by which a user may adjust the intensity, color, or other characteristic of the output. In certain embodiments, a lighting sequence 20 may include instructions to receive a parameter for an effect from a feature or other user interface on the external interface 650, permitting user control over only specific effects during playback, rather than over all of the effects output to the system of lighting units as a whole.
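One purely illustrative way to represent such a cue table in C++ is as a list of condition/action pairs, as sketched below; the encoding of external inputs as integer values and the action strings handed to the processor are assumptions made for the sketch.

    // Sketch of a cue table as a list of if/then entries: each entry pairs a
    // predicate over an external input with an action name for the processor.
    #include <functional>
    #include <iostream>
    #include <string>
    #include <vector>

    struct CueEntry {
        std::function<bool(int)> condition;   // "if" part, applied to an input value
        std::string action;                   // "then" part, interpreted by the processor
    };

    int main() {
        std::vector<CueEntry> cueTable = {
            {[](int v) { return v > 200; }, "switch to alarm sequence"},
            {[](int v) { return v == 0;  }, "pause playback"},
        };

        int externalInput = 230;   // stand-in value from the external interface
        for (const auto& entry : cueTable)
            if (entry.condition(externalInput))
                std::cout << entry.action << "\n";   // "switch to alarm sequence"
    }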
It should be appreciated that the specific types of external interfaces described above, as well as their specific impacts on a lighting sequence, are provided merely for illustrative purposes, as numerous other types of interfaces and impacts on a lighting sequence are possible. Thus, the embodiment of the present invention related to the use of an external interface to impact the playing back of the lighting sequence is not limited to the specific examples described above. Furthermore, although this embodiment of the present invention includes a number of advantages as described above, it should be appreciated that an external interface is not a requirement of other aspects of the present invention, as various embodiments of the present invention need not employ an external interface at all.
The playback device 31 may also include a transient memory 640. The transient memory 640 may store temporary information, such as the current state of each lighting unit under its control, which may be useful as a reference for the execution of the lighting sequence 20. For example, as described above, some effects may use the output of another effect to define a parameter; such effects may retrieve the output of the other effect as it is stored in the transient memory 640. It should be appreciated that the embodiment of the present invention that employs a transient memory is not limited to using it in this manner, as numerous other uses may be possible (e.g., as a scratch pad memory for the processor 651). Furthermore, various embodiments of the present invention can be implemented without using any transient memory at all.
The playback device 31 may send the data created by the execution of a lighting sequence 20 to the lighting units 40 in any of numerous ways, as the present invention is not limited to any particular technique. In the embodiment shown in Figure 6, the playback device 31 transmits such data to the lighting units 40 via a network output port 680, which can be any of numerous types of interfaces capable of communicating with the lighting units 40. For example, the network output 680 can be an interface for connection to the lighting units via wires or cables, via an IR, RF or other wireless transmission, over a computer network, any other suitable method of data transfer, or via any combination of techniques capable of controlling the lighting units 40 and/or any other associated devices. In the embodiments shown, the information read from the storage device 620 is passed through an output buffer 670 that is then coupled to the network output port 680. However, it should be appreciated that the present invention is not limited in this respect, as no output buffer need be used in other embodiments.
In one embodiment of the present invention, the storage device 620 can be loaded with only a single lighting sequence 20 at any particular time, such that the playback device 31 is programmed to only play one particular lighting sequence 20. In accordance with this embodiment of the present invention, execution of the single lighting sequence 20 can begin immediately upon the playback device 31 receiving power, and the lighting sequence 20 can be programmed to execute a set number of times (e.g., once or multiple times), or it can be programmed to continuously loop through multiple executions.

In an alternate embodiment of the present invention, the playback device 31 is arranged to enable multiple lighting sequences 20 to be stored within the storage device 620. In accordance with this embodiment of the present invention, some user interface is provided to enable a user to select which of the multiple lighting sequences 20 is to be played back at any particular time. The present invention is not limited to the use of any particular type of user interface in this regard, as numerous techniques can be employed. In one embodiment of the present invention, it is desirable to minimize the size, cost and complexity of the playback device 31. In accordance with that embodiment of the present invention, a simple button or switch can be employed that, when toggled, switches between the multiple lighting sequences 20 stored within the storage device 620.
In the embodiment shown in Figure 6, separate data paths are shown for providing input to the timing modules 660, 665, the loader 610, the external interface 650 and the network output port 680. It should be appreciated that numerous other implementations are possible that can reduce the number of input/output ports on the playback device 31. For example, a single data path can be shared for providing data to the timing modules 660, 665 and the loader 610. In addition, a bi-directional input/output interface can be used so that the data path for loading the storage device 620 can be shared with the data path for providing an output to the plurality of lighting units. In addition, to reduce the number of input/output ports on the device, serial (rather than parallel) interfaces can be employed. Thus, as should be appreciated from the foregoing, numerous techniques are possible for configuring the input/output ports of the playback device 31, as the present invention is not limited to any particular implementation technique.
In certain embodiments, the playback device 31 may not communicate directly with the lighting units, but may instead communicate with one or more subcontrollers which, in turn, control the lighting units or another level of subcontrollers, etc. The use of subcontrollers permits distributed allocation of computational requirements. An example of a system that uses this sort of distributed scheme is disclosed in U.S. Patent No. 5,769,527 to Taylor, described therein as a "master/slave" control system. Communication between the various levels may be unidirectional, wherein the playback device 31 provides instructions or subroutines to be executed by the subcontrollers, or bidirectional, wherein subcontrollers relay information back to the controller 30, for example, to provide information useful for effects which rely on the output of other effects as described above, for synchronization, or for other purposes.

As discussed above, the playback device 31 architecture permits effects to be based on external environmental conditions or other input. An effect is a predetermined output involving one or more lighting units. For example, fixed color, color wash, and rainbow wash are all types of effects. An effect may be further defined by one or more parameters, which specify, for example, the lights to control, the colors to use, the speed of the effect, or other aspects of an effect. The environment refers to any external information that may be used as an input to modify or control an effect or the playback of one or more lighting sequences, such as the current time or external inputs such as switches, buttons, or other transducers capable of generating control signals, or events generated by other software or effects. Finally, an effect may contain one or more states, so that the effect can retain information over the course of time. A combination of the state, the environment, and the parameters may be used to fully define the output of an effect at any moment in time, and over the passage of time.
In addition, the playback device 31 may implement effect priorities. For example, different effects may be assigned to the same lights. By utilizing a priority scheme, differing weights can be assigned to effects assigned to the same lights. For example, in one embodiment only the highest priority effect will determine the light output. When multiple effects control a light at the same priority, the final output may be an average or other combination of the effect outputs.
An alternate embodiment of the present invention is directed to a playback device 1000, as shown in Fig. 7, that differs from the playback device 31 described above in that it does not include a loader 610 for loading lighting programs into the storage device 620. In accordance with this illustrative embodiment of the present invention, the playback device 1000 is not loadable with customized lighting programs by the user, but rather can be provided with a storage device 620 having one or more pre-installed lighting programs already loaded thereon, such that the lighting programs stored in the playback device 1000 are not modifiable by the user.
In the embodiment shown in Fig. 7, the playback device 1000 does not include a cue table 630, timing modules 665 or 660, or a transient memory 640. However, it should be appreciated that any or all of these features can alternatively be provided, in much the same manner as described above in connection with the playback device 31 of Fig. 6.
In one embodiment of the playback device 1000, the storage device 620 stores multiple lighting programs, in much the same manner as discussed above in connection with some embodiments of the playback device 31 in Fig. 6. In accordance with this embodiment, a first external interface 1002 is provided to receive an externally generated signal to select which lighting program stored within the storage device 620 is to be played back by the playback device 1000. The first external interface 1002 is compatible with any of numerous types of user interfaces to enable selection of a particular lighting program to be played back. For example, in accordance with one illustrative embodiment of the present invention, a push button, toggle switch or other type of device can be used that, when activated by the user, causes the processor 651 to select a next lighting program for playback, so that by repeatedly toggling the input device, a user can step through all of the lighting programs stored in the storage device 620 to select a desired program for execution.
In the embodiment shown in Fig. 7, the playback device 1000 further includes a second external interface 1004 that is compatible with another user interface to enable the user to vary a parameter of a lighting program being played back by the playback device 1000. The parameter being varied can apply to all of the lighting effects in a lighting program (e.g., can influence the playback speed or intensity of an entire lighting program being played back) or can relate to only a subset (including only a single effect) of the lighting effects. Any of numerous types of lighting effect or parameter changes can be accomplished, as described above in connection with other embodiments of the present invention. Similarly, the user interface compatible with the second external interface 1004 can take any of numerous forms, as this embodiment of the present invention is not limited to the use of any particular type of interface. For example, in one embodiment of the present invention the user interface may be capable of generating a plurality of different signals, which can be used to vary a parameter of the lighting program being played back, such as the playback speed, intensity of illumination, color of a particular portion of a lighting program (including adjustments in hue, saturation and/or intensity) or any other parameter. For example, the second external interface may provide a variable digital signal to the processor 651 depending on the setting or position of the user interface. Alternatively, the user interface may supply an analog signal to the second external interface 1004, which can then convert the analog signal to a digital signal for communication to the processor 651. While the embodiment of the present invention shown in Fig. 7 includes separate first and second external interfaces to perform the functions of selecting a particular lighting program to be played back and varying a lighting effect or parameter thereof, it should be appreciated that the present invention is not limited in this respect, and that other arrangements are possible, such as employing a single user interface to perform both of these functions.
As indicated above, in an alternate embodiment of the present invention, a cue table 630 can be provided to interpret the information received from the first and second external interfaces 1002, 1004, rather than providing their outputs directly to the processor 651.
A lighting sequence as described above may be implemented using one or more subroutines, such as a Java program fragment. Such subroutines may be compiled in an intermediate format, such as by using an available Java compiler to compile the program as byte codes. In such a byte code format, the fragment may be called a sequence. A sequence may be interpreted or executed by the playback device 31. The sequence is not a stand-alone program, and adheres to a defined format, such as an instantiation of an object from a class, that the playback device 31 may use to generate effects. When downloaded into the playback device 31 (via serial port, infrared port, smart card, or some other interface), the playback device 31 interprets the sequence, executing portions based on time or input stimuli.
In one embodiment, a building block for producing a show is an effect object. The effect object includes instructions for producing one specific effect, such as color wash, cross fade, or fixed color, based on initial parameters (such as which lights to control, start color, wash period, etc.) and inputs (such as time, environmental conditions, or results from other effect objects). The sequence contains all of the information to generate every effect object for the show. The playback device 31 instantiates all of the effect objects one time when the show is started, then periodically sequentially activates each one. Based on the state of the entire system, each effect object can programmatically decide if and how to change the lights it is controlling. The run-time environment software running on the playback device 31 may be referred to as a conductor. The conductor may be responsible for downloading sequences, building and maintaining a list of effect object instances, managing the interface to external inputs and outputs (including DMX), managing the time clock, and periodically invoking each effect object. The conductor also maintains a memory (e.g., transient memory 640) that objects can use to communicate with each other.
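The following Java sketch illustrates the effect-object/conductor pattern just described: effect objects are instantiated once when a show starts and are then periodically invoked by a run-time conductor, each deciding how to drive the channels it controls. All class, interface, and method names here are assumptions made for illustration, not the actual classes used by the playback device 31.

```java
// Hypothetical sketch of the effect-object/conductor pattern (names are illustrative).
import java.util.ArrayList;
import java.util.List;

interface Effect {
    // Invoked once per frame; the effect decides whether and how to change its lights.
    void update(long timeMillis, Conductor conductor);
}

class FixedColorEffect implements Effect {
    private final int channel;
    private final int value;

    FixedColorEffect(int channel, int value) {
        this.channel = channel;
        this.value = value;
    }

    @Override
    public void update(long timeMillis, Conductor conductor) {
        conductor.setChannel(channel, value);
    }
}

class Conductor {
    private final List<Effect> effects = new ArrayList<>();
    private final int[] frame;

    Conductor(int universeSize) {
        this.frame = new int[universeSize];
    }

    void addEffect(Effect e) {
        effects.add(e);   // effect objects are instantiated once, when the show starts
    }

    void setChannel(int channel, int value) {
        // Priority handling is omitted here; a fuller sketch appears with the
        // channel-priority discussion below.
        frame[channel] = value;
    }

    // One pass of the run loop: invoke every effect object, then emit the frame.
    void tick(long timeMillis) {
        for (Effect e : effects) {
            e.update(timeMillis, this);
        }
        sendFrame();
    }

    private void sendFrame() {
        // In a real device the frame would be written to the DMX/network output port.
        System.out.println(java.util.Arrays.toString(frame));
    }
}
```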
A channel may be a single data byte at a particular location in the DMX universe. A frame may be all of the channels in the universe. The number of channels in the universe is specified when the class is instantiated.
When an effect object sets the data for a particular channel it may also assign that data a priority. The priorities can be interpreted in any of numerous ways. For example, if the priority is greater than the priority of the last data set for that channel, then the new data may supersede the old data; if the priority is lesser, then the old value may be retained; and if the priorities are equal, then the new data value may be added to a running total and a counter for that channel may be incremented. When the frame is sent, the sum of the data values for each channel may be divided by the channel counter to produce an average value for the highest priority data. Of course, other ways of responding to established priorities are possible.
After each frame has been sent, the channel priorities may all be reset to zero. The to-be-sent data may be retained (so that if no new data is written for a given channel it will maintain its last value) and may also be copied to a buffer in case any effect objects are interested. The conductor is the run-time component of the playback device 31 that unites the various data and input elements. The conductor may download sequences, manage the user interface, manage the time clock and other external inputs, and sequence through the active effect objects.
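As a concrete illustration of the scheme described in the two preceding paragraphs, the Java sketch below shows one way the per-channel priority, running-average, and end-of-frame reset rules could be implemented. The class name, field layout, and the assumption of a DMX-style universe of byte-sized channels are illustrative choices, not the actual implementation of the playback device.

```java
// Hypothetical sketch of the per-channel priority and averaging rules (names are illustrative).
class PrioritizedFrame {
    private final int[] sum;       // running total of values written at the current highest priority
    private final int[] count;     // number of equal-priority writes contributing to the total
    private final int[] priority;  // highest priority seen for each channel this frame
    private final int[] last;      // last value sent; retained when no new data is written

    PrioritizedFrame(int channels) {
        sum = new int[channels];
        count = new int[channels];
        priority = new int[channels];
        last = new int[channels];
    }

    // Called by an effect object to set channel data with a given priority.
    void set(int channel, int value, int pri) {
        if (pri > priority[channel]) {           // higher priority supersedes the old data
            priority[channel] = pri;
            sum[channel] = value;
            count[channel] = 1;
        } else if (pri == priority[channel]) {   // equal priority joins a running average
            sum[channel] += value;
            count[channel]++;
        }
        // lower-priority data is ignored
    }

    // Called when the frame is sent: average equal-priority data, then reset priorities to zero.
    int[] flush() {
        for (int c = 0; c < sum.length; c++) {
            if (count[c] > 0) {
                last[c] = sum[c] / count[c];
            }
            priority[c] = 0;
            count[c] = 0;
            sum[c] = 0;
        }
        return last.clone();
    }
}
```

In this sketch, retaining the last sent value when no new data arrives mirrors the behavior described above, while lower-priority writes are simply discarded.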
The technique for downloading the sequence file into the conductor can vary depending on the hardware and transport mechanism. In one embodiment, the sequence object and the various required classes may be loaded into memory, along with a reference to the sequence object. In one embodiment, more than one sequence object may be loaded into the conductor, and only one sequence may be active. The conductor can activate a sequence based on external inputs, such as the user interface or the time of day.
The above-discussed embodiments of the playback device 31 can be implemented in any of numerous ways. Thus, while a single processor 651 is shown in the embodiment of Figure 6 to perform each of the functions described above, it should be appreciated that the present invention is not limited in this respect, and that the various functions described above as being performed by the processor 651 can be distributed among two or more processors or controllers, such that in one embodiment there is a dedicated controller to carry out each of the functions of the processor 651 described above.
Controlling Lighting Systems in Response to an Audio Input
As mentioned above, one embodiment of the present invention is directed to a method and apparatus for controlling a lighting system in response to an audio input. Figure 8 illustrates a computer system 2009 for implementing this embodiment of the present invention. However, it should be appreciated that this embodiment of the present invention is not limited to the implementation shown in Figure 8, as numerous other implementations are possible. The audio input can be provided in any of numerous ways. In the embodiment shown in Figure 8, the audio input is provided as audio data 2005 provided on a computer-readable medium 2007 accessible to the computer system 2009. The computer-readable medium 2007 can take any of numerous forms, as the present invention is not limited to the use of any particular computer-readable medium. Examples of suitable computer-readable media include compact discs, floppy discs, hard discs, magnetic tapes, and volatile and non-volatile memory devices.
The audio data 2005 may be stored in any format suitable for the storage of digital data. One popular format is the MPEG Layer III data compression algorithm, which is often used for transmitting files over the Internet, and is widely known as MP3. Files stored in the MP3 format are typically processed by an MP3 decoder for playback. It should be appreciated that MP3 is merely one of numerous types of formats suitable for the storage of digital data, with other examples including MIDI, MOD, CDA, WMA, ASF and WAV. It should be appreciated that these are merely examples of suitable formats, and that there are other standards and formats that can be used, including formats that do not adhere to any particular standard. In addition, while the MP3 format compresses the data, it should be appreciated that other formats may not. It should further be appreciated that the present invention is not limited to use with data stored in any particular format.
Rather than originating from a computer readable medium accessible to the computer system 2009, the audio input can alternatively be provided as an external audio signal 2003 from a source such as a microphone, stereo system, musical instrument or any other source capable of generating an audio signal. The audio signal 2003 may be a digital signal, input to the computer system 2009 via a digital interface such as a USB, serial or parallel port or any other suitable interface, or may be an analog signal, input to the computer system 2009 via an audio jack or any other suitable interface. In accordance with one embodiment of the present invention, when the audio signal 2003 is provided in analog form, it can be converted (via an analog-to-digital converter not shown) within the computer system 2009, so that the audio signal can be processed digitally, which provides a number of advantages as discussed below. However, it should be appreciated that not all aspects of the present invention are limited in this respect, such that other embodiments of the present invention can process the audio signal in analog form.

In the embodiment shown in Figure 8, the computer 2009 includes an audio decoder 2011 that accepts as an input either audio data 2005 which is stored on a computer readable medium 2007 coupled to the computer 2009, or an external audio signal 2003. The audio decoder 2011 generates as an output information reflective of one or more characteristics of the audio signal that is input to the audio decoder (i.e., either the audio signal defined by the audio data 2005 or the external audio signal 2003). The information characteristic of the audio input signal can take any of numerous forms, as the present invention is not limited to any particular technique for analyzing an audio signal. In accordance with one embodiment of the present invention, digital signal processing techniques are used to analyze the audio signal. It should be appreciated that there are many different types of computations that can be performed using digital signal processing techniques, and the present invention is not limited to any particular technique for analyzing the audio signal. Examples of information characteristic of an audio signal include information relating to a frequency content and an intensity of the audio signal. For example, the audio decoder 2011 may generate time domain information for the audio input signal, representing the intensity of the audio signal over time. The time domain information may be outputted as an array, wherein each array element is an integer representing the intensity of the audio signal for a given point in time, or in any other suitable format. The audio decoder 2011 may further generate frequency domain information by performing a frequency transformation (e.g., a Fourier transform, such as a fast Fourier transform (FFT)) of the time domain information for the audio signal. In one embodiment, a fast Fourier transform is performed, but the present invention is not limited in this respect and can employ any suitable technique for analysis in the frequency domain.
The frequency domain information may be outputted as an array, wherein each array element can be an integer representing the amplitude of the signal for a given frequency band during a corresponding time frame. In accordance with one embodiment of the present invention, the frequency domain information is the FFT of the corresponding time domain information for a particular time frame. Again, it should be appreciated that the audio decoder 2011 is not limited to generating information characteristic of an audio signal in this manner, as other techniques for analyzing an audio signal and formats for presenting information relating thereto are possible.
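A minimal Java sketch of the kind of per-frame frequency analysis described above is shown below. It assumes 16-bit PCM samples and uses a naive discrete Fourier transform for brevity (a real decoder would use an optimized FFT); the frame size, the number of bands, and the class and method names are assumptions for illustration only.

```java
// Hypothetical per-frame band analysis (naive DFT used only for illustration).
class FrameAnalyzer {

    // Returns one averaged amplitude per frequency band for a single frame of samples.
    static double[] bandAmplitudes(short[] frame, int numBands) {
        int n = frame.length;
        double[] magnitude = new double[n / 2];

        // Naive DFT, O(n^2); adequate for a sketch, far too slow for real-time use.
        for (int k = 0; k < n / 2; k++) {
            double re = 0.0;
            double im = 0.0;
            for (int t = 0; t < n; t++) {
                double angle = 2.0 * Math.PI * k * t / n;
                re += frame[t] * Math.cos(angle);
                im -= frame[t] * Math.sin(angle);
            }
            magnitude[k] = Math.sqrt(re * re + im * im) / n;
        }

        // Group bins into equal-width bands, mirroring the per-band array format described above.
        double[] bands = new double[numBands];
        int binsPerBand = Math.max(1, magnitude.length / numBands);
        for (int b = 0; b < numBands; b++) {
            double total = 0.0;
            for (int i = b * binsPerBand; i < (b + 1) * binsPerBand && i < magnitude.length; i++) {
                total += magnitude[i];
            }
            bands[b] = total / binsPerBand;
        }
        return bands;
    }
}
```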
It should be appreciated that many audio signal formats comprise two or more independently encoded channels, and that many audio file formats maintain the independence of the channel data. Examples of such multi-channel audio signals include stereo signals, AC-1 (Audio Coding-1), AC-2 and AC-3 (Dolby Digital). In accordance with one embodiment of the present invention, each channel for a single audio signal is analyzed separately by the audio decoder 2011, such that separate information is generated by analyzing the characteristics of the different channels. For example, using the example described above, wherein the information concerning an audio signal includes frequency domain information and time domain information, in one embodiment of the present invention the audio decoder 2011 generates separate frequency domain information and time domain information for each separate channel for a single input audio signal (e.g., audio data 2005 or external audio signal 2003).

The audio decoder 2011 can be implemented in any of numerous ways, as the present invention is not limited to any particular implementation technique. For example, the audio decoder 2011 can be implemented in dedicated hardware, or can be implemented in software executed on a processor (not shown) within the computer system 2009. When implemented in software, the audio decoder 2011 can be provided as an executable program written in any suitable computer programming language (e.g., Fortran, C, Java, C++, etc.). The software for implementing the audio decoder 2011 can be stored on any computer readable medium accessible to the computer system 2009, including the computer readable medium 2007 that stores the audio data 2005, or any other computer readable media. The software for implementing the audio decoder 2011 can, for example, be any one of a number of commercially available software programs that perform the above-described functions. Examples of such commercially available software programs include MP3 players such as Winamp™, available from Nullsoft, Inc. Such commercially available MP3 players include application programming interfaces (APIs) that enable third party add-on plug-in software components to interface with the MP3 player, and to take advantage of the functionality provided thereby, including the above-described information that the audio decoder 2011 provides concerning the characteristics of an audio input. Thus, as discussed further below, one embodiment of the present invention is directed to software, for execution on a computer system 2009, that acts as a plug-in to a commercially available MP3 player to provide the mapping functions described below to control a lighting network in response to an input audio signal (e.g., stored audio data 2005 or an external audio signal 2003).

The mapper 2015 performs a function that is similar in many respects to the playback function performed by the processor 651 and the storage device 620 (see, e.g., Figures 6-7) in the embodiments discussed above. In this respect, the mapper 2015 can be provided with a lighting program (e.g., stored in a mapping table 2015t) that can include one or more variables to receive input values at execution time. As shown in Figure 8, the mapper 2015 can receive the output of the audio decoder 2011, so that information concerning the characteristics of the input audio signal can be provided to the mapper 2015 to provide the input values for variables in the lighting program executed by the mapper 2015.
In accordance with one illustrative embodiment of the present invention, the mapper 2015 can execute lighting programs that each include only a single entry defining the manner in which control signals, to be passed to the lighting network, will be generated. Each such lighting program for the mapper 2015 may be programmed using a number of if/then statements or Boolean logic to interpret the numerous varied permutations of inputs from the audio decoder 2011 relating to characteristics of the audio input signal, and may generate control signals to the lighting network accordingly. Even with such static lighting programs, the control signals transmitted to the lighting network will result in a changing light show as the input audio signal is played, as the characteristics of the audio signal will change over time, resulting in changing inputs to the mapper 2015 and, consequently, changing control signals sent to the lighting network. Alternatively, the mapping table 2015t can include lighting programs that include a plurality of lighting sequences, in much the same manner as the embodiments described above (e.g., in connection with Figures 6-7). In accordance with these embodiments of the present invention, the mapper 2015 will step through various lighting sequences as the input audio signal is played back, which can result in a more varied light show, as not only will the inputs from the audio decoder 2011 change as the input audio signal is played back, but the mapping function executed by the mapper 2015 can also be programmed to change over time.
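As an illustration of the kind of static, single-entry mapping described above, the following Java sketch turns per-band amplitudes and a beat estimate into intensity values for three lighting channels. The thresholds, scaling factors, band layout, and names are all hypothetical; a real mapping table entry could use any rules the author of the lighting program chooses.

```java
// Hypothetical static mapping from audio characteristics to lighting control values.
class SimpleAudioMapper {

    // bands[0] is assumed to be the lowest frequency band; beatStrength is assumed to be 0..1.
    static int[] map(double[] bands, double beatStrength) {
        int red = 0;
        int green = 0;
        int blue = 0;

        if (bands.length > 0 && bands[0] > 1000.0) {
            red = clamp((int) (bands[0] / 16.0));      // strong bass drives the red channel
        }
        if (bands.length > 2 && bands[2] > 500.0) {
            green = clamp((int) (bands[2] / 8.0));     // mid-range activity drives green
        }
        if (beatStrength > 0.8) {
            blue = 255;                                // a detected beat flashes blue to full
        }
        return new int[] { red, green, blue };         // e.g., three channels of one fixture
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }
}
```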
It should be appreciated that the embodiment of the present invention shown in Figure 8 can be programmed (i.e., in the mapping table 2015t) with lighting programs that can achieve any of the lighting effects discussed above, including those described in connection with the systems in Figures 1-7.
In the embodiment shown in Figure 8, the computer system 2009 includes a timer 2021 that provides an input to the mapper 2015. The timer can be used in a manner similar to the timing modules 660, 665 discussed above in connection with the embodiment of Figure 6, but is an optional feature that need not be employed in all embodiments of the present invention. In accordance with one embodiment of the present invention, the timer 2021 is used to provide variation over time in the mapping function executed by the mapper 2015, to achieve resulting variation in the control signals sent to the lighting network during the playback of one or more audio input signals and thereby avoid redundancy in the lighting show produced in response to the audio signals. This changing of the mapping function can be accomplished in any of numerous ways. For example, for a particular entry in the mapping table 2015t, a variable can be provided that receives an input value from the timer 2021, such that the timer information can be taken into account in the mapping logic. Alternatively, the mapper 2015 can use inputs received from the timer 2021 to index into the mapping table 2015t to select a different lighting program, or a different line within a particular lighting program, to change the mapping function. As with the embodiment of the present invention discussed above in connection with Figures 6-7, the timer 2021 can include date and time information, such that the mapping function can change as a result of the date and/or time, or can include local time information so that the mapping function can be changed as a result of the amount of time that a particular lighting show has been executed in response to audio signal inputs.

In the embodiment of Figure 8, an external interface 2045 is provided to receive additional user inputs that can be input to the mapper 2015 to impact the control signals sent to the lighting network. It should be appreciated that this is an optional feature, and need not be provided in every embodiment of the present invention. The external interface 2045 can be of any of numerous types, including all of those discussed above in connection with the embodiments of Figures 1-7, and can control the lighting show produced by the mapper 2015 in any of the numerous ways discussed above. For example, one or more additional external inputs can provide an additional variable to the mapping function performed by the mapper 2015 to impact the control signals sent to the lighting network. In addition, the external input received by the external interface 2045 can also be used to change between lighting programs provided by the mapping table 2015t, change the sequence of commands executed thereby (e.g., by branching to an out-of-line location), or achieve any of the other results described in connection with the embodiments discussed above.

In accordance with one illustrative embodiment of the present invention, the external interface 2045 is a graphical user interface (GUI) that can be displayed on a display of the computer system 2009 to facilitate a user in selecting a particular mapping function to be provided by the mapping table 2015t. This aspect of the present invention can be implemented in any of numerous ways, and is not limited to any particular implementation technique. As an example, a graphical user interface can be provided that lists various types of mapping functions that are considered to be particularly suitable for particular music types. Thus, prior to playing a particular song as the audio input signal, a user can select a mapping function (e.g., from the mapping table 2015t) that fits the style of music of the song to be played. In this manner, the user can customize the lighting show generated based upon the type of music to be played. Of course, it should be appreciated that this is simply one example of the manner in which a graphical user interface can be used, as numerous other implementations are possible. In another embodiment of the present invention, the particular mapping function employed can be selected based upon information provided with the audio signal that provides an indication of the type of music included therein. Specifically, some pieces of music can include a tag or other information in the music, or associated therewith, that identifies the type of music. In accordance with one embodiment of the present invention, such information can be used to select a mapping function that fits the style of music in much the same manner as described above.
As should be appreciated from the foregoing, changes in the mapping performed by the mapper 2015 can be accomplished in numerous ways, e.g., by including a variable in a single mapping function that can result in changes of the mapping output, or by switching between different mapping functions in the mapping table 2015t. The changes in the mapping performed by the mapper 2015 can be accomplished in response to any of numerous stimuli, including input provided from an external input (e.g., from a user selecting a different mapping function), in response to timing information from the timer 2021, in response to some characteristic of an input audio signal (e.g., provided to the mapper 2015 by the audio decoder 2011), in response to a detection by the audio decoder that a particular audio signal (e.g., a song) has terminated and a new one is beginning, etc. Thus, there are numerous ways of continually updating the mapping performed by the mapper 2015. Of course, it should be appreciated that the present invention is not limited to using any or all of these techniques, as these are described herein merely for illustrative purposes.
In the embodiment shown in Figure 8, the computer system 2009 does not include a cue table 630 or a transient memory 640 as described in connection with the embodiment of Figure 6. However, it should be appreciated that either or both of these features can alternatively be provided, in much the same manner as described above in connection with the playback device 31 of Figure 6. In this respect, the cue table 630 can be provided between the external interface 2045 and the mapper 2015, and/or between the audio decoder 2011 and the mapper 2015, to assist in analyzing the inputs provided by the external interface 2045 and/or the characteristics of the input audio signal provided by the audio decoder 2011. Of course, it should be appreciated that these features are optional, and need not be employed in all embodiments of the present invention.

As mentioned above, it should be appreciated that the analysis of the characteristics of the input audio signal by the mapper 2015, to impact the control signals sent to the lighting network and thereby control the lighting show, can be performed in any of numerous ways, as the present invention is not limited to any particular type of analysis. For example, the mapper 2015 can look for particular activity levels within a particular frequency band, can detect a beat of the music based upon pulses within particular frequency bands or overall activity of the input signal, can look for an interaction between two or more different frequency bands, can analyze intensity levels characteristic of a volume at which the audio signal is being played, etc. One variable for consideration by the mapper 2015 is the sensitivity with which differences in a characteristic of the audio signal will be recognized, resulting in a change in the control signals sent to the lighting network, and thereby a change in the lighting show. As indicated above, in one embodiment of the present invention, the external interface 2045 can also enable external inputs (e.g., inputs from a user) to change any of numerous variables within the mapping function to impact the lighting show produced.

It should be appreciated that the mapper 2015 can be implemented in any of numerous ways, including with dedicated hardware, or with software executed on a processor (not shown) within the computer system 2009. When implemented in software, the software can be stored on any computer readable medium accessible to the computer system 2009, including a computer readable medium 2007 that stores the audio data 2005. The software that implements the mapper 2015 can be implemented as an executable program written in any number of computer programming languages, such as those discussed above. The software can be implemented on a same processor that also executes software to implement the audio decoder 2011, or the computer system 2009 can be provided with separate processors to perform these functions. As discussed above, one embodiment of the present invention is directed to the provision of a software plug-in that is compatible with commercially available MP3 players to enable the control of a lighting network in response to an audio signal being played by the MP3 player.
Thus, one embodiment of the present invention is directed to a computer readable medium encoded with a program that, when executed by a processor on a computer system such as 2009, interacts with an audio decoder 2011 of an MP3 player executing on the computer system 2009, and implements the functions of the mapper 2015 to generate the control signals necessary to control a lighting network as described above. Of course, it should be understood that this is simply one illustrative embodiment of the present invention, as numerous other implementations are possible.
As with the other embodiments of the invention described above, the lighting units 40 (Figure 1) of the lighting network may be any type of light source, including incandescent, LED, fluorescent, halogen, laser, etc. Each lighting unit may be associated with a predetermined assigned address as discussed above. The computer system 2009 may send control signals to the lighting network in any of numerous ways, as the present invention is not limited to any particular technique. In the embodiment shown in Figure 8, the computer system 2009 includes an output buffer 2019 and a network output port 2020 to facilitate transmission of control signals from the mapper 2015 to the lighting network. The network output port 2020 can be any of numerous types of interfaces capable of communicating with the lighting network, including the numerous types of interfaces discussed above in connection with the output ports 680 described in connection with Figures 6-7. In the embodiments shown, the information outputted by the mapper 2015 is passed through an output buffer 2019 that is then coupled to the network output 2020. However, it should be appreciated that the present invention is not limited in this respect, as no output buffer need be used.
It should be appreciated that the information stored in the mapping table 2015t and output from the mapper 2015 may not be in a format capable of directly controlling a lighting network, such that in one embodiment of the present invention a format conversion is performed. As discussed above, examples of formats for controlling a plurality of lighting units include data streams and data formats such as DMX, RS-485, RS-232, etc. Any format conversion can be performed by the mapper 2015, or a separate converter can be employed. The converter can be implemented in any of numerous ways, including in dedicated hardware or in software executing on a processor within the computer system 2009; one possible form of such a conversion is sketched following this passage.

In the embodiment of the invention shown in Figure 8, the computer system 2009 not only generates control signals to control a lighting network, but also drives one or more speakers to generate an audible sound from the audio input signal, with the audible sound being synchronized to the light show produced by the lighting network. For example, the computer system 2009 includes an audio player 2022 that reads audio data 2005 stored on the computer readable medium 2007, performs any processing necessary depending upon the format in which the audio data 2005 is stored (e.g., decompresses the data if stored in a compressed format) and passes the information to a speaker driver 2024, which can then drive one or more speakers to produce an audible sound. It should be appreciated that the one or more speakers described above may include any device for generating audible output including, for example, headphones and loudspeakers. The speaker driver 2024 can be implemented in any of numerous ways, as the present invention is not limited to any particular implementation technique. For example, the speaker driver 2024 can be implemented on a sound card provided within the computer system 2009. The audio player 2022 also can be implemented in any of numerous ways. For example, commercially available MP3 players include software that, when executed on a processor within the computer system 2009, performs the functions of the audio player 2022.
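Returning to the format conversion mentioned above, the following Java sketch illustrates one possible form such a conversion could take: packing mapper output values into a DMX512-style frame consisting of a start code followed by up to 512 channel bytes. The class and method names are assumptions; an actual converter would depend on the interface hardware in use.

```java
// Hypothetical conversion of mapper output into a DMX512-style frame.
class DmxFrameBuilder {

    static byte[] build(int[] channelValues) {
        if (channelValues.length > 512) {
            throw new IllegalArgumentException("a DMX universe carries at most 512 channels");
        }
        byte[] packet = new byte[1 + channelValues.length];
        packet[0] = 0x00;  // null start code used for dimmer/channel data
        for (int i = 0; i < channelValues.length; i++) {
            int v = Math.max(0, Math.min(255, channelValues[i]));  // clamp to one byte
            packet[1 + i] = (byte) v;
        }
        return packet;     // to be written to the network output port 2020
    }
}
```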
It should be appreciated that the external audio signal 2003 can be provided in either digital form or in analog form. When provided in analog form, the external audio signal may pass through an analog-to-digital converter (not shown) within the computer system 2009 prior to being passed to the audio decoder 2011. This conversion can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation. For example, the external audio signal can be provided to a sound card within the computer system 2009, which can perform the analog-to-digital conversion.
It should be appreciated that in the embodiment of the present invention wherein the same computer system 2009 that generates the control signals for the lighting network also drives speakers to generate an audible sound for the audio signal, some synchronization may be performed to ensure that the lighting show produced on the lighting network is synchronized with the audible playing of the audio signal. This can be accomplished within the computer system 2009 in any of numerous ways. For example, when the audio player 2022 and audio decoder 2011 are provided as part of a commercially available MP3 player, the MP3 player will automatically perform this synchronization.
As should be appreciated from the foregoing, in one embodiment of the present invention, the analyzing of an audio input signal is performed essentially simultaneously with a playing of the audio signal to generate an audible sound. However, the present invention is not limited in this respect, as in another embodiment of the present invention, the analysis of the audio input signal is performed prior to playing the audio signal to generate an audible sound. This can provide for some flexibility in performing the mapping of the audio input signal to control signals for the lighting network, as the mapping function can consider not only the characteristics of the audible signal that corresponds with the instant in time for the control signals being generated, but can also look ahead in the audio signal to anticipate changes that will occur, and thereby institute lighting effects in advance of a change in the audible playback of the audio signal. This can be performed in any of numerous ways. For example, the audio input signal can be analyzed prior to it being played to generate an audible output, and the results of that analysis (e.g., from the audio decoder 2011) can be stored in memory (e.g., in a transient memory such as 640 in Figure 6) or in the mapping table 2015t, for future reference by the mapper 2015 when the audio signal is audibly played. Thus, the function performed by the mapper 2015 can look not only to characteristics of the music that correspond to the point in time with the audio signal being played, but can also look ahead (or alternatively behind) in the audio signal to anticipate changes therein. Alternatively, rather than storing the outputs that are characteristic of the audio signal, another option is to perform the mapping at the time when the audio input signal is first analyzed, and store the entire control signal sequence in memory (e.g., in the mapping table 2015t). Thereafter, when the audio signal is audibly played, the mapper 2015 need not do any analysis in real time, but rather, can simply read out the previously defined control signals, which for example can be stored at a particular sample rate to then be played back when the audio signal is played to generate an audible signal. While the embodiment of the present invention directed to performing an analysis of the audio signal prior to playing it back provides the advantages described above, it should be appreciated that this is not a requirement of all embodiments of the present invention.
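As a sketch of the "analyze first, play back later" option just described, the following Java fragment precomputes a control frame for every audio frame and then simply reads frames out by elapsed time during playback. It reuses the hypothetical FrameAnalyzer and SimpleAudioMapper classes from the earlier sketches; the frame rate and storage structure are likewise assumptions made for illustration.

```java
// Hypothetical precomputation of an entire control-signal sequence prior to playback.
import java.util.ArrayList;
import java.util.List;

class PrecomputedShow {
    private final List<int[]> controlFrames = new ArrayList<>();
    private final int framesPerSecond;

    PrecomputedShow(int framesPerSecond) {
        this.framesPerSecond = framesPerSecond;
    }

    // Offline pass: analyze every audio frame and store the resulting control values.
    void precompute(List<short[]> audioFrames) {
        for (short[] audioFrame : audioFrames) {
            double[] bands = FrameAnalyzer.bandAmplitudes(audioFrame, 8);
            controlFrames.add(SimpleAudioMapper.map(bands, 0.0));
        }
    }

    // Playback pass: no real-time analysis, just index by elapsed time into the stored sequence.
    int[] frameAt(double secondsIntoSong) {
        int index = (int) (secondsIntoSong * framesPerSecond);
        index = Math.max(0, Math.min(index, controlFrames.size() - 1));
        return controlFrames.get(index);
    }
}
```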
It should be appreciated that the lighting programs (e.g., entries in the mapping table 2015t) for the embodiment shown in Figure 8 can be authored using an authoring system in much the same manner as described above in connection with the generation of lighting programs for the embodiments of Figures 1-7. Thus, for example, a graphical user interface can be provided to assist a user in generating the lighting programs. As with the embodiments of the invention described above, the authoring can be performed on the same computer system 2009 that is used to play back the lighting program and generate the control signals to the lighting network, or the lighting programs can be authored on a different system and then transferred, via a computer readable medium, to the mapping table 2015t in the computer system 2009.
In accordance with an alternate embodiment of the invention, Applicants have appreciated that the device used to control the lighting network 2001 need not have all of the functionality and capability of a computer system; for example, it need not include a video monitor, keyboard, or other robust user interface. Furthermore, Applicants have appreciated that in many instances it is desirable to provide a relatively small and inexpensive device to perform the lighting control function in response to an audio input, so that the device can be portable.

In view of the foregoing, one embodiment of the present invention is directed to a lighting control device that includes all of the functionality described above in connection with Figure 8, but is implemented on a computer system that is dedicated to performing the functions described above, and is not a general purpose computer. An illustration of this embodiment of the present invention is provided in Figure 9, which discloses a lighting control device 2027 for controlling lighting units 40 of a lighting network 2001 in response to audio input data or an input audio signal. The lighting control device performs all of the functions of the embodiment illustrated in Figure 8, but is not implemented on a general purpose computer. Rather, the lighting control device is a device dedicated to performing only those functions described above, and need not include much of the functionality found in a general purpose computer, such as a full size display, a full alphanumeric keyboard, an operating system that enables processing of multiple applications simultaneously, etc. The lighting control device can take any of numerous forms, as the present invention is not limited to any particular implementation.
An even further simplified embodiment of the present invention is illustrated in Figure 10, which illustrates a lighting control device 2030 that includes only a subset of the functionality provided in the embodiment of the invention shown in Figure 8. Specifically, the embodiment of the invention shown in Figure 10 does not include an audio player for generating an audio signal internally, and is not adapted to be coupled to a computer readable medium including audio data. Rather, the lighting control device 2030 is adapted to receive an external audio signal 2003 from any suitable source, and to then process the audio signal, in much the same manner as the embodiment of Figure 8, to generate control signals for a lighting network to produce a lighting show based on the external audio input. Thus, the lighting control device 2030 includes an audio decoder 2011 and a mapper 2015 (with its associated mapping table 2015t) that each perform the functions described above in terms of analyzing an external audio input signal and generating commands for a lighting network based thereon, and further includes a network output port 2020 compatible with the lighting network. The lighting control device 2030 may optionally include a timer 2021, an output buffer 2019 and/or a cue table (not shown) that can perform the same functions described above in connection with the embodiment of Figure 8.

In the embodiment shown in Figure 10, the lighting control device 2030 includes an external interface 2045 for receiving an external input 2046, which can take any of numerous forms as discussed above in connection with the embodiment of Figure 8. In accordance with one embodiment of the present invention, the external interface 2045 is adapted to be a simple interface that is relatively inexpensive and compact. The external interface can be used to perform any of numerous functions, such as to switch between lighting programs (e.g., entries in the mapping table 2015t), to vary lighting effects or parameters therefor, or to perform any of the other functions discussed above in connection with the embodiments of Figures 1-9. The external interface can take any of numerous forms, including switches, buttons, dials, sliders, a console, a keyboard, a speech recognition system or any other device, such as a sensor (e.g., responsive to light, motion or temperature), whereby a command or signal can be provided to the lighting control device 2030. An external device may be coupled to the external interface 2045 via any suitable technique, including a direct wire connection, or via RF or some other type of wireless connection.
It should be appreciated that the lighting control device 2030 may receive the external audio signal using any suitable interface, such as a serial port, USB port, parallel port, IR receiver, a standard stereo audio jack, or any other suitable interface. The components on the lighting control device 2030 can be powered in any of numerous ways, including through the provision of a power source (e.g., a battery) within the lighting control device 2030, or through the provision of an interface for receiving a power cord compatible with a standard electrical outlet. However, in accordance with one illustrative embodiment of the present invention, the lighting control device 2030 is provided with neither an onboard power source nor an interface for a standard electrical outlet. Thus, in accordance with one illustrative embodiment of the invention, the interface for connecting the lighting control device 2030 to a lighting network 2001 enables not only the transfer of data or other communication signals, but also sufficient electrical current to power the components within the lighting control device 2030. The need for a separate power interface may thereby be eliminated. The present invention is not limited to the use of any particular type of interface. One example of a suitable interface that provides both communication and power is a USB port.

The lighting control device 2030 may begin processing of the external audio signal 2003 and/or initiate the sending of control signals to the lighting network to initiate a lighting show either in response to a signal received at the external input 2046, or immediately upon receipt of the external audio signal 2003. Alternatively, the lighting control device 2030 may initiate a lighting show at a specified time, or upon any suitable condition. The lighting control device 2030 may continue to send control information to the lighting network until it no longer receives any external audio signal 2003, until a signal is received at the external input 2046, until the occurrence of a specified condition, until a particular point in time, or until any other suitable event. In one embodiment of the present invention, the lighting control device 2030 includes a storage device to store the mapping table 2015t. The storage device can be a memory unit, database, or other suitable module (e.g., a removable Flash memory) for storing one or more lighting programs in the mapping table 2015t. In accordance with one embodiment of the present invention, the storage device is formed as a non-volatile memory device, such that once information is stored thereon, the information is maintained even when no power is provided to the lighting control device 2030.
It should be appreciated that any single component or collection of multiple components of the above-described embodiments that perform the functions described above can be generically considered as one or more controllers that control the above-discussed functions. The one or more controllers can be implemented in numerous ways, such as with dedicated hardware, or using a processor that is programmed to perform the functions recited above. In this respect, it should be appreciated that one implementation of the present invention comprises at least one computer readable medium (e.g., a computer memory, a floppy disk, a compact disk, a tape, etc.) encoded with a computer program that, when executed on a processor, performs the above-discussed functions of the present invention. The computer readable medium can be transportable such that the program stored thereon can be loaded onto any device having a processor to implement the aspects of the present invention discussed above. In addition, it should be appreciated that the reference to a computer program that, when executed, performs the above-discussed functions is not limited to an application program, but rather is used herein in the generic sense to reference any type of computer code (e.g., software or microcode) that can be employed to program a processor to implement the above-discussed aspects of the present invention.
As used herein, the reference to an LED is intended to encompass any light emitting semiconductor device. In addition, any reference to a light or illumination unit generating a "color" refers to the generation of any frequency of radiation, including not only frequencies within the visible spectrum, but also frequencies in the infrared, ultraviolet and other areas of the electromagnetic spectrum.
Having described several embodiments of the invention in detail, various modifications and improvements will readily occur to those skilled in the art. Such modifications and improvements are intended to be within the spirit and scope of the invention. Accordingly, the foregoing description is by way of example only, and is not intended as limiting. The invention is limited only as defined by the following claims and equivalents thereto.
What is claimed is:

Claims

1. A method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of: (A) receiving an audio input in digital form;
(B) digitally processing the audio input to determine at least one characteristic of the audio input;
(C) executing the lighting program to generate control signals to control the plurality of LEDs; and (D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
2. The method of claim 1, wherein the act (A) includes an act of receiving the audio input in analog form and converting the audio input to digital form.
3. The method of claim 1, wherein the act (B) includes an act of performing a frequency transformation of the audio input to determine an activity level within at least one frequency band, and wherein the at least one characteristic of the audio input relates to the activity level within the at least one frequency band.
4. The method of claim 1, wherein the act (B) includes an act of determining a beat of the audio input, and wherein the at least one characteristic of the audio input relates to the beat.
5. The method of claim 1, wherein the act (B) includes an act of determining a volume of the audio input, and wherein the at least one characteristic of the audio input relates to the volume.
6. The method of claim 1, wherein the act (B) includes an act of determining an intensity of the audio input, and wherein the at least one characteristic of the audio input relates to the intensity.
7. The method of claim 1, wherein the act (A) includes an act of receiving the audio input as part of an audio/video signal.
8. The method of claim 1, wherein the act (C) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
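A minimal sketch of the pulse width modulation referred to in claim 8, assuming 8-bit intensity values per channel; mapping intensity to duty cycle is the standard idea, but the function name and resolution below are illustrative only.

```python
def pwm_duty_cycles(red, green, blue, resolution=255):
    """Map per-channel intensities to PWM duty cycles in [0.0, 1.0]; a larger
    duty cycle yields a higher perceived intensity for that LED channel."""
    def clamp(value):
        return min(max(int(value), 0), resolution)
    return tuple(clamp(c) / resolution for c in (red, green, blue))
```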
9. The method of claim 1, wherein the act (C) includes an act of executing a lighting program having at least one variable that has an input value, and wherein the act (D) includes an act of providing the at least one characteristic of the audio input as the input value of the at least one variable.
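As a sketch of claim 9, a lighting program may expose a variable whose input value is supplied by the audio characteristic; the hypothetical color_wash program below takes one such variable, assumed to be scaled to the range 0..1.

```python
import colorsys

def color_wash(audio_value):
    """A lighting program with one variable; the audio characteristic supplies
    its input value and shifts the hue of a slow color wash."""
    level = max(0.0, min(1.0, audio_value))
    hue = (0.6 + 0.4 * level) % 1.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)
```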
10. The method of claim 1, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the at least one characteristic of the audio input.
11. The method of claim 1, further including an act of, during execution of the lighting program in the act (C), assigning an effect to at least one of the plurality of
LEDs based at least in part on the at least one characteristic of the audio input.
12. The method of claim 1, further including an act of, during execution of the lighting program in the act (C), determining a parameter of at least one effect assigned to at least one of the plurality of LEDs based at least in part on the at least one characteristic of the audio input.
13. The method of claim 1, wherein the method further includes an act of providing a cue table that identifies various actions to be taken during execution of the lighting program in response to at least two inputs received at the cue table, and wherein the act (D) includes acts of: providing at least two characteristics of the audio input as inputs to the cue table; and during execution of the lighting program, generating at least one of the control signals in response to an output of the cue table.
14. The method of claim 1, wherein the lighting program performs a mapping from the at least one characteristic of the audio input to the at least one of the control signals, wherein the method further includes an act of providing a cue table that identifies various actions to be taken during execution of the lighting program in response to at least two inputs received at the cue table, and wherein the act (D) includes acts of: providing at least two characteristics of the audio input as inputs to the cue table; and during execution of the lighting program, changing the mapping performed by the lighting program in response to an output of the cue table.
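The cue table of claims 13 and 14 can be pictured, in a deliberately simplified sketch, as a lookup keyed on two audio characteristics; the particular keys and action names below are invented for the example.

```python
CUE_TABLE = {
    # (beat_present, loud) -> action selected for the lighting program
    (True,  True):  "strobe",
    (True,  False): "pulse_on_beat",
    (False, True):  "bright_wash",
    (False, False): "slow_fade",
}

def cue_lookup(beat_present, loud):
    """Return the action identified by the cue table for two characteristics."""
    return CUE_TABLE[(bool(beat_present), bool(loud))]
```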
15. The method of claim 1, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
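One way to read claim 15 is that the first characteristic passes through a mapping function while the second characteristic selects or reshapes that function; a small sketch, with an assumed gain parameter, follows.

```python
def make_mapping(gain):
    """Build a mapping from a characteristic value (0..1) to a control value (0..255)."""
    return lambda value: min(int(gain * max(0.0, min(1.0, value)) * 255), 255)

mapping = make_mapping(gain=0.5)   # mapping applied to the first characteristic
# ... later, when the second characteristic (e.g., overall volume) rises:
mapping = make_mapping(gain=1.0)   # the mapping function itself is changed
```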
16. The method of claim 15, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the second characteristic of the audio input.
17. The method of claim 1, wherein the act (B) includes an act of digitally processing the audio input to determine a plurality of characteristics of the audio input; and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), generating the control signals based at least in part on the plurality of characteristics of the audio input.
18. The method of claim 1, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; and wherein the method further includes an act of, during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on user input provided via the at least one user interface.
19. The method of claim 1, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
20. A computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of:
(A) receiving an audio input in digital form; (B) digitally processing the audio input to determine at least one characteristic of the audio input;
(C) executing the lighting program to generate control signals to control the plurality of LEDs; and
(D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
21. The computer readable medium of claim 20, wherein the act (A) includes an act of receiving the audio input in analog form and converting the audio input to digital form.
22. The computer readable medium of claim 20, wherein the act (B) includes an act of performing a frequency transformation on the audio input to determine an activity level within at least one frequency band, and wherein the at least one characteristic of the audio input relates to the activity level within the at least one frequency band.
23. The computer readable medium of claim 20, wherein the act (B) includes an act of determining a beat of the audio input, and wherein the at least one characteristic of the audio input relates to the beat.
24. The computer readable medium of claim 20, wherein the act (B) includes an act of determining a volume of the audio input, and wherein the at least one characteristic of the audio input relates to the volume.
25. The computer readable medium of claim 20, wherein the act (B) includes an act of determining an intensity of the audio input, and wherein the at least one characteristic of the audio input relates to the intensity.
26. The computer readable medium of claim 20, wherein the act (C) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
27. The computer readable medium of claim 20, wherein the act (C) includes an act of executing a lighting program having at least one variable that has an input value, and wherein the act (D) includes an act of providing the at least one characteristic of the audio input as the input value of the at least one variable.
28. The computer readable medium of claim 20, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the at least one characteristic of the audio input.
29. The computer readable medium of claim 20, wherein the method further includes an act of, during execution of the lighting program in the act (C), assigning an effect to at least one of the plurality of LEDs based at least in part on the at least one characteristic of the audio input.
30. The computer readable medium of claim 20, further including an act of, during execution of the lighting program in the act (C), determining a parameter of at least one effect assigned to at least one of the plurality of LEDs based at least in part on the at least one characteristic of the audio input.
30. The computer readable medium of claim 20, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
31. The computer readable medium of claim 30, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the second characteristic of the audio input.
32. The computer readable medium of claim 20, wherein the act (B) includes an act of digitally processing the audio input to determine a plurality of characteristics of the audio input; and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), generating the control signals based at least in part on the plurality of characteristics of the audio input.
33. The computer readable medium of claim 20, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; and wherein the method further includes an act of, during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on user input provided via the at least one user interface.
34. The computer readable medium of claim 20, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
35. An apparatus for executing a lighting program to control a plurality of light emitting diodes (LEDs), the apparatus comprising: at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to digitally process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs, wherein the at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input.
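Schematically, the apparatus of claim 35 couples an audio decoder to a controller that executes the stored lighting program; the class below is only an organizational sketch, with both collaborators assumed to be simple callables.

```python
class AudioLightingController:
    """Couples an audio decoder to a lighting program and emits control signals."""

    def __init__(self, lighting_program, audio_decoder):
        self.program = lighting_program   # stands in for the stored lighting program
        self.decode = audio_decoder       # stands in for the audio decoder

    def step(self, audio_frame):
        characteristic = self.decode(audio_frame)
        return self.program(characteristic)   # control signals for the LEDs
```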
36. The apparatus of claim 35, further including an analog-to-digital converter, coupled to the at least one input, to convert the audio input from analog form to digital form.
37. The apparatus of claim 35, wherein the audio decoder performs a frequency transformation on the audio input to determine an activity level within at least one frequency band, and wherein the at least one characteristic of the audio input relates to the activity level within the at least one frequency band.
38. The apparatus of claim 35, wherein the audio decoder determines a beat of the audio input, and wherein the at least one characteristic of the audio input relates to the beat.
39. The apparatus of claim 35, wherein the audio decoder determines a volume of the audio input, and wherein the at least one characteristic of the audio input relates to the volume.
40. The apparatus of claim 35, wherein the audio decoder determines an intensity of the audio input, and wherein the at least one characteristic of the audio input relates to the intensity.
41. The apparatus of claim 35, wherein the at least one controller transmits pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
42. The apparatus of claim 35, wherein the lighting program has at least one variable that has an input value, and wherein the at least one controller provides the at least one characteristic of the audio input as the input value of the at least one variable.
43. The apparatus of claim 35, wherein the lighting program is a first lighting program, wherein the at least one storage medium further stores a second lighting program, and wherein the at least one controller, during execution of the first lighting program, switches to execution of the second lighting program in response to the at least one characteristic of the audio input.
44. The apparatus of claim 35, wherein the at least one controller, during execution of the lighting program, assigns an effect to at least one of the plurality of LEDs based at least in part on the at least one characteristic of the audio input.
45. The apparatus of claim 35, wherein the at least one controller, during execution of the lighting program, determines a parameter of at least one effect assigned to at least one of the plurality of LEDs based at least in part on the at least one characteristic of the audio input.
46. The apparatus of claim 35, further including a cue table that identifies various actions to be taken during execution of the lighting program in response to at least two inputs received at the cue table, wherein the cue table is coupled to the audio decoder to receive information identifying at least two characteristics of the audio input, and wherein the at least one controller generates at least one of the control signals in response to an output of the cue table.
47. The apparatus of claim 35, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the at least one controller, during execution of the lighting program, changes the mapping function performed by the lighting program in response to the second characteristic of the audio input.
48. The apparatus of claim 47, wherein the lighting program is a first lighting program, wherein the at least one storage medium further stores a second lighting program, and wherein the at least one controller, during execution of the first lighting program, switches to execution of the second lighting program in response to the second characteristic of the audio input.
49. The apparatus of claim 35, further including at least one user interface, and wherein the at least one controller, during execution of the lighting program, generates at least one of the control signals based at least in part on user input provided via the at least one user interface.
50. The apparatus of claim 35, further including at least one user interface; and wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the at least one controller changes the mapping function performed by the lighting program in response to an input received from the user interface.
51. A computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of light emitting diodes (LEDs), wherein the processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input, the method comprising acts of:
(A) receiving information from the second program relating to the at least one characteristic of the audio input;
(B) executing the lighting program to generate control signals to control the plurality of LEDs; and (C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input received from the second program.
52. The computer readable medium of claim 51, wherein the act (B) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
53. The computer readable medium of claim 51, wherein the act (B) includes an act of executing a lighting program having at least one variable that has an input value, and wherein the act (C) includes an act of providing the at least one characteristic of the audio input as the input value of the at least one variable.
54. The computer readable medium of claim 51, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (C) includes an act of, during execution of the lighting program in the act (B), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
55. The computer readable medium of claim 51, wherein the act (B) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
56. The computer readable medium of claim 51, wherein the second program processes an audio input in MP3 format to determine at least one characteristic of the audio input, and wherein the first program is a plug-in compatible with an application programming interface provided by the second program.
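For claim 56, the first program can be imagined as a visualization-style plug-in that the MP3 player (the second program) calls with its analysis results; every name in the sketch below is hypothetical and does not correspond to any real player's plug-in API.

```python
class LightingPlugin:
    """Hypothetical plug-in: the host player is assumed to call on_spectrum()
    with per-band levels scaled to 0..1 (an assumption, not a real API)."""

    def __init__(self, send_control_signal):
        self.send = send_control_signal   # callable that drives the LEDs

    def on_spectrum(self, band_levels):
        bass = band_levels[0] if band_levels else 0.0
        self.send(red=int(255 * bass), green=0, blue=int(255 * (1.0 - bass)))
```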
57. A method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of:
(A) receiving an audio input and an input from at least one timer; (B) analyzing the audio input to determine at least one characteristic of the audio input;
(C) executing the lighting program to generate control signals to control the plurality of LEDs; and
(D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
58. The method of claim 57, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
59. The method of claim 57, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; and wherein the method further includes an act of, during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on user input provided via the at least one user interface.
60. The method of claim 57, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
61. The method of claim 57, wherein the act (C) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
62. The method of claim 57, wherein the act (C) includes an act of executing a lighting program having at least first and second variables that each has an input value, and wherein the act (D) includes an act of providing the at least one characteristic of the audio input as the input value of the first variable and the input from the at least one timer as the input value of the second variable.
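Claims 57 through 62 add a timer input alongside the audio characteristic; as a sketch, the lighting program simply receives both as the input values of its two variables. The names and the use of a monotonic clock below are assumptions.

```python
import time

PROGRAM_START = time.monotonic()

def lighting_step(program, audio_level):
    """Supply two input values: the audio characteristic (first variable)
    and a timer reading (second variable)."""
    elapsed = time.monotonic() - PROGRAM_START
    return program(audio_level, elapsed)
```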
63. The method of claim 57, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the input from the at least one timer.
64. A computer readable medium encoded with a program that, when executed, performs a method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of: (A) receiving an audio input and an input from at least one timer;
(B) analyzing the audio input to determine at least one characteristic of the audio input; (C) executing the lighting program to generate control signals to control the plurality of LEDs; and
(D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
65. The computer readable medium of claim 64, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (D) includes an act of, during execution of the lighting program in the act (C), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
66. The computer readable medium of claim 64, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; and wherein the method further includes an act of, during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on user input provided via the at least one user interface.
67. The computer readable medium of claim 64, wherein the act (C) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
68. The computer readable medium of claim 64, wherein the act (C) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
69. The computer readable medium of claim 64, wherein the act (C) includes an act of executing a lighting program having at least first and second variables that each has an input value, and wherein the act (D) includes an act of providing the at least one characteristic of the audio input as the input value of the first variable and the input from the at least one timer as the input value of the second variable.
70. The computer readable medium of claim 64, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (C), switching to execution of a second lighting program in response to the input from the at least one timer.
71. A computer readable medium encoded with a first program that, when executed on a processor, performs a method for executing a lighting program to control a plurality of light emitting diodes (LEDs), wherein the processor is programmed with a second program that processes an audio input to determine at least one characteristic of the audio input, the method comprising acts of:
(A) receiving information from the second program relating to the at least one characteristic of the audio input and an input from at least one timer;
(B) executing the lighting program to generate control signals to control the plurality of LEDs; and
(C) during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the at least one timer.
72. The computer readable medium of claim 71, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the act (C) includes an act of, during execution of the lighting program in the act (B), changing the mapping function performed by the lighting program in response to the second characteristic of the audio input.
73. The computer readable medium of claim 71, wherein the act (B) includes an act of executing the lighting program on a device coupled to at least one user interface; and wherein the method further includes an act of, during execution of the lighting program in the act (B), generating at least one of the control signals based at least in part on user input provided via the at least one user interface.
74. The computer readable medium of claim 71, wherein the act (B) includes an act of executing the lighting program on a device coupled to at least one user interface; wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the method further includes an act of changing the mapping function performed by the lighting program in response to an input received from the user interface.
75. The computer readable medium of claim 71, wherein the act (B) includes an act of transmitting pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
76. The computer readable medium of claim 71, wherein the act (B) includes an act of executing a lighting program having at least first and second variables that each has an input value, and wherein the act (C) includes an act of providing the at least one characteristic of the audio input as the input value of the first variable and the input from the at least one timer as the input value of the second variable.
77. The computer readable medium of claim 71, wherein the lighting program is a first lighting program, and wherein the method further includes an act of, during execution of the first lighting program in the act (B), switching to execution of a second lighting program in response to the input from the at least one timer.
78. An apparatus for executing a lighting program to control a plurality of light emitting diodes (LEDs), the apparatus comprising: at least one storage medium to store the lighting program; at least one input to receive an audio input; an audio decoder to process the audio input to determine at least one characteristic of the audio input; and at least one controller, coupled to the audio decoder and the at least one storage medium, to execute the lighting program to generate control signals to control the plurality of LEDs, wherein the at least one controller generates at least one of the control signals based at least in part on the at least one characteristic of the audio input and an input from at least one timer.
79. The apparatus of claim 78, further including the at least one timer.
80. The apparatus of claim 78, wherein the at least one characteristic of the audio signal includes at least first and second characteristics, wherein the lighting program performs a mapping function from the first characteristic of the audio input to the at least one of the control signals, and wherein the at least one controller, during execution of the lighting program, changes the mapping function performed by the lighting program in response to the second characteristic of the audio input.
81. The apparatus of claim 78, further including at least one user interface, and wherein the at least one controller generates at least one of the control signals based at least in part on user input provided via the at least one user interface.
82. The apparatus of claim 78, further including at least one user interface; and wherein the lighting program performs a mapping function from the at least one characteristic of the audio input to the at least one of the control signals; and wherein the at least one controller changes the mapping function performed by the lighting program in response to an input received from the user interface.
83. The apparatus of claim 78, wherein the at least one controller transmits pulse width modulated signals to the plurality of LEDs to control a perceived intensity of each of the plurality of LEDs.
84. The apparatus of claim 78, wherein the lighting program has at least first and second variables that each has an input value, and wherein the at least one controller provides the at least one characteristic of the audio input as the input value of the first variable and the input from the at least one timer as the input value of the second variable.
85. The apparatus of claim 78, wherein the lighting program is a first lighting program, and wherein the at least one controller, during execution of the first lighting program, switches to execution of a second lighting program in response to the input from the at least one timer.
86. A method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of:
(A) receiving an audio input and an input from a graphical user interface;
(B) analyzing the audio input to determine at least one characteristic of the audio input;
(C) executing the lighting program to generate control signals to control the plurality of LEDs; and
(D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input and the input from the graphical user interface.
87. A method for execution on a computer, the method comprising acts of:
(A) processing, on the computer, information indicative of an audio signal to generate a speaker-compatible signal indicative of the audio signal; (B) determining at least one characteristic of the audio signal;
(C) executing, on the computer, a lighting program to generate control signals to control a plurality of light emitting diodes (LEDs);
(D) during execution of the lighting program in the act (C), generating at least one of the control signals based at least in part on the at least one characteristic of the audio input; and
(E) transmitting the speaker-compatible signal to a speaker to generate audible sound indicative of the audio signal.
88. The method of claim 87, wherein the act (A) includes an act of processing information, received from another device, indicative of an audio signal to generate the speaker-compatible signal.
89. The method of claim 87, wherein the act (A) includes an act of reading digital information, stored on a computer readable medium coupled to the computer, indicative of the audio signal to generate the speaker-compatible signal.
90. A method for authoring a lighting program to control a plurality of light emitting diodes (LEDs) in response to at least one characteristic of an audio input, the method comprising acts of:
(A) providing a graphical user interface (GUI) that displays information representative of the plurality of LEDs, a plurality of lighting effects to be assigned thereto, and the at least one characteristic of the audio input;
(B) selecting, based on at least one user input provided via the GUI, at least one of the plurality of lighting effects to correspond to at least one of the plurality of LEDs in response to the at least one characteristic of the audio input; and
(C) creating a lighting program, based on the at least one user input, for generating control information for the plurality of LEDs.
91. A method for executing a lighting program to control a plurality of light emitting diodes (LEDs), the method comprising acts of: (A) receiving an audio input; (B) analyzing the audio input to determine at least one characteristic of the audio input;
(C) storing information related to the at least one characteristic of the audio input;
(D) executing the lighting program, after completion of the act (C), to generate control signals to control the plurality of LEDs; and
(E) during execution of the lighting program in the act (D), reading the stored information and generating at least one of the control signals based at least in part on the at least one characteristic of the audio input.
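Claim 91 separates analysis from execution in time: characteristics are determined and stored first, then read back while the lighting program runs. A minimal sketch, assuming a JSON file as the stored information (the file name and format are assumptions):

```python
import json

def store_characteristics(characteristics, path="audio_cues.json"):
    """Act (C): persist the characteristics determined from the audio input."""
    with open(path, "w") as f:
        json.dump(characteristics, f)

def read_characteristics(path="audio_cues.json"):
    """Acts (D)/(E): read the stored information during program execution."""
    with open(path) as f:
        return json.load(f)
```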
92. A method for executing a lighting program to control a plurality of light emitting diodes (LEDs) to create a light show, the method comprising acts of: (A) receiving an audio input having a duration and varying in time during the duration of the audio input;
(B) processing the audio input to determine at least one first characteristic of the audio input at a first time during the duration;
(C) executing the lighting program in synchronization with the audio input to generate control signals to control the plurality of LEDs; and
(D) during execution of the lighting program in the act (C) at a time that is prior to the first time during the duration of the audio input, generating at least one of the control signals based at least in part on the at least one first characteristic of the audio input so that the light show anticipates changes in the audio input.
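Claim 92 describes a light show that anticipates the music: because the audio has been processed ahead of playback, the lighting program can act on a characteristic computed for a later point in the audio. A sketch of that look-ahead, with an assumed per-frame list of levels, follows.

```python
def anticipated_level(levels, current_index, lookahead_frames=10):
    """Return the characteristic for a point later in the (pre-analyzed) audio,
    so control signals generated now reflect a change that has not yet played."""
    future_index = min(current_index + lookahead_frames, len(levels) - 1)
    return levels[future_index]
```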
PCT/US2001/019782 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to an audio input WO2001099475A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
AT01948546T ATE539593T1 (en) 2000-06-21 2001-06-21 METHOD AND DEVICE FOR CONTROLLING A LIGHTING SYSTEM DEPENDENT ON AN AUDIO INPUT
ES01948546T ES2380075T3 (en) 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to an audio input
EP01948546A EP1295515B1 (en) 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to an audio input
JP2002504188A JP4773673B2 (en) 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to audio input
AU2001270018A AU2001270018A1 (en) 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to an audio input
HK03106910.1A HK1054839A1 (en) 2000-06-21 2003-09-25 Method and apparatus for controlling a lighting system in response to an audio input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US21304200P 2000-06-21 2000-06-21
US60/213,042 2000-06-21

Publications (1)

Publication Number Publication Date
WO2001099475A1 true WO2001099475A1 (en) 2001-12-27

Family

ID=22793505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/019782 WO2001099475A1 (en) 2000-06-21 2001-06-21 Method and apparatus for controlling a lighting system in response to an audio input

Country Status (8)

Country Link
US (1) US7228190B2 (en)
EP (2) EP2364067B1 (en)
JP (1) JP4773673B2 (en)
AT (1) ATE539593T1 (en)
AU (1) AU2001270018A1 (en)
ES (2) ES2380075T3 (en)
HK (1) HK1054839A1 (en)
WO (1) WO2001099475A1 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003015477A1 (en) * 2001-07-23 2003-02-20 Martin Professional A/S Creating and sharing light shows
WO2003022009A1 (en) * 2001-08-30 2003-03-13 Radiant Research Limited Illumination system
WO2002101702A3 (en) * 2001-06-13 2003-05-01 Color Kinetics Inc Systems and methods of controlling light systems
WO2005022963A1 (en) 2003-09-02 2005-03-10 Richard Brown Lighting apparatus with proximity sensor
GB2412022A (en) * 2004-03-11 2005-09-14 Giga Byte Tech Co Ltd Controlling a fluorescent lamp from a motherboard
JP2006512820A (en) * 2002-12-24 2006-04-13 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method and system for enhancing an audio signal
WO2006135990A1 (en) * 2005-03-31 2006-12-28 Philippe Haumann Wellness system
WO2008050281A1 (en) 2006-10-24 2008-05-02 Koninklijke Philips Electronics N.V. A system, method and computer-readable medium for displaying light radiation
WO2008053409A1 (en) * 2006-10-31 2008-05-08 Koninklijke Philips Electronics N.V. Control of light in response to an audio signal
WO2008072152A1 (en) * 2006-12-11 2008-06-19 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
WO2008078233A1 (en) * 2006-12-20 2008-07-03 Koninklijke Philips Electronics N.V. A system, method and computer-readable medium for displaying light radiation
WO2008129505A1 (en) * 2007-04-24 2008-10-30 Koninklijke Philips Electronics N.V. Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
NL2000926C2 (en) * 2007-10-12 2009-04-15 Jan Jonquiere Light and sound column.
WO2009150592A1 (en) * 2008-06-12 2009-12-17 Koninklijke Philips Electronics N.V. System and method for generation of an atmosphere
WO2010048907A1 (en) * 2008-10-29 2010-05-06 Jaroslav Nusl Method for controlling in particular lighting technology by audio signal and a device for performing this method
NL1036887C2 (en) * 2009-04-21 2010-10-22 Firstfocus B V LIGHTING DEVICE, LIGHTING DEVICE DRIVING DEVICE AND LIGHTING DEVICE SYSTEM.
BE1019823A3 (en) * 2011-02-16 2013-01-08 Robrecht Karel V Noens METHOD AND DEVICE FOR CONVERTING AN AUDIO SIGNAL TO A CONTROL SIGNAL FOR AN AUDIO-VISUALIZATION SYSTEM AND AUDIO-VISUALIZATION SYSTEM.
US8427311B2 (en) 2008-01-17 2013-04-23 Koninklijke Philips Electronics N.V. Lighting device and method for producing sequential lighting stimuli
WO2013113958A1 (en) * 2012-02-02 2013-08-08 Iborra Badia Gerardo Association of lighting effects with discrete frequency signals extracted from audio signals
WO2014026659A3 (en) * 2012-08-15 2014-04-10 Univerzita Tomase Bati Ve Zline General-purpose single chip control device
WO2014111826A3 (en) * 2013-01-17 2014-11-13 Koninklijke Philips N.V. A controllable stimulus system and a method of controlling an audible stimulus and a visual stimulus
FR3011356A1 (en) * 2013-09-30 2015-04-03 Fivefive DEVICE AND METHOD FOR MULTIMEDIA RENDERING
EP2473004B1 (en) 2006-11-09 2015-04-29 Apple Inc. Brightness control of a status indicator light
US9084314B2 (en) 2006-11-28 2015-07-14 Hayward Industries, Inc. Programmable underwater lighting system
CN105874885A (en) * 2013-12-24 2016-08-17 Ag有限公司 Lighting device and frame with said lighting device attached thereto
EP3099143A1 (en) * 2015-05-29 2016-11-30 Helvar Oy Ab Method and arrangement for creating lighting effects
US20170213451A1 (en) 2016-01-22 2017-07-27 Hayward Industries, Inc. Systems and Methods for Providing Network Connectivity and Remote Monitoring, Optimization, and Control of Pool/Spa Equipment
CN109729622A (en) * 2017-10-27 2019-05-07 金志鹏 It is a kind of that formula lamp light control system and method are played based on audio
WO2019162193A1 (en) 2018-02-26 2019-08-29 Signify Holding B.V. Resuming a dynamic light effect in dependence on an effect type and/or user preference
EP3513626A4 (en) * 2016-09-14 2020-04-22 Lutron Ketra, LLC Illumination system for controlling color temperature as a function of brightness
US10718507B2 2010-04-28 2020-07-21 Hayward Industries, Inc. Underwater light having a sealed polymer housing and method of manufacture therefor
US10731831B2 (en) 2017-05-08 2020-08-04 Gemmy Industries Corp. Clip lights and related systems
US20200319621A1 (en) 2016-01-22 2020-10-08 Hayward Industries, Inc. Systems and Methods for Providing Network Connectivity and Remote Monitoring, Optimization, and Control of Pool/Spa Equipment
US10976713B2 (en) 2013-03-15 2021-04-13 Hayward Industries, Inc. Modular pool/spa control system
US11011368B2 (en) 2012-01-04 2021-05-18 Nordson Corporation Microwave excited ultraviolet lamp system with data logging and retrieval circuit and method
US11168876B2 (en) 2019-03-06 2021-11-09 Hayward Industries, Inc. Underwater light having programmable controller and replaceable light-emitting diode (LED) assembly
US11202354B2 (en) 2016-09-14 2021-12-14 Lutron Technology Company Llc Illumination system and method that presents a natural show to emulate daylight conditions with smoothing dimcurve modification thereof
US11202352B2 (en) 2016-09-14 2021-12-14 Lutron Technology Company Llc Illumination device for adjusting color temperature based on brightness and time of day
US11871495B2 (en) 2020-07-14 2024-01-09 Lutron Technology Company Llc Lighting control system with light show overrides
US12060989B2 (en) 2019-03-06 2024-08-13 Hayward Industries, Inc. Underwater light having a replaceable light-emitting diode (LED) module and cord assembly

Families Citing this family (262)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7139617B1 (en) * 1999-07-14 2006-11-21 Color Kinetics Incorporated Systems and methods for authoring lighting sequences
US20030133292A1 (en) * 1999-11-18 2003-07-17 Mueller George G. Methods and apparatus for generating and modulating white light illumination conditions
US6806659B1 (en) * 1997-08-26 2004-10-19 Color Kinetics, Incorporated Multicolored LED lighting method and apparatus
US7038398B1 (en) * 1997-08-26 2006-05-02 Color Kinetics, Incorporated Kinetic illumination system and methods
US7242152B2 (en) * 1997-08-26 2007-07-10 Color Kinetics Incorporated Systems and methods of controlling light systems
US6965205B2 (en) * 1997-08-26 2005-11-15 Color Kinetics Incorporated Light emitting diode based products
US20020043938A1 (en) * 2000-08-07 2002-04-18 Lys Ihor A. Automatic configuration systems and methods for lighting and other applications
US6548967B1 (en) * 1997-08-26 2003-04-15 Color Kinetics, Inc. Universal lighting network methods and systems
US7187141B2 (en) * 1997-08-26 2007-03-06 Color Kinetics Incorporated Methods and apparatus for illumination of liquids
US7385359B2 (en) * 1997-08-26 2008-06-10 Philips Solid-State Lighting Solutions, Inc. Information systems
US6975079B2 (en) * 1997-08-26 2005-12-13 Color Kinetics Incorporated Systems and methods for controlling illumination sources
US6777891B2 (en) * 1997-08-26 2004-08-17 Color Kinetics, Incorporated Methods and apparatus for controlling devices in a networked lighting system
US6720745B2 (en) * 1997-08-26 2004-04-13 Color Kinetics, Incorporated Data delivery track
US20040052076A1 (en) * 1997-08-26 2004-03-18 Mueller George G. Controlled lighting methods and apparatus
US7764026B2 (en) * 1997-12-17 2010-07-27 Philips Solid-State Lighting Solutions, Inc. Systems and methods for digital entertainment
US7014336B1 (en) * 1999-11-18 2006-03-21 Color Kinetics Incorporated Systems and methods for generating and modulating illumination conditions
US7233831B2 (en) 1999-07-14 2007-06-19 Color Kinetics Incorporated Systems and methods for controlling programmable lighting systems
JP2003510856A (en) * 1999-09-29 2003-03-18 カラー・キネティックス・インコーポレーテッド Combined illumination and calibration apparatus and calibration method for multiple LEDs
US7176372B2 (en) * 1999-10-19 2007-02-13 Medialab Solutions Llc Interactive digital music recorder and player
US9818386B2 (en) 1999-10-19 2017-11-14 Medialab Solutions Corp. Interactive digital music recorder and player
US20020176259A1 (en) * 1999-11-18 2002-11-28 Ducharme Alfred D. Systems and methods for converting illumination
US20070020573A1 (en) * 1999-12-21 2007-01-25 Furner Paul E Candle assembly with light emitting system
US7642730B2 (en) * 2000-04-24 2010-01-05 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for conveying information via color of light
US7550935B2 (en) * 2000-04-24 2009-06-23 Philips Solid-State Lighting Solutions, Inc Methods and apparatus for downloading lighting programs
US20050275626A1 (en) * 2000-06-21 2005-12-15 Color Kinetics Incorporated Entertainment lighting system
US7202613B2 (en) * 2001-05-30 2007-04-10 Color Kinetics Incorporated Controlled lighting methods and apparatus
US7161556B2 (en) * 2000-08-07 2007-01-09 Color Kinetics Incorporated Systems and methods for programming illumination devices
US7303300B2 (en) * 2000-09-27 2007-12-04 Color Kinetics Incorporated Methods and systems for illuminating household products
US7038399B2 (en) * 2001-03-13 2006-05-02 Color Kinetics Incorporated Methods and apparatus for providing power to lighting devices
US6883929B2 (en) 2001-04-04 2005-04-26 Color Kinetics, Inc. Indication systems and methods
US7598684B2 (en) * 2001-05-30 2009-10-06 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for controlling devices in a networked lighting system
US7364488B2 (en) 2002-04-26 2008-04-29 Philips Solid State Lighting Solutions, Inc. Methods and apparatus for enhancing inflatable devices
US7358679B2 (en) * 2002-05-09 2008-04-15 Philips Solid-State Lighting Solutions, Inc. Dimmable LED-based MR16 lighting apparatus and methods
ATE297634T1 (en) * 2002-07-04 2005-06-15 Koninkl Philips Electronics Nv DISPLAY DEVICE
US7023543B2 (en) * 2002-08-01 2006-04-04 Cunningham David W Method for controlling the luminous flux spectrum of a lighting fixture
WO2004021747A2 (en) * 2002-08-28 2004-03-11 Color Kinetics, Inc Methods and systems for illuminating environments
US7300192B2 (en) * 2002-10-03 2007-11-27 Color Kinetics Incorporated Methods and apparatus for illuminating environments
US8008561B2 (en) * 2003-01-17 2011-08-30 Motorola Mobility, Inc. Audio file format with mapped lighting effects and method for controlling lighting effects using an audio file format
US8841847B2 (en) 2003-01-17 2014-09-23 Motorola Mobility Llc Electronic device for controlling lighting effects using an audio file
US7178941B2 (en) * 2003-05-05 2007-02-20 Color Kinetics Incorporated Lighting methods and systems
US8086448B1 (en) * 2003-06-24 2011-12-27 Creative Technology Ltd Dynamic modification of a high-order perceptual attribute of an audio signal
US9035741B2 (en) * 2003-06-27 2015-05-19 Stryker Corporation Foot-operated control console for wirelessly controlling medical devices
US7883458B2 (en) * 2003-06-27 2011-02-08 Stryker Corporation System for remotely controlling two or more medical devices
US7641364B2 (en) * 2003-07-02 2010-01-05 S. C. Johnson & Son, Inc. Adapter for light bulbs equipped with volatile active dispenser and light emitting diodes
US7457670B2 (en) * 2003-08-07 2008-11-25 Production Resource Group, Llc Gobo virtual machine
US7966034B2 (en) * 2003-09-30 2011-06-21 Sony Ericsson Mobile Communications Ab Method and apparatus of synchronizing complementary multi-media effects in a wireless communication device
WO2005052751A2 (en) 2003-11-20 2005-06-09 Color Kinetics Incorporated Light system manager
KR20060108757A (en) * 2003-12-11 2006-10-18 컬러 키네틱스 인코포레이티드 Thermal management methods and apparatus for lighting devices
BRPI0418563B8 (en) * 2004-02-19 2022-11-22 Nokia Corp MOBILE COMMUNICATION TERMINAL, METHOD FOR CONTROLLING THE ACTIVATION OF THE LIGHTS OF THE MOBILE COMMUNICATION TERMINAL, COMPUTER TERMINAL, AND MUSIC SEQUENCE OR DATA FILE
WO2005084339A2 (en) * 2004-03-02 2005-09-15 Color Kinetics Incorporated Entertainment lighting system
EP1754121A4 (en) * 2004-03-15 2014-02-12 Philips Solid State Lighting Methods and systems for providing lighting systems
US7354172B2 (en) * 2004-03-15 2008-04-08 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for controlled lighting based on a reference gamut
US7515128B2 (en) * 2004-03-15 2009-04-07 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for providing luminance compensation
CA2559718C (en) * 2004-03-15 2012-05-22 Color Kinetics Incorporated Power control methods and apparatus
US20060221606A1 (en) * 2004-03-15 2006-10-05 Color Kinetics Incorporated Led-based lighting retrofit subassembly apparatus
US20050210725A1 (en) * 2004-03-26 2005-09-29 Richard Stolls Display apparatus
US20050259424A1 (en) 2004-05-18 2005-11-24 Zampini Thomas L Ii Collimating and controlling light produced by light emitting diodes
WO2006023149A2 (en) * 2004-07-08 2006-03-02 Color Kinetics Incorporated Led package methods and systems
US8115091B2 (en) * 2004-07-16 2012-02-14 Motorola Mobility, Inc. Method and device for controlling vibrational and light effects using instrument definitions in an audio file format
US20060028212A1 (en) * 2004-08-06 2006-02-09 Steiner J P System and method for graphically grouping electrical devices
US20060076908A1 (en) * 2004-09-10 2006-04-13 Color Kinetics Incorporated Lighting zone control methods and apparatus
WO2006031810A2 (en) * 2004-09-10 2006-03-23 Color Kinetics Incorporated Power control methods and apparatus for variable loads
US7710369B2 (en) * 2004-12-20 2010-05-04 Philips Solid-State Lighting Solutions, Inc. Color management methods and apparatus for lighting devices
CN1703131B (en) * 2004-12-24 2010-04-14 北京中星微电子有限公司 Method for controlling brightness and colors of light cluster by music
US7348736B2 (en) * 2005-01-24 2008-03-25 Philips Solid-State Lighting Solutions Methods and apparatus for providing workspace lighting and facilitating workspace customization
WO2006093889A2 (en) * 2005-02-28 2006-09-08 Color Kinetics Incorporated Configurations and methods for embedding electronics or light emitters in manufactured materials
US7703951B2 (en) * 2005-05-23 2010-04-27 Philips Solid-State Lighting Solutions, Inc. Modular LED-based lighting fixtures having socket engagement features
US7766518B2 (en) * 2005-05-23 2010-08-03 Philips Solid-State Lighting Solutions, Inc. LED-based light-generating modules for socket engagement, and methods of assembling, installing and removing same
US8061865B2 (en) 2005-05-23 2011-11-22 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for providing lighting via a grid system of a suspended ceiling
EP1894075A4 (en) 2005-06-06 2008-06-25 Color Kinetics Inc Methods and apparatus for implementing power cycle control of lighting devices based on network protocols
US7501571B2 (en) * 2005-06-14 2009-03-10 Jon Forsman Lighting display responsive to vibration
TWM281223U (en) * 2005-06-30 2005-11-21 Mobinote Technology Corp Illuminating audio player
TWI433588B (en) * 2005-12-13 2014-04-01 Koninkl Philips Electronics Nv Led lighting device
US7619370B2 (en) * 2006-01-03 2009-11-17 Philips Solid-State Lighting Solutions, Inc. Power allocation methods for lighting devices having multiple source spectrums, and apparatus employing same
WO2007085986A1 (en) * 2006-01-25 2007-08-02 Koninklijke Philips Electronics N.V. Control device for selecting the color of light emitted by a light source
KR101300007B1 (en) * 2006-02-10 2013-08-27 필립스 솔리드-스테이트 라이팅 솔루션스, 인크. Methods and apparatus for high power factor controlled power delivery using a single switching stage per load
FI20060910A0 (en) * 2006-03-28 2006-10-13 Genelec Oy Identification method and device in an audio reproduction system
CN101416563B (en) 2006-03-31 2012-09-26 Tp视觉控股有限公司 Event based ambient lighting control
BRPI0709214A2 (en) * 2006-03-31 2011-07-12 Konink Philips Eletronics N V method and device for controlling an ambient lighting element; and, application embedded in a computer readable medium configured to control an ambient lighting element.
US7766511B2 (en) 2006-04-24 2010-08-03 Integrated Illumination Systems LED light fixture
US7543951B2 (en) * 2006-05-03 2009-06-09 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for providing a luminous writing surface
US7658506B2 (en) * 2006-05-12 2010-02-09 Philips Solid-State Lighting Solutions, Inc. Recessed cove lighting apparatus for architectural surfaces
US20100318201A1 (en) * 2006-10-18 2010-12-16 Ambx Uk Limited Method and system for detecting effect of lighting device
WO2008051464A1 (en) * 2006-10-19 2008-05-02 Philips Solid-State Lighting Solutions Networkable led-based lighting fixtures and methods for powering and controlling same
KR101460004B1 (en) * 2006-11-10 2014-11-10 필립스 솔리드-스테이트 라이팅 솔루션스, 인크. Methods and apparatus for controlling series-connected leds
US7729941B2 (en) 2006-11-17 2010-06-01 Integrated Illumination Systems, Inc. Apparatus and method of using lighting systems to enhance brand recognition
US20080136796A1 (en) * 2006-11-20 2008-06-12 Philips Solid-State Lighting Solutions Methods and apparatus for displaying images on a moving display unit
US7504930B2 (en) * 2006-12-15 2009-03-17 Joseph William Beyda Alarm clock synchronized with an electric coffeemaker
ES2436283T3 (en) * 2007-01-05 2013-12-30 Philips Solid-State Lighting Solutions, Inc. Methods and apparatus for simulating resistive loads
US8013538B2 (en) 2007-01-26 2011-09-06 Integrated Illumination Systems, Inc. TRI-light
US7587289B1 (en) * 2007-02-13 2009-09-08 American Megatrends, Inc. Data cable powered sensor fixture
US8011794B1 (en) 2007-02-13 2011-09-06 American Megatrends, Inc. Data cable powered light fixture
US8035320B2 (en) 2007-04-20 2011-10-11 Sibert W Olin Illumination control network
CN101690397B (en) * 2007-07-02 2012-07-18 皇家飞利浦电子股份有限公司 Driver device for a load and method of driving a load with such a driver device
US8742686B2 (en) 2007-09-24 2014-06-03 Integrated Illumination Systems, Inc. Systems and methods for providing an OEM level networked lighting system
US10321528B2 (en) 2007-10-26 2019-06-11 Philips Lighting Holding B.V. Targeted content delivery using outdoor lighting networks (OLNs)
US20090128921A1 (en) * 2007-11-15 2009-05-21 Philips Solid-State Lighting Solutions Led collimator having spline surfaces and related methods
TW200939878A (en) 2007-11-28 2009-09-16 Koninkl Philips Electronics Nv Method and device for the programming of dynamic light scenarios
US8118447B2 (en) 2007-12-20 2012-02-21 Altair Engineering, Inc. LED lighting apparatus with swivel connection
US7712918B2 (en) 2007-12-21 2010-05-11 Altair Engineering , Inc. Light distribution using a light emitting diode assembly
US20090253509A1 (en) * 2008-04-02 2009-10-08 Howard Tripp Illuminated game controller
US8543249B2 (en) * 2008-04-14 2013-09-24 Digital Lumens Incorporated Power management unit with modular sensor bus
US8805550B2 (en) 2008-04-14 2014-08-12 Digital Lumens Incorporated Power management unit with power source arbitration
US8339069B2 (en) 2008-04-14 2012-12-25 Digital Lumens Incorporated Power management unit with power metering
US8823277B2 (en) 2008-04-14 2014-09-02 Digital Lumens Incorporated Methods, systems, and apparatus for mapping a network of lighting fixtures with light module identification
US8610377B2 (en) * 2008-04-14 2013-12-17 Digital Lumens, Incorporated Methods, apparatus, and systems for prediction of lighting module performance
US8368321B2 (en) * 2008-04-14 2013-02-05 Digital Lumens Incorporated Power management unit with rules-based power consumption management
US8373362B2 (en) * 2008-04-14 2013-02-12 Digital Lumens Incorporated Methods, systems, and apparatus for commissioning an LED lighting fixture with remote reporting
US10539311B2 (en) 2008-04-14 2020-01-21 Digital Lumens Incorporated Sensor-based lighting methods, apparatus, and systems
EP3576501A3 (en) * 2008-04-14 2020-01-08 Digital Lumens Incorporated Modular lighting systems
US8754589B2 (en) * 2008-04-14 2014-06-17 Digtial Lumens Incorporated Power management unit with temperature protection
US8552664B2 (en) * 2008-04-14 2013-10-08 Digital Lumens Incorporated Power management unit with ballast interface
US8531134B2 (en) * 2008-04-14 2013-09-10 Digital Lumens Incorporated LED-based lighting methods, apparatus, and systems employing LED light bars, occupancy sensing, local state machine, and time-based tracking of operational modes
US8866408B2 (en) 2008-04-14 2014-10-21 Digital Lumens Incorporated Methods, apparatus, and systems for automatic power adjustment based on energy demand information
US8841859B2 (en) * 2008-04-14 2014-09-23 Digital Lumens Incorporated LED lighting methods, apparatus, and systems including rules-based sensor data logging
US8610376B2 (en) * 2008-04-14 2013-12-17 Digital Lumens Incorporated LED lighting methods, apparatus, and systems including historic sensor data logging
WO2009134885A1 (en) * 2008-04-29 2009-11-05 Ivus Industries, Inc. Wide voltage, high efficiency led driver circuit
EP2120512A1 (en) * 2008-05-13 2009-11-18 Koninklijke Philips Electronics N.V. Stochastic dynamic atmosphere
US8255487B2 (en) 2008-05-16 2012-08-28 Integrated Illumination Systems, Inc. Systems and methods for communicating in a lighting network
US8360599B2 (en) 2008-05-23 2013-01-29 Ilumisys, Inc. Electric shock resistant L.E.D. based light
US7976196B2 (en) 2008-07-09 2011-07-12 Altair Engineering, Inc. Method of forming LED-based light and resulting LED-based light
US7946729B2 (en) 2008-07-31 2011-05-24 Altair Engineering, Inc. Fluorescent tube replacement having longitudinally oriented LEDs
DE102008038340B4 (en) * 2008-08-19 2010-04-22 Austriamicrosystems Ag Circuit arrangement for controlling a light source and method for generating a drive signal for the same
US8674626B2 (en) 2008-09-02 2014-03-18 Ilumisys, Inc. LED lamp failure alerting system
US8256924B2 (en) 2008-09-15 2012-09-04 Ilumisys, Inc. LED-based light having rapidly oscillating LEDs
US7938562B2 (en) 2008-10-24 2011-05-10 Altair Engineering, Inc. Lighting including integral communication apparatus
US8324817B2 (en) 2008-10-24 2012-12-04 Ilumisys, Inc. Light and light sensor
US8653984B2 (en) 2008-10-24 2014-02-18 Ilumisys, Inc. Integration of LED lighting control with emergency notification systems
US8214084B2 (en) 2008-10-24 2012-07-03 Ilumisys, Inc. Integration of LED lighting with building controls
US8444292B2 (en) 2008-10-24 2013-05-21 Ilumisys, Inc. End cap substitute for LED-based tube replacement light
US8901823B2 (en) 2008-10-24 2014-12-02 Ilumisys, Inc. Light and light sensor
US8476844B2 (en) * 2008-11-21 2013-07-02 B/E Aerospace, Inc. Light emitting diode (LED) lighting system providing precise color control
LT5671B (en) * 2008-12-23 2010-08-25 Vytautas JANUŠONIS Stroboscope light for subwoofers and low frequency speakers
US8556452B2 (en) 2009-01-15 2013-10-15 Ilumisys, Inc. LED lens
US8362710B2 (en) 2009-01-21 2013-01-29 Ilumisys, Inc. Direct AC-to-DC converter for passive component minimization and universal operation of LED arrays
US8664880B2 (en) 2009-01-21 2014-03-04 Ilumisys, Inc. Ballast/line detection circuit for fluorescent replacement lamps
US8954170B2 (en) * 2009-04-14 2015-02-10 Digital Lumens Incorporated Power management unit with multi-input arbitration
US8593135B2 (en) 2009-04-14 2013-11-26 Digital Lumens Incorporated Low-cost power measurement circuit
US8536802B2 (en) * 2009-04-14 2013-09-17 Digital Lumens Incorporated LED-based lighting methods, apparatus, and systems employing LED light bars, occupancy sensing, and local state machine
US8585245B2 (en) 2009-04-23 2013-11-19 Integrated Illumination Systems, Inc. Systems and methods for sealing a lighting fixture
US8330381B2 (en) 2009-05-14 2012-12-11 Ilumisys, Inc. Electronic circuit for DC conversion of fluorescent lighting ballast
US8299695B2 (en) 2009-06-02 2012-10-30 Ilumisys, Inc. Screw-in LED bulb comprising a base having outwardly projecting nodes
US8740701B2 (en) 2009-06-15 2014-06-03 Wms Gaming, Inc. Controlling wagering game system audio
EP2446715A4 (en) 2009-06-23 2013-09-11 Ilumisys Inc Illumination device including leds and a switching power control system
WO2011005798A1 (en) 2009-07-07 2011-01-13 Wms Gaming, Inc. Controlling wagering game lighting content
WO2011005797A1 (en) 2009-07-07 2011-01-13 Wms Gaming, Inc. Controlling gaming effects for gaming network nodes
WO2011007293A2 (en) * 2009-07-15 2011-01-20 Koninklijke Philips Electronics N.V. Method for controlling a second modality based on a first modality
US10269207B2 (en) 2009-07-31 2019-04-23 Bally Gaming, Inc. Controlling casino lighting content and audio content
US9011247B2 (en) 2009-07-31 2015-04-21 Wms Gaming, Inc. Controlling casino lighting content and audio content
US8622830B2 (en) * 2009-08-20 2014-01-07 Wms Gaming, Inc. Controlling sound distribution in wagering game applications
CA2777998A1 (en) * 2009-10-19 2011-04-28 Emteq, Inc. Led lighting system
US9549451B2 (en) * 2009-10-26 2017-01-17 Eldolab Holding B.V. Method for operating a lighting grid and lighting unit for use in a lighting grid
US8269646B2 (en) * 2009-10-30 2012-09-18 Robert Francis Exman Audio driven synchronized light display
US8613667B2 (en) 2009-12-21 2013-12-24 Wms Gaming, Inc. Position-based lighting coordination in wagering game systems
CN105354940A (en) * 2010-01-26 2016-02-24 踏途音乐公司 Digital jukebox device with improved user interfaces, and associated methods
US20110181201A1 (en) * 2010-01-27 2011-07-28 Dale Hollis LED Display System and Method for Use with an Audio System
WO2011119921A2 (en) 2010-03-26 2011-09-29 Altair Engineering, Inc. Led light with thermoelectric generator
WO2011119958A1 (en) 2010-03-26 2011-09-29 Altair Engineering, Inc. Inside-out led bulb
US9057493B2 (en) 2010-03-26 2015-06-16 Ilumisys, Inc. LED light tube with dual sided light distribution
US8917905B1 (en) * 2010-04-15 2014-12-23 Don K. Dill Vision-2-vision control system
US8840464B1 (en) 2010-04-26 2014-09-23 Wms Gaming, Inc. Coordinating media in a wagering game environment
US9367987B1 (en) 2010-04-26 2016-06-14 Bally Gaming, Inc. Selecting color in wagering game systems
US8814673B1 (en) 2010-04-26 2014-08-26 Wms Gaming, Inc. Presenting lighting content in wagering game systems
US8912727B1 (en) 2010-05-17 2014-12-16 Wms Gaming, Inc. Wagering game lighting device chains
WO2011145381A1 (en) * 2010-05-21 2011-11-24 シャープ株式会社 Controller, method of controlling illumination, and network system
US8454193B2 (en) 2010-07-08 2013-06-04 Ilumisys, Inc. Independent modules for LED fluorescent light tube replacement
KR101107775B1 (en) * 2010-07-09 2012-01-20 김범수 Installation for emotional lighting having a function of chaos rhythm
WO2012009260A2 (en) 2010-07-12 2012-01-19 Altair Engineering, Inc. Circuit board mount for led light tube
JP2012027227A (en) * 2010-07-23 2012-02-09 Sony Corp Trigger generation device, display control device, trigger generation method, display control method, trigger generation program, and display control program
US8827805B1 (en) 2010-08-06 2014-09-09 Wms Gaming, Inc. Balancing community gaming effects
US8697977B1 (en) * 2010-10-12 2014-04-15 Travis Lysaght Dynamic lighting for musical instrument
WO2012058556A2 (en) 2010-10-29 2012-05-03 Altair Engineering, Inc. Mechanisms for reducing risk of shock during installation of light tube
CA2816978C (en) 2010-11-04 2020-07-28 Digital Lumens Incorporated Method, apparatus, and system for occupancy sensing
JP5477357B2 (en) * 2010-11-09 2014-04-23 株式会社デンソー Sound field visualization system
US8870415B2 (en) 2010-12-09 2014-10-28 Ilumisys, Inc. LED fluorescent tube replacement light with reduced shock hazard
US10630820B2 (en) 2011-03-11 2020-04-21 Ilumi Solutions, Inc. Wireless communication methods
US10321541B2 (en) 2011-03-11 2019-06-11 Ilumi Solutions, Inc. LED lighting device
US8890435B2 (en) 2011-03-11 2014-11-18 Ilumi Solutions, Inc. Wireless lighting control system
US8569606B2 (en) * 2011-03-15 2013-10-29 Panasonic Corporation Music and light synchronization system
US9066381B2 (en) 2011-03-16 2015-06-23 Integrated Illumination Systems, Inc. System and method for low level dimming
AU2012230991A1 (en) 2011-03-21 2013-10-10 Digital Lumens Incorporated Methods, apparatus and systems for providing occupancy-based variable lighting
US8710770B2 (en) 2011-07-26 2014-04-29 Hunter Industries, Inc. Systems and methods for providing power and data to lighting devices
US10874003B2 (en) 2011-07-26 2020-12-22 Hunter Industries, Inc. Systems and methods for providing power and data to devices
US11917740B2 (en) 2011-07-26 2024-02-27 Hunter Industries, Inc. Systems and methods for providing power and data to devices
US20150237700A1 (en) 2011-07-26 2015-08-20 Hunter Industries, Inc. Systems and methods to control color and brightness of lighting devices
US9521725B2 (en) 2011-07-26 2016-12-13 Hunter Industries, Inc. Systems and methods for providing power and data to lighting devices
US9609720B2 (en) 2011-07-26 2017-03-28 Hunter Industries, Inc. Systems and methods for providing power and data to lighting devices
US9072171B2 (en) 2011-08-24 2015-06-30 Ilumisys, Inc. Circuit board mount for LED light
KR20130041690A (en) 2011-10-17 2013-04-25 엘지이노텍 주식회사 Lighting apparatus, lighting system comprising the same, and driving method therefor
CA3045805A1 (en) 2011-11-03 2013-05-10 Digital Lumens Incorporated Methods, systems, and apparatus for intelligent lighting
US9143595B1 (en) * 2011-11-29 2015-09-22 Ryan Michael Dowd Multi-listener headphone system with luminescent light emissions dependent upon selected channels
US20130147395A1 (en) * 2011-12-07 2013-06-13 Comcast Cable Communications, Llc Dynamic Ambient Lighting
US9204519B2 (en) 2012-02-25 2015-12-01 Pqj Corp Control system with user interface for lighting fixtures
RU2635089C2 (en) * 2012-03-01 2017-11-09 Филипс Лайтинг Холдинг Б.В. Method and device for interpolation of transmissions with low frequency of frames in lighting systems
WO2013131002A1 (en) 2012-03-02 2013-09-06 Ilumisys, Inc. Electrical connector header for an led-based light
EP2637327A1 (en) * 2012-03-09 2013-09-11 Harman International Industries Ltd. Audio mixing console with lighting control and method of mixing by means of a mixing console
CA2867898C (en) 2012-03-19 2023-02-14 Digital Lumens Incorporated Methods, systems, and apparatus for providing variable illumination
WO2014008463A1 (en) 2012-07-06 2014-01-09 Ilumisys, Inc. Power supply assembly for led-based light tube
US9271367B2 (en) 2012-07-09 2016-02-23 Ilumisys, Inc. System and method for controlling operation of an LED-based light
US8894437B2 (en) 2012-07-19 2014-11-25 Integrated Illumination Systems, Inc. Systems and methods for connector enabling vertical removal
JP5966784B2 (en) * 2012-09-07 2016-08-10 ソニー株式会社 Lighting device and program
US9379578B2 (en) 2012-11-19 2016-06-28 Integrated Illumination Systems, Inc. Systems and methods for multi-state power management
US9420665B2 (en) 2012-12-28 2016-08-16 Integrated Illumination Systems, Inc. Systems and methods for continuous adjustment of reference signal to control chip
US9485814B2 (en) 2013-01-04 2016-11-01 Integrated Illumination Systems, Inc. Systems and methods for a hysteresis based driver using a LED as a voltage reference
US9652118B2 (en) * 2013-01-16 2017-05-16 Marcus Thomas Llc System and method for generating a color palette based on audio content
US9380443B2 (en) 2013-03-12 2016-06-28 Comcast Cable Communications, Llc Immersive positioning and paring
US20140285326A1 (en) * 2013-03-15 2014-09-25 Aliphcom Combination speaker and light source responsive to state(s) of an organism based on sensor data
US9285084B2 (en) 2013-03-14 2016-03-15 Ilumisys, Inc. Diffusers for LED-based lights
US20140265865A1 (en) * 2013-03-15 2014-09-18 Abl Ip Holding Llc Systems and methods for providing a preview bar of a light show
US20140266766A1 (en) * 2013-03-15 2014-09-18 Kevin Dobbe System and method for controlling multiple visual media elements using music input
EP2992395B1 (en) 2013-04-30 2018-03-07 Digital Lumens Incorporated Operating light emitting diodes at low temperature
US9267650B2 (en) 2013-10-09 2016-02-23 Ilumisys, Inc. Lens for an LED-based light
CA2926260C (en) 2013-10-10 2023-01-24 Digital Lumens Incorporated Methods, systems, and apparatus for intelligent lighting
US9360206B2 (en) * 2013-10-24 2016-06-07 Grover Musical Products, Inc. Illumination system for percussion instruments
DE102013112127A1 (en) 2013-11-05 2015-05-07 Eaton Electrical Ip Gmbh & Co. Kg Multicolor signal arrangement, method for defining modes of a multi-color signal arrangement and system, comprising a multicolor signal arrangement and an RFID transmitter
EP3097748A1 (en) 2014-01-22 2016-11-30 iLumisys, Inc. Led-based light with addressed leds
JP6484079B2 (en) * 2014-03-24 2019-03-13 HiDeep Inc. Kansei transmission method and terminal for the same
US9934180B2 (en) 2014-03-26 2018-04-03 Pqj Corp System and method for communicating with and for controlling of programmable apparatuses
US9510400B2 (en) 2014-05-13 2016-11-29 Ilumisys, Inc. User input systems for an LED-based light
KR101527328B1 (en) * 2014-07-23 2015-06-09 엘지전자 주식회사 Lighting apparatus and method for controlling the same
TWI579733B (en) * 2014-09-19 2017-04-21 天使學園網路股份有限公司 Healthy environment apparatus and settings thereof
KR101664561B1 (en) * 2014-10-07 2016-10-17 엘지이노텍 주식회사 Lighting apparatus, lighting system and controlling method of lighting apparatus
US9485838B2 (en) * 2014-12-12 2016-11-01 Osram Sylvania Inc. Lighting system for contained environments
WO2016123007A1 (en) * 2015-01-26 2016-08-04 Eventide Inc. Lighting systems and methods
US10228711B2 (en) 2015-05-26 2019-03-12 Hunter Industries, Inc. Decoder systems and methods for irrigation control
US10918030B2 (en) 2015-05-26 2021-02-16 Hunter Industries, Inc. Decoder systems and methods for irrigation control
US10060599B2 (en) 2015-05-29 2018-08-28 Integrated Illumination Systems, Inc. Systems, methods and apparatus for programmable light fixtures
US10030844B2 (en) 2015-05-29 2018-07-24 Integrated Illumination Systems, Inc. Systems, methods and apparatus for illumination using asymmetrical optics
US10161568B2 (en) 2015-06-01 2018-12-25 Ilumisys, Inc. LED-based light with canted outer walls
US11978336B2 (en) 2015-07-07 2024-05-07 Ilumi Solutions, Inc. Wireless control device and methods thereof
US10339796B2 (en) 2015-07-07 2019-07-02 Ilumi Solutions, Inc. Wireless control device and methods thereof
EP3320702B1 (en) 2015-07-07 2022-10-19 Ilumi Solutions, Inc. Wireless communication methods
KR102358025B1 (en) * 2015-10-07 2022-02-04 삼성전자주식회사 Electronic device and music visualization method thereof
SG10201604137QA (en) * 2016-05-24 2017-12-28 Creative Tech Ltd An apparatus for controlling lighting behavior of a plurality of lighting elements and a method therefor
EP3407683B1 (en) * 2016-01-21 2020-09-16 AlphaTheta Corporation Lighting control device, lighting control method and lighting control program
US9854654B2 (en) 2016-02-03 2017-12-26 Pqj Corp System and method of control of a programmable lighting fixture with embedded memory
US10319395B2 (en) 2016-03-11 2019-06-11 Limbic Media Corporation System and method for predictive generation of visual sequences
WO2017181291A1 (en) 2016-04-22 2017-10-26 Nanoleaf (Hk) Limited Systems and methods for connecting and controlling configurable lighting units
US10923151B2 (en) * 2016-05-12 2021-02-16 Alphatheta Corporation Illumination control device, illumination control method and illumination control program
CN109154800A (en) * 2016-05-19 2019-01-04 奥佐集团股份公司 Method and control device for controlling an appliance based on a media file, computer program product and building automation system (BAS)
US10010806B2 (en) * 2016-05-24 2018-07-03 Creative Technology Ltd Apparatus for controlling lighting behavior of a plurality of lighting elements and a method therefor
WO2018027297A1 (en) * 2016-08-12 2018-02-15 9255-7248 Québec Inc. Method and system for synchronizing lighting to music
ES2874191T3 (en) * 2016-10-03 2021-11-04 Signify Holding Bv Procedure and apparatus for controlling luminaires of a lighting system based on a current mode of an entertainment device
JP6992798B2 (en) * 2017-02-24 2022-01-13 ソニーグループ株式会社 Master playback device, slave playback device, and their light emission method
US10625170B2 (en) * 2017-03-09 2020-04-21 Lumena Inc. Immersive device
US11058961B2 (en) * 2017-03-09 2021-07-13 Kaleb Matson Immersive device
CN209184850U (en) * 2017-05-26 2019-07-30 酷码科技股份有限公司 Lamp control system
WO2019016135A1 (en) * 2017-07-19 2019-01-24 Philips Lighting Holding B.V. Speech control
GB2560395B (en) * 2017-08-23 2019-03-27 Allen & Heath Ltd A programmable audio level indicator
US10194505B1 (en) * 2017-11-29 2019-01-29 Harman International Industries, Incorporated Audio bus lighting control
CN112335340B (en) 2018-06-15 2023-10-31 昕诺飞控股有限公司 Method and controller for selecting media content based on lighting scene
US10877652B2 (en) * 2018-06-21 2020-12-29 Bose Corporation Synchronizing timed events with media
US10764984B2 (en) * 2018-07-30 2020-09-01 David KASLE Method and apparatus for musical instrument with dynamic animation and lighting
US10728643B2 (en) * 2018-09-28 2020-07-28 David M. Solak Sound conversion device
AU2019376177A1 (en) * 2018-11-09 2021-05-27 Akili Interactive Labs, Inc. Audio-only interference training for cognitive disorder screening and treatment
US10801714B1 (en) 2019-10-03 2020-10-13 CarJamz, Inc. Lighting device
US20210318847A1 (en) * 2020-04-13 2021-10-14 Steven Sullivan Apparatus for generating audio and/or performance synchronized optical output, and musical instrument and systems therefor
CN113692091B (en) * 2021-08-11 2024-04-02 深圳市智岩科技有限公司 Equipment control method, device, terminal equipment and storage medium
CN114222411B (en) * 2021-12-17 2023-11-03 广西世纪创新显示电子有限公司 PC digital audio/video rhythm control method and device and storage medium
CN114245511B (en) * 2021-12-27 2024-02-23 自贡海天文化股份有限公司 Sound-controlled lighting interaction system and control method thereof
CN114422846A (en) * 2021-12-31 2022-04-29 深圳市智岩科技有限公司 Lamp effect control method, system, device, electronic equipment and storage medium
CN117580225B (en) * 2023-12-26 2024-06-28 深圳市动力科技电子有限公司 Control method, device and system of LED electronic candle and storage medium

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3111057A (en) * 1959-04-14 1963-11-19 Stanley S Cramer Means for providing variable lighting effects
US3163077A (en) * 1961-10-23 1964-12-29 Shafford Electronics & Dev Cor Color display apparatus
US3240099A (en) * 1963-04-12 1966-03-15 Dale M Irons Sound responsive light system
US3205755A (en) * 1963-11-27 1965-09-14 Audiomotor Corp Production of colored lights from audio impulses
US3241419A (en) * 1964-01-06 1966-03-22 Wed Entpr Inc Audio frequency-responsive lighting display
US3215022A (en) * 1964-05-15 1965-11-02 Elden G Chapman Apparatus for projected light effects
US3307443A (en) * 1964-12-03 1967-03-07 Orvil F Shallenberger Apparatus for displaying colored light
US3550497A (en) * 1968-09-26 1970-12-29 Gregory S Marsh Color display for sound reproducing systems
US3540343A (en) * 1969-06-11 1970-11-17 Curtis Electro Lighting Inc Sound-controlled lighting system
US3845468A (en) * 1972-10-10 1974-10-29 R Smith Display system for musical tones
US4176581A (en) * 1977-11-28 1979-12-04 Stuyvenberg Bernard R Audio amplitude-responsive lighting display
WO1981000637A1 (en) 1979-08-27 1981-03-05 N Louez Method of representing sound by colour
JPS56501622A (en) 1979-11-27 1981-11-05
AU532795B2 (en) 1979-11-27 1983-10-13 Ingord Limited Audio-visual display system
US4376404A (en) * 1980-10-23 1983-03-15 Agricultural Aviation Engineering Co. Apparatus for translating sound into a visual display
JPS6168894A (en) * 1984-09-11 1986-04-09 日立プラント建設株式会社 Method of controlling under ice light emitting device
JPH0670748B2 (en) 1985-03-20 1994-09-07 ペイスト.ロジヤ−.エム Video display
US5209560A (en) * 1986-07-17 1993-05-11 Vari-Lite, Inc. Computer controlled lighting system with intelligent data distribution network
US5329431A (en) * 1986-07-17 1994-07-12 Vari-Lite, Inc. Computer controlled lighting system with modular control resources
US5010459A (en) * 1986-07-17 1991-04-23 Vari-Lite, Inc. Console/lamp unit coordination and communication in lighting systems
US5078039A (en) * 1988-09-06 1992-01-07 Lightwave Research Microprocessor controlled lamp flashing system with cooldown protection
US5752225A (en) * 1989-01-27 1998-05-12 Dolby Laboratories Licensing Corporation Method and apparatus for split-band encoding and split-band decoding of audio information using adaptive bit allocation to adjacent subbands
US5191319A (en) * 1990-10-15 1993-03-02 Kiltz Richard M Method and apparatus for visual portrayal of music
US5892833A (en) * 1993-04-28 1999-04-06 Night Technologies International Gain and equalization system and method
US6097352A (en) * 1994-03-23 2000-08-01 Kopin Corporation Color sequential display panels
DK0889746T3 (en) * 1995-06-08 2003-07-21 Claus Hvass Method and apparatus for converting audio signals to light
US5737254A (en) * 1995-10-27 1998-04-07 Motorola Inc. Symmetrical filtering apparatus and method therefor
JPH09139289A (en) 1995-11-14 1997-05-27 Sony Corp Quantity-of-light adjusting device and quantity-of-light adjusting method
GB2319346B (en) * 1996-11-13 2001-03-21 Sony Uk Ltd Analysis of audio signals
US6270229B1 (en) * 1996-12-24 2001-08-07 Tseng-Lu Chien Audio device including an illumination arrangement
US6211626B1 (en) * 1997-08-26 2001-04-03 Color Kinetics, Incorporated Illumination components
US6292901B1 (en) * 1997-08-26 2001-09-18 Color Kinetics Incorporated Power/data protocol
JPH11249652A (en) * 1998-01-05 1999-09-17 Yamaha Corp Keyboard instrument and play supporting device therefor
DE60018626T2 (en) * 1999-01-29 2006-04-13 Yamaha Corp., Hamamatsu Device and method for entering control files for music lectures
US6618031B1 (en) * 1999-02-26 2003-09-09 Three-Five Systems, Inc. Method and apparatus for independent control of brightness and color balance in display and illumination systems
US6225546B1 (en) * 2000-04-05 2001-05-01 International Business Machines Corporation Method and apparatus for music summarization and creation of audio summaries

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2045098A (en) * 1979-01-19 1980-10-29 Group Nh Ltd Soft toys
GB2135536A (en) * 1982-12-24 1984-08-30 Wobbot International Limited Sound responsive lighting system and devices incorporating same
US5769527A (en) 1986-07-17 1998-06-23 Vari-Lite, Inc. Computer controlled lighting system with distributed control resources
US4843627A (en) * 1986-08-05 1989-06-27 Stebbins Russell T Circuit and method for providing a light energy response to an event in real time
US4753148A (en) * 1986-12-01 1988-06-28 Johnson Tom A Sound emphasizer
US4962687A (en) * 1988-09-06 1990-10-16 Belliveau Richard S Variable color lighting system
EP0495305A2 (en) 1991-01-14 1992-07-22 Vari-Lite, Inc. Creating and controlling lighting designs
US5402702A (en) * 1992-07-14 1995-04-04 Jalco Co., Ltd. Trigger circuit unit for operating light emitting members such as leds or motors for use in personal ornament or toy in synchronization with music
US5734590A (en) * 1992-10-16 1998-03-31 Tebbe; Gerold Recording medium and device for generating sounds and/or pictures
US5461188A (en) 1994-03-07 1995-10-24 Drago; Marcello S. Synthesized music, sound and light system
EP0823813A2 (en) * 1994-03-23 1998-02-11 Kopin Corporation Portable communication device
US6008783A (en) * 1996-05-28 1999-12-28 Kawai Musical Instruments Manufacturing Co. Ltd. Keyboard instrument with the display device employing fingering guide
US6016038A (en) 1997-08-26 2000-01-18 Color Kinetics, Inc. Multicolored LED lighting method and apparatus
EP0935234A1 (en) 1998-02-05 1999-08-11 Casio Computer Co., Ltd. Musical performance training data transmission
EP0942631A2 (en) 1998-03-11 1999-09-15 BRUNSWICK BOWLING & BILLIARDS CORPORATION Bowling center lighting system
GB2354602A (en) 1999-09-07 2001-03-28 Peter Stefan Jones Digital controlling system for electronic lighting devices

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002101702A3 (en) * 2001-06-13 2003-05-01 Color Kinetics Inc Systems and methods of controlling light systems
WO2003015477A1 (en) * 2001-07-23 2003-02-20 Martin Professional A/S Creating and sharing light shows
WO2003022009A1 (en) * 2001-08-30 2003-03-13 Radiant Research Limited Illumination system
US6963175B2 (en) 2001-08-30 2005-11-08 Radiant Research Limited Illumination control system
JP2006512820A (en) * 2002-12-24 2006-04-13 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method and system for enhancing an audio signal
GB2420457A (en) * 2003-09-02 2006-05-24 Richard Brown Lighting apparatus with proximity sensor
WO2005022963A1 (en) 2003-09-02 2005-03-10 Richard Brown Lighting apparatus with proximity sensor
GB2420457B (en) * 2003-09-02 2006-10-25 Richard Brown Lighting apparatus with proximity sensor
FR2867654A1 (en) * 2004-03-11 2005-09-16 Giga Byte Tech Co Ltd APPARATUS AND METHOD FOR CONTROLLING CCFL LAMP
GB2412022B (en) * 2004-03-11 2006-05-31 Giga Byte Tech Co Ltd Apparatus and method for controlling fluorescent lamp
GB2412022A (en) * 2004-03-11 2005-09-14 Giga Byte Tech Co Ltd Controlling a fluorescent lamp from a motherboard
WO2006135990A1 (en) * 2005-03-31 2006-12-28 Philippe Haumann Wellness system
WO2008050281A1 (en) 2006-10-24 2008-05-02 Koninklijke Philips Electronics N.V. A system, method and computer-readable medium for displaying light radiation
WO2008053409A1 (en) * 2006-10-31 2008-05-08 Koninklijke Philips Electronics N.V. Control of light in response to an audio signal
US8461443B2 (en) 2006-10-31 2013-06-11 Tp Vision Holding B.V. Control of light in response to an audio signal
CN101536609A (en) * 2006-10-31 2009-09-16 皇家飞利浦电子股份有限公司 Control of light in response to an audio signal
EP2473004B1 (en) 2006-11-09 2015-04-29 Apple Inc. Brightness control of a status indicator light
US9084314B2 (en) 2006-11-28 2015-07-14 Hayward Industries, Inc. Programmable underwater lighting system
US8174488B2 (en) 2006-12-11 2012-05-08 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
WO2008072152A1 (en) * 2006-12-11 2008-06-19 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
CN101563958A (en) * 2006-12-20 2009-10-21 皇家飞利浦电子股份有限公司 A system, method and computer-readable medium for displaying light radiation
WO2008078233A1 (en) * 2006-12-20 2008-07-03 Koninklijke Philips Electronics N.V. A system, method and computer-readable medium for displaying light radiation
US8228353B2 (en) 2006-12-20 2012-07-24 Tp Vision Holding B.V. System, method and computer-readable medium for displaying light radiation
CN101563958B (en) * 2006-12-20 2013-07-31 Tp视觉控股有限公司 A system, method and computer-readable medium for displaying light radiation
CN101669406A (en) * 2007-04-24 2010-03-10 皇家飞利浦电子股份有限公司 Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
WO2008129505A1 (en) * 2007-04-24 2008-10-30 Koninklijke Philips Electronics N.V. Method, system and user interface for automatically creating an atmosphere, particularly a lighting atmosphere, based on a keyword input
US8374880B2 (en) 2007-04-24 2013-02-12 Koninklijke Philips Electronics N.V. System for automatically creating a lighting atmosphere based on a keyword input
WO2009048333A1 (en) * 2007-10-12 2009-04-16 JONQUIÈRE, Jan Light and sound column
NL2000926C2 (en) * 2007-10-12 2009-04-15 Jan Jonquiere Light and sound column.
US8427311B2 (en) 2008-01-17 2013-04-23 Koninklijke Philips Electronics N.V. Lighting device and method for producing sequential lighting stimuli
WO2009150592A1 (en) * 2008-06-12 2009-12-17 Koninklijke Philips Electronics N.V. System and method for generation of an atmosphere
GB2479280A (en) * 2008-10-29 2011-10-05 Jaroslav Nusl Method for controlling in particular lighting technology by audio signal and a device for performing this method
WO2010048907A1 (en) * 2008-10-29 2010-05-06 Jaroslav Nusl Method for controlling in particular lighting technology by audio signal and a device for performing this method
NL1036887C2 (en) * 2009-04-21 2010-10-22 Firstfocus B V LIGHTING DEVICE, LIGHTING DEVICE DRIVING DEVICE AND LIGHTING DEVICE SYSTEM.
US10718507B2 (en) 2010-04-28 2020-07-21 Hayward Industries, Inc. Underwater light having a sealed polymer housing and method of manufacture therefor
BE1019823A3 (en) * 2011-02-16 2013-01-08 Robrecht Karel V Noens METHOD AND DEVICE FOR CONVERTING AN AUDIO SIGNAL TO A CONTROL SIGNAL FOR AN AUDIO-VISUALIZATION SYSTEM AND AUDIO-VISUALIZATION SYSTEM.
US11011368B2 (en) 2012-01-04 2021-05-18 Nordson Corporation Microwave excited ultraviolet lamp system with data logging and retrieval circuit and method
WO2013113958A1 (en) * 2012-02-02 2013-08-08 Iborra Badia Gerardo Association of lighting effects with discrete frequency signals extracted from audio signals
EP2811220A4 (en) * 2012-02-02 2015-11-04 Badia Gerardo Iborra Association of lighting effects with discrete frequency signals extracted from audio signals
WO2014026659A3 (en) * 2012-08-15 2014-04-10 Univerzita Tomase Bati Ve Zline General-purpose single chip control device
WO2014111826A3 (en) * 2013-01-17 2014-11-13 Koninklijke Philips N.V. A controllable stimulus system and a method of controlling an audible stimulus and a visual stimulus
US10976713B2 (en) 2013-03-15 2021-04-13 Hayward Industries, Inc. Modular pool/spa control system
US11822300B2 (en) 2013-03-15 2023-11-21 Hayward Industries, Inc. Modular pool/spa control system
FR3011356A1 (en) * 2013-09-30 2015-04-03 Fivefive DEVICE AND METHOD FOR MULTIMEDIA RENDERING
EP3102004A4 (en) * 2013-12-24 2017-11-22 AG Inc. Lighting device and frame with said lighting device attached thereto
CN105874885A (en) * 2013-12-24 2016-08-17 Ag有限公司 Lighting device and frame with said lighting device attached thereto
EP3099143A1 (en) * 2015-05-29 2016-11-30 Helvar Oy Ab Method and arrangement for creating lighting effects
US10272014B2 (en) 2016-01-22 2019-04-30 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11720085B2 (en) 2016-01-22 2023-08-08 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US10363197B2 (en) 2016-01-22 2019-07-30 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11096862B2 (en) 2016-01-22 2021-08-24 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11129256B2 (en) 2016-01-22 2021-09-21 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11122669B2 (en) 2016-01-22 2021-09-14 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US20200319621A1 (en) 2016-01-22 2020-10-08 Hayward Industries, Inc. Systems and Methods for Providing Network Connectivity and Remote Monitoring, Optimization, and Control of Pool/Spa Equipment
US10219975B2 (en) 2016-01-22 2019-03-05 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11000449B2 (en) 2016-01-22 2021-05-11 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US20170213451A1 (en) 2016-01-22 2017-07-27 Hayward Industries, Inc. Systems and Methods for Providing Network Connectivity and Remote Monitoring, Optimization, and Control of Pool/Spa Equipment
US11202354B2 (en) 2016-09-14 2021-12-14 Lutron Technology Company Llc Illumination system and method that presents a natural show to emulate daylight conditions with smoothing dimcurve modification thereof
EP3513626A4 (en) * 2016-09-14 2020-04-22 Lutron Ketra, LLC Illumination system for controlling color temperature as a function of brightness
US11202352B2 (en) 2016-09-14 2021-12-14 Lutron Technology Company Llc Illumination device for adjusting color temperature based on brightness and time of day
US11641706B2 (en) 2016-09-14 2023-05-02 Lutron Technology Company Llc Illumination system and method that presents a natural show to emulate daylight conditions with smoothing dimcurve modification thereof
US11930570B2 (en) 2016-09-14 2024-03-12 Lutron Technology Company Llc Illumination device for adjusting color temperature based on brightness and time of day
US10731831B2 (en) 2017-05-08 2020-08-04 Gemmy Industries Corp. Clip lights and related systems
CN109729622A (en) * 2017-10-27 2019-05-07 金志鹏 Audio playback-based lighting control system and method
US11140761B2 (en) 2018-02-26 2021-10-05 Signify Holding B.V. Resuming a dynamic light effect in dependence on an effect type and/or user preference
CN111742620A (en) * 2018-02-26 2020-10-02 昕诺飞控股有限公司 Restarting dynamic light effects according to effect type and/or user preference
CN111742620B (en) * 2018-02-26 2023-08-01 昕诺飞控股有限公司 Restarting dynamic light effects based on effect type and/or user preferences
WO2019162193A1 (en) 2018-02-26 2019-08-29 Signify Holding B.V. Resuming a dynamic light effect in dependence on an effect type and/or user preference
US11168876B2 (en) 2019-03-06 2021-11-09 Hayward Industries, Inc. Underwater light having programmable controller and replaceable light-emitting diode (LED) assembly
US11754268B2 (en) 2019-03-06 2023-09-12 Hayward Industries, Inc. Underwater light having programmable controller and replaceable light-emitting diode (LED) assembly
US12060989B2 (en) 2019-03-06 2024-08-13 Hayward Industries, Inc. Underwater light having a replaceable light-emitting diode (LED) module and cord assembly
US11871495B2 (en) 2020-07-14 2024-01-09 Lutron Technology Company Llc Lighting control system with light show overrides

Also Published As

Publication number Publication date
ES2443571T3 (en) 2014-02-19
US20020038157A1 (en) 2002-03-28
ATE539593T1 (en) 2012-01-15
EP2364067A2 (en) 2011-09-07
JP2004501497A (en) 2004-01-15
AU2001270018A1 (en) 2002-01-02
EP2364067B1 (en) 2013-12-11
ES2380075T3 (en) 2012-05-08
EP1295515A1 (en) 2003-03-26
HK1054839A1 (en) 2003-12-12
US7228190B2 (en) 2007-06-05
EP2364067A3 (en) 2011-12-14
EP1295515B1 (en) 2011-12-28
JP4773673B2 (en) 2011-09-14

Similar Documents

Publication Publication Date Title
EP2364067B1 (en) Method and apparatus for controlling a lighting system in response to an audio input
US7353071B2 (en) Method and apparatus for authoring and playing back lighting sequences
US20080140231A1 (en) Methods and apparatus for authoring and playing back lighting sequences
US7139617B1 (en) Systems and methods for authoring lighting sequences
JP4230145B2 (en) System and method for authoring lighting sequences
EP1729615B1 (en) Entertainment lighting system
US20050275626A1 (en) Entertainment lighting system
JP5836591B2 (en) Method and apparatus for facilitating the design, selection and / or customization of lighting effects or lighting shows
US20050077843A1 (en) Method and apparatus for controlling a performing arts show by an onstage performer
US20040252486A1 (en) Creating and sharing light shows
CN111869330B (en) Rendering dynamic light scenes based on one or more light settings
EP3808158B1 (en) Method and controller for selecting media content based on a lighting scene
WO2023144269A1 (en) Determining global and local light effect parameter values
JP6691608B2 (en) Lighting control device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref country code: JP

Ref document number: 2002 504188

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 2001948546

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2001948546

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642