JP4230145B2 - System and method for authoring lighting sequences - Google Patents

System and method for authoring lighting sequences

Info

Publication number
JP4230145B2
Authority
JP
Japan
Prior art keywords
lighting
effect
sequence
user
lighting unit
Prior art date
Legal status
Active
Application number
JP2001510276A
Other languages
Japanese (ja)
Other versions
JP2003504829A (en)
Inventor
Kevin J. Dowling
Michael K. Blackwell
Frederick M. Morgan
Ihor A. Lys
Original Assignee
Philips Solid-State Lighting Solutions, Inc.
Priority date
Filing date
Publication date
Family has litigation
Priority to US14379099P
Priority to US60/143,790
Application filed by Philips Solid-State Lighting Solutions, Inc.
Priority to PCT/US2000/019274 (WO2001005195A1)
Publication of JP2003504829A
First worldwide family litigation filed
Publication of JP4230145B2
Application granted
Application status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHTING NOT OTHERWISE PROVIDED FOR
    • H05B 37/00 Circuit arrangements for electric light sources in general
    • H05B 37/02 Controlling
    • H05B 37/029 Controlling a plurality of lamps following a preassigned sequence, e.g. theater lights, diapositive projector

Description

[0001]
This application claims the benefit of U.S. provisional application No. 60/143,790, filed July 14, 1999.
[0002]
[Field of the Invention]
The present invention relates generally to systems and methods for controlling lighting systems, and more particularly to computerized systems and methods for designing lighting sequences and performing such sequences for lighting systems.
[0003]
[Background of the invention]
Most modern lighting controllers are designed to control white (or monochromatic) light in theater or high-end commercial environments. A light that produces monochromatic light, such as white, blue, or red, varies primarily along a single dimension, namely brightness, from off to full intensity. Current controllers allow the user to specify the brightness of each light over time.
[0004]
This approach becomes considerably more complex for lights that can change the color of the emitted light, because the resulting color and intensity are a combination of the intensities of three component primaries, each of which can be set independently of the others for a particular light. The output to be specified at each point in time is thus a three-dimensional function rather than a one-dimensional one, and the effort and time involved in producing an effect increase significantly. U.S. Pat. No. 5,307,295 describes a system for generating lighting sequences that simplifies certain aspects of the process, but many of the parameters must still be specified for each light in much the same way as on a standard lighting control board. A more intuitive way to design lighting sequences would not only simplify and speed up the design process, but would also allow users to design lighting sequences with less of the training and experience that is often required today.
[0005]
In addition, although sequences can be generated and played back by traditional methods, the contents of a sequence normally proceed with time and are not modified during playback. For example, if a dramatic scene builds for some period of time and then calls for a flash of lighting, this effect is usually achieved either by synchronizing the performance with a pre-programmed flash so that it coincides with the critical moment, or by manually triggering the flash at the critical moment. Such techniques either place considerable trust in chance or exclude reliance on automation.
[0006]
Techniques that allow an intuitive approach to designing lighting sequences would reduce the time and training required to achieve a desired effect and would allow colored light to be used effectively with minimal effort. Furthermore, a method of executing such lighting sequences that permits flexibility during playback would allow increased freedom in the associated performance, or would make it possible to use programmed lighting sequences in situations that are inherently unpredictable.
[0007]
[Summary of Invention]
The systems and methods described herein relate to an intuitive interface for the design of lighting sequences, for example by providing a visual representation of a sequence as it is being designed. Furthermore, the systems and methods described herein relate to the playback of programmed lighting sequences that can be modified during playback, for example based on external stimuli or cues.
[0008]
A system for authoring a lighting sequence according to the principles of the present invention includes an authoring interface that displays information representing a plurality of lighting effects, and a sequence authoring module that allows a user to select a lighting unit to execute a lighting effect, a lighting effect start time, and a lighting effect stop time.
[0009]
In accordance with the principles of the present invention, a method for authoring a lighting sequence executable by a processor includes providing a processor interface that includes information representing a plurality of lighting effects; receiving information representing a lighting unit; receiving information representing a first lighting effect to be performed by the lighting unit; receiving information representing a start time of the first lighting effect; and receiving information representing a stop time of the first lighting effect.
[0010]
In another aspect, a system for controlling a plurality of lighting units according to the principles of the present invention includes a data interface that receives instructions for controlling the plurality of lighting units, a signal interface that receives external signals, a processor that converts the instructions into a data stream and varies that conversion based on a received external signal, and a data output that transmits the data stream to the plurality of lighting units.
[0011]
In another aspect, a method for controlling a plurality of lighting units according to the principles of the present invention may include receiving instructions for controlling the plurality of lighting units, receiving an external signal, converting the instructions into a data stream based on the received external signal, and transmitting the data stream to the plurality of lighting units.
[0012]
In another aspect, a method for controlling a plurality of lighting units according to the principles of the present invention may include receiving instructions comprising a main lighting effect and a secondary lighting effect designed to be performed in place of the main lighting effect under a predetermined condition, sending commands to the lighting units to perform the main lighting effect, receiving a signal indicative of the predetermined condition, and sending commands to the lighting units to perform the secondary lighting effect.
[0013]
In another aspect, a method for controlling a plurality of lighting units according to the present invention includes receiving instructions to perform a timed sequence of lighting effects, performing the sequence of lighting effects using the plurality of lighting units, receiving an external signal, and altering the execution of the sequence of lighting effects.
[0014]
The following drawings illustrate certain illustrative embodiments of the invention, wherein like reference numerals indicate like components. The illustrated embodiments are to be understood as illustrative of the invention and not limiting in any way.
[0015]
Detailed Description of Exemplary Embodiments
The following description relates to some exemplary embodiments of the invention. Although many variations of the invention may be envisioned by those skilled in the art, such variations and modifications are intended to be within the scope of this disclosure; accordingly, the scope of the invention should in no way be limited by the following description. The term "sequence" or "lighting sequence" as used herein refers to a sequential display as well as a non-sequential display, a flow-controlled display, an interrupt-driven or event-driven display, or any other controlled, overlaid, or sequenced display using one or more lights.
[0016]
The systems and methods described herein relate to a system such as a processor 10 that supports a software application having an interface 15, as illustrated in FIG. 1. A lighting program 20 can include one or more lighting sequences that can be executed by a lighting controller 30, which in turn controls one or more lighting units 40. The term "sequence" in the context of this disclosure refers to any pattern, show, sequence, arrangement, or set of commands used to operate a lighting unit or other device through the system. One skilled in the art will recognize that a sequence need not be an ordered series and need not have a linear design; a sequence with non-linear, priority-based, and/or overlapping instructions is still a sequence. The software application can be a free-standing application, such as a C++ or Fortran program or other executable code and/or an executable image or library, or it may be executed in connection with, or be accessible through, a web browser, for example as a Java applet or an HTML web page. The processor 10 may be any system that processes signals or data, and should be understood to encompass microprocessors, microcontrollers, other integrated circuits, computer software, computer hardware, electrical circuits, application-specific integrated circuits, personal computers, chips, and other devices capable of providing processing functions, or combinations thereof. For example, the processor 10 may be a conventional IBM PC workstation running a Windows operating system, a Sun workstation running a version of the Unix operating system such as Solaris, or any other suitable data processing platform. The lighting controller 30 can communicate with the lighting units 40 by radio frequency (RF), ultrasound, audible sound, infrared (IR), light, microwave, laser, electromagnetic waves, or any other transmission or connection method or system. Any suitable protocol may be used for transmission, including DMX, RS-485, RS-232, pulse-width-modulated signals, or any other suitable protocol. The lighting units 40 may be gas bulbs, LEDs, fluorescent tubes, halogen lamps, lasers, or any other type of light source, and each lighting unit may be configured to be associated with a predetermined address that is unique to that lighting unit or that overlaps with the addresses of other lighting units. In certain embodiments, a single component may allow a user both to generate a lighting program and to control the lighting units. The present invention is intended to encompass this and other variations of the system illustrated in FIG. 1 that can be used to perform the methods described below. In certain embodiments, the functionality of the software application can be provided by a hardware device, such as a chip or card, or by any other system that can provide any of the functions described herein.
[0017]
In accordance with the method 200 of generating a lighting sequence shown in FIG. 2, the user may select from a set of predetermined "stock" effects 210. Stock effects function as individual components or building blocks that are useful in assembling a sequence. In addition, the user may assemble a particular sequence and add that sequence to the stock effects, eliminating the need to recreate the component each time the effect is desired. For example, a set of stock effects may include a dimming effect and a brightening effect. The user can assemble a pulsing effect by specifying an alternation of the dimming and brightening effects and include the pulsing effect in the set of stock effects. Thereafter, each time a pulsing effect is required, the stock effect can be used without repeatedly selecting a dimming effect and a brightening effect to achieve the same result. In certain embodiments, stock effects may also be generated by the user in a programming language such as Java, C, C++, or any other suitable language. Effects can be added to the set of stock effects by providing them as plug-ins, by including them in an effects file, or by any other technique suitable for organizing effects in a manner that allows additions, deletions, or changes to the set.
[0018]
In addition, the user may select an effect and indicate the time at which the effect should begin (220). For example, the user may indicate that a brightening effect should begin three minutes after the sequence starts. The user may also select an effect end time or effect duration (230). Thus, by indicating that the effect should end five minutes after the sequence starts, or equivalently that the effect should continue for two minutes, the user can set the time parameters of the brightening effect. Additional parameters appropriate to a particular effect may also be specified by the user (240). For example, a brightening or dimming effect can be further defined by an initial brightness and a final brightness. The rate of change can be predetermined, for example a linear rate that dims over the assigned time interval, or it can be varied by the user, for example to dim slowly at the beginning and more rapidly toward the end, or according to any other scheme specified by the user. Similarly, the pulsing effect described above may instead be characterized by a maximum brightness, a minimum brightness, and a periodicity or rate of change. The mode of alternation can also be changed by the user; for example, the change in brightness can follow a sinusoidal function or an alternating linear ramp. In embodiments using color-changing lights, parameters such as an initial color, a final color, and a rate of change may be specified by the user. Many additional effects and appropriate parameters for them are known to, or will be apparent to, those skilled in the art and fall within the scope of this disclosure.
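As a rough illustration of how such timing and brightness parameters might be captured, the following Java sketch models a single brightening-effect entry and linearly interpolates its brightness at a given moment. The class and field names are illustrative assumptions, not part of the patent.

    // Hypothetical sketch only; names and structure are illustrative, not from the patent.
    public class BrighteningEntry {
        final int lightingUnit;     // address of the lighting unit that performs the effect
        final long startMs;         // effect start time, ms after the sequence begins
        final long durationMs;      // effect duration (stop time = startMs + durationMs)
        final int startBrightness;  // 0..255
        final int endBrightness;    // 0..255

        public BrighteningEntry(int unit, long startMs, long durationMs,
                                int startBrightness, int endBrightness) {
            this.lightingUnit = unit;
            this.startMs = startMs;
            this.durationMs = durationMs;
            this.startBrightness = startBrightness;
            this.endBrightness = endBrightness;
        }

        // Brightness at time t (ms after sequence start), using a linear rate of change.
        public int brightnessAt(long t) {
            if (t <= startMs) return startBrightness;
            if (t >= startMs + durationMs) return endBrightness;
            double f = (t - startMs) / (double) durationMs;
            return (int) Math.round(startBrightness + f * (endBrightness - startBrightness));
        }
    }

A non-linear rate of change, as described above, could be obtained by replacing the linear interpolation with any user-specified curve.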
[0019]
In certain embodiments, the user may specify a transition between two effects that occur one after the other. For example, when a pulsing effect is followed by a dimming effect, the pulsing effect may end abruptly, with little or no blending, or it may dim gradually, with less and less difference between the maximum and minimum brightness toward the end of the effect. The manner of transitioning between these or other effects can be determined by the user for each transition, for example by selecting a transition effect from a set of predetermined transition effects, or by setting transition parameters for the end of one effect and/or the beginning of the next.
[0020]
In further embodiments, the user may specify multiple lighting effects for the same lighting unit that overlap in time or location. These overlapping effects can interact additively or subtractively, so that multiple effects can be combined. For example, a user can superimpose a brightening effect on a pulsing effect, which raises the minimum brightness of the pulse so that a slow pulsing effect gradually grows into a steady light.
[0021]
In another embodiment, overlapping lighting effects can have priorities or cues attached to them, which allows a particular lighting unit to change effects when it receives a cue. The cue can be any type of cue received externally or internally to the system, including, but not limited to: user-initiated cues such as a manual switch or bump button; user-defined cues such as a particular keystroke combination or a timing key that allows an effect to be tapped or paced; cues generated by the system, such as an internal clocking mechanism, an internal memory mechanism, or a software-based mechanism; cues generated by analog or digital devices attached to the system, such as clocks, external light sensors, music synchronizers, audio level detectors, or manual devices (e.g., switches); and cues received over a transmission medium such as a wire or cable, an RF signal, or an IR signal, including cues received from lighting units attached to the system. Priorities allow the system to select a default-priority effect, which is the effect used by the lighting unit if a particular cue is not received; when the cue is received, the system commands the use of a different effect. The change in effect can occur only temporarily, while the cue is present or for a defined period, after which the lighting unit returns to the original effect or waits for a new cue to select a new effect; or the change can be permanent, not allowing a return to other effects or further reception of cues; or it may be priority-based. Alternatively, the system can select an effect based on the status of the cue and the importance of the desired effect. For example, if an audio sensor detects a sudden noise, it can trigger a high-priority alarm lighting effect that overrides all existing effects, which would otherwise continue to execute. The priority may also be state-dependent, so that a cue selects an alternative effect or is ignored depending on the current state of the system.
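A minimal sketch of this priority behavior in Java: each lighting unit keeps a default effect and temporarily switches to a higher-priority effect when a cue arrives, falling back when the cue expires. The class and method names here are assumptions for illustration, not the patent's API.

    // Illustrative sketch of cue-driven effect selection; not the patent's API.
    import java.util.PriorityQueue;

    public class CueSelector {
        // A candidate effect with a priority and an expiry time (ms).
        static class Candidate implements Comparable<Candidate> {
            final String effectName;
            final int priority;       // higher value wins
            final long expiresAtMs;   // Long.MAX_VALUE for the permanent default effect
            Candidate(String name, int priority, long expiresAtMs) {
                this.effectName = name; this.priority = priority; this.expiresAtMs = expiresAtMs;
            }
            public int compareTo(Candidate o) { return Integer.compare(o.priority, priority); }
        }

        private final PriorityQueue<Candidate> candidates = new PriorityQueue<>();

        public CueSelector(String defaultEffect) {
            candidates.add(new Candidate(defaultEffect, 0, Long.MAX_VALUE));
        }

        // Called when a cue (switch, audio level, clock, etc.) arrives.
        public void onCue(String effectName, int priority, long nowMs, long durationMs) {
            candidates.add(new Candidate(effectName, priority, nowMs + durationMs));
        }

        // The effect the lighting unit should run at time nowMs.
        public String currentEffect(long nowMs) {
            candidates.removeIf(c -> c.expiresAtMs <= nowMs);  // expired cues fall back
            return candidates.peek().effectName;               // highest priority remaining
        }
    }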
[0022]
In certain embodiments, the output of one effect can be programmed to depend on a second effect. For example, the effect assigned to one lighting unit may be a random color effect, and the effect assigned to a second lighting unit may be designed to match the color of the random color effect. Alternatively, one lighting unit can be programmed to perform an effect, such as a flash effect, whenever a certain condition is met, such as a second lighting unit being turned off. Even more complex configurations, such as an effect that starts when another effect meets a certain condition and matches the color of a third effect, or some proportion of it, can be produced by this scheme. Other combinations of effects in which at least one parameter or occurrence of an effect depends on a parameter or occurrence of a second effect will be apparent to those skilled in the art and are intended to fall within the scope of this disclosure.
[0023]
In yet other embodiments, the systems and methods described herein allow lighting sequences to be affected by external inputs during execution. For example, a lighting sequence or effect may be programmed to start upon receipt of a trigger signal, to take priority when a signal is received, or to repeat or continue until a signal is received, and so on. Thus, instead of assigning a specific start time to an effect or sequence, the user may instead indicate that the effect or sequence is to begin upon receipt of a certain stimulus. In addition, during authoring, the user may specify two or more effects for overlapping or simultaneous time periods and assign different priorities or conditions to the effects, which determine which effects are executed during playback. In yet another embodiment, the user can link the parameters of an effect to external inputs, including analog, digital, and manual inputs, so that the color, speed, or other attribute of the effect may depend, for example, on signals from external devices that measure volume, brightness, temperature, pitch, inclination, wavelength, or any other suitable condition. Thus, the selection of lighting sequences, effects, or parameters can be determined or influenced by input from an external source such as a user, a chronometer, a device, or a sensor.
[0024]
In event-driven embodiments, such as those using external inputs and those using the outputs of other effects as inputs, menus may be provided to define the inputs and their results. For example, a palette of predetermined inputs can be provided to the user. Each input, such as a specified transducer or the output of another effect, can be selected and placed within the authored lighting sequence as a trigger for a new effect or for a modification of an existing effect. Known inputs may include, for example, thermistors, clocks, keyboards, numeric keypads, musical instrument digital interface (MIDI) inputs, DMX control signals, TTL or CMOS logic signals, other visual or audible signals, or any other protocols, standards, or other signaling or control techniques having a predetermined format, whether analog, digital, manual, or otherwise. The palette may also include custom inputs, represented for example as icons in the palette or as selections in drop-down menus. Custom inputs allow the user to define the voltage, current, duration, and/or type (i.e., sinusoidal, pulsed, stepped, modulated) of an input signal that will act as a control or trigger in the sequence.
[0025]
For example, a theater lighting sequence may include programmed lighting sequences and special effects in the order in which they occur, but require an input at specified times before the next sequence, or a part of it, is executed. In this way, scene changes can occur on a cue from a director, producer, stage manager, or other party, rather than automatically as a function of timing alone. Similarly, effects that must be timed with actions on stage, such as lights brightening when an actor flips a switch, or a dramatic lighting flash, can be cued precisely by the director, producer, stage manager, or even the actors themselves, thereby reducing the difficulty and risk of relying solely on pre-programmed timing.
[0026]
Input from sensors can also be used to modify a lighting sequence. For example, an optical sensor can be used to correct the brightness of the lights, for example to maintain a constant lighting level regardless of the amount of sunlight entering a room, or to ensure that a lighting effect remains noticeable despite the presence of other light sources. A motion sensor or other detector may be used as a trigger to initiate or change a lighting sequence; for example, a lighting sequence used for advertising or display purposes may be programmed to change when a person approaches a sales counter or display. A temperature sensor can also provide an input. For example, the color of a freezer light can depend on the temperature, giving a blue light to indicate a cold temperature and gradually shifting to red as the temperature rises, until a dangerous temperature is reached, at which point the light can be programmed to flash or another warning effect can begin. Similarly, an alarm system can provide a signal that triggers a lighting sequence or effect serving as an alarm, a danger signal, or another indication. More generally, an interactive lighting sequence can be created in which the effect performed varies according to the position, movement, or other behavior of a person.
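As a concrete sketch of the freezer example, the Java fragment below maps a temperature reading to a color that fades from blue to red and flashes above a warning threshold. The thresholds, names, and flash rate are hypothetical.

    // Hypothetical fragment: temperature-driven color, as in the freezer example above.
    public class FreezerLight {
        static final double COLD_C = -20.0;  // fully blue at or below this (placeholder value)
        static final double WARN_C = 4.0;    // at or above this, flash red as a warning

        // Returns {r, g, b} for the current temperature; flashes when too warm.
        public static int[] colorFor(double tempC, long nowMs) {
            if (tempC >= WARN_C) {
                boolean on = (nowMs / 500) % 2 == 0;        // roughly 1 Hz flash
                return on ? new int[] {255, 0, 0} : new int[] {0, 0, 0};
            }
            double f = (tempC - COLD_C) / (WARN_C - COLD_C); // 0 = cold, 1 = warm
            f = Math.max(0.0, Math.min(1.0, f));
            int r = (int) Math.round(255 * f);
            int b = (int) Math.round(255 * (1.0 - f));
            return new int[] {r, 0, b};                      // blue fades toward red
        }
    }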
[0027]
In certain embodiments, the user may provide information representing the number and type of lighting units and the spatial relationships between them. For example, as illustrated in FIG. 3, an interface 300 may be provided, such as a grid or other two-dimensional array, that allows the user to arrange icons or other representative components to represent the arrangement of the lighting units being used. In one embodiment, illustrated in FIG. 3, the interface 300 gives the user a choice among multiple standard types of lighting units 310, such as indirect lights, lamps, and spotlights, for example by means of a menu, palette, or toolbar. The user can then select and arrange lighting units on the interface, for example in a layout space 320, in an arrangement that resembles the physical arrangement of the actual lighting units.
[0028]
In certain embodiments, the lighting units can be organized into groups, for example to facilitate operation of a very large number of lighting units. The lighting units may be grouped on the basis of spatial relationships, functional relationships, lighting unit types, or any other scheme desired by the user. The spatial arrangement can make it easier to enter and execute lighting effects. For example, if a group of lights is arranged in a row and this information is provided to the system, the system can implement effects such as a rainbow wash or sequential flashes without the user having to specify a separate, individual program for each lighting unit. Any of the implementations or effects of the types described above can be applied to a group of units as well as to a single lighting unit. The use of groups also allows the user to enter a single command or cue to control a given selection of lighting units.
[0029]
Lighting sequences can be tested or run on the lighting system so that the user can experience the effects that have been generated. Further, the interface 300 can play back a user-generated lighting sequence by recreating the programmed effects as if the icons on the interface were the lighting units to be controlled. Thus, if a lighting unit in the sequence has been specified to brighten gradually to an intermediate intensity, the icon representing that lighting unit may start out black and gradually lighten to gray during playback. Similarly, color changes, blinking, and other effects can be represented visually on the interface. This feature allows the user to preview a fully or partially generated lighting sequence on a monitor or other video terminal, providing a highly interactive method of show generation, and may allow playback to be paused and the lighting sequence modified before resuming playback. In further embodiments, the system provides fast-forward, reverse, rewind, or other functions to allow editing of any part of the lighting sequence. In yet another embodiment, the system can use additional interface functions known to those skilled in the art, including, but not limited to, non-linear editing such as that used in Adobe products, or devices and controls such as scroll bars, drag bars, or other devices or controls.
[0030]
An alternative interface 400 for playing back lighting sequences is shown in FIG. 4. The interface 400 includes displays of lighting elements 410 and a playback control section 420. Other techniques for visualizing lighting sequences will be apparent to those skilled in the art and may be employed without departing from the scope and spirit of this disclosure.
[0031]
Interfaces that represent lighting sequences can also be used during the entry of lighting sequences. For example, a grid such as the interface 15 of FIG. 1 may be used, in which the available lighting units are represented along one axis and time is represented along a second axis. Thus, when the user specifies that a lighting unit is to brighten gradually to an intermediate intensity, the portion of the grid defined by that lighting unit, the start time, and the end time can appear black at one end and gradually lighten to gray at the other end. In this way, effects can be represented visually on the interface as the lighting sequence is generated. In certain embodiments, effects that are difficult to express with a static display, such as blinking or random color changes, can be represented kinetically on the interface, for example by blinking or randomly changing the color of the corresponding grid portion. An example of an interface 500 representing a sequence for a combination of three lighting units is shown in FIG. 5. The time chart 510 visually shows the output of each of the three lighting units at each moment in time along the time axis 515. At a glance, users can easily determine which effects are assigned to which lighting units at which times, which simplifies the coordination of effects across multiple lighting units and allows a lighting sequence to be reviewed quickly.
[0032]
FIG. 5 also shows a palette 520 containing stock effects from which the user can select a lighting effect; however, other techniques for providing a set of stock effects, such as menus or toolbars, may be employed in the systems and methods described herein. The palette 520 provides icons for stock effects including a fixed color effect 552, a cross fade 554 between two colors, a random color effect 558, a color high effect 560, a chasing rainbow effect 565, a strobe effect 564, and a flash effect 568. This list is by no means exhaustive, and other types of effects can be included, as will be apparent to those skilled in the art. To assign an effect to a lighting unit, the user can select the effect from the palette and select the range of the grid corresponding to one or more appropriate lighting units and the desired time interval for the effect. Additional parameters can be set by entering a numerical value, for example in a parameter entry field 525, by selecting an option from a palette, menu, or toolbar, by drawing a vector, or by any other suitable technique known to those skilled in the art. Other interfaces and techniques for entering lighting sequences, suitable for performing some or all of the various functions described herein, can be used and are intended to be encompassed by the scope of this disclosure.
[0033]
The methods described above can easily be adapted to control units other than lighting units. For example, in a theater setting, computer-controllable smoke machines, sound effects, wind generators, curtains, foam generators, projectors, stage machinery, stage elevators, pyrotechnic devices, backdrops, and any other features can be controlled by sequences as described herein. In this way, multiple events can be automated and timed together. For example, the user may program the lighting to begin to brighten as the curtain rises, followed by a cannon sound effect as fog rolls onto the stage. In the home, for example, a program can turn on the lights at 7:00, sound an alarm, and turn on the coffee maker fifteen minutes later. A holiday lighting arrangement on a tree or house can be synchronized with the movement of mechanical ornamental figures or with a musical recording. An exhibit or amusement ride can combine precipitation, wind, sound, and light in a simulated thunderstorm. A greenhouse, livestock shed, or other environment for growing living organisms can synchronize ambient lighting with automated feeding and watering devices. Any combination of electromechanical devices can be timed and/or coordinated by the systems and methods described herein. Such devices can be represented on the sequence-authoring interface, for example as additional lines on the grid, one line for each separately controlled component, or by any other suitable means. The effects of these other devices can also be represented visually to the user. For example, continued operation of a smoke generator could cloud over other parts of the grid, a coffee maker could be represented by a small picture on the interface that appears to make coffee when the coffee-making action occurs on the device, and the interface could show a bar that slowly changes color as feed is dispensed in a barn. Other such static or dynamic representations will be readily apparent to those skilled in the art and are all encompassed within this disclosure.
[0034]
In certain embodiments in which a lighting unit can be moved, for example by sliding, rotating, or tilting, the user may include instructions for moving the lighting unit. This function can be achieved by any suitable means. For example, if the lighting unit includes a motor or other system capable of producing motion, the desired motion can be executed by selecting it from a set of motion effects, in the same way as described above for lighting effects. Thus, for example, a lighting unit that can rotate on its base can be selected, and a rainbow wash effect can be programmed to occur simultaneously with a rotational motion effect. In other embodiments, the lighting units may be mounted on movable platforms, which can be controlled independently of the lighting, for example by providing additional lines on the grid interface as described above. Motion effects may also have parameters, such as speed and amount (e.g., angle, distance, etc.), that can be specified by the user. Such combinations of illumination and motion are useful in a wide range of situations, such as light shows, planetarium presentations, moving spotlights, and any other scenario in which programmable moving light is desirable.
[0035]
Similarly, instructions for controlling objects placed between the lighting unit and the object being illuminated, such as shielding plates, stencils, filters, lenses, diaphragms, and any other objects through which light passes, can be provided by the user in accordance with the systems and methods described herein. In this way, an even wider array of lighting effects can be designed and programmed for later execution.
[0036]
One embodiment of the systems and methods described herein is a computer system, such as the processor 10 shown in FIG. 1, configured by a computer program, written for example in an interpreted or compiled computer language such as Fortran, C, Java, or C++, to design and generate lighting sequences in accordance with the systems and methods described herein. In an alternative embodiment, the systems and methods described herein are associated with a disk, CD, or other permanent computer-readable storage medium encoding a computer program capable of executing part or all of a program that allows the user to generate or design a lighting sequence that can be used to control multiple lighting units.
[0037]
The lighting sequence can be recorded on a compact disk, floppy disk, hard drive, magnetic tape, volatile or non-volatile solid-state memory device, or any other permanent computer-readable storage medium. A lighting sequence can be stored in a format that represents the effects and their parameters as generated by the user, in a format representing the final data stream suitable for directly controlling the lighting units or other devices, or in any other format suitable for performing the lighting sequence, or converted between such formats. In embodiments in which the sequence is stored as a data stream, the system may allow the user to select from a group of data formats such as DMX, RS-485, or RS-232. Furthermore, lighting sequences can be linked to one another, for example so that another sequence is executed at the end of one sequence, or so that a main sequence coordinates the execution of multiple sub-sequences, for example randomly, or based on external signals, conditions, or time. In certain embodiments, the lighting sequence 20 may be executed directly from the processor 10, while in other embodiments the lighting sequence 20 may be executed using a controller 30 as described below.
[0038]
As shown in FIG. 6, the controller 30 may be used to execute a lighting sequence 20 that was programmed, designed, or generated on a different device. Because the controller 30 provides a narrower range of functions than the processor used to generate the sequence, it can include less hardware and be less expensive than a more complex system that enables authoring, includes a video monitor, or provides other auxiliary functions. The controller 30 may receive the lighting sequence 20 through any suitable loader interface 610, for example an interface for reading a storage medium such as a compact disk, diskette, magnetic tape, or smart card, or an interface for receiving a transmission from another system, such as a serial port, USB port, parallel port, IR receiver, or other connection over which the lighting sequence 20 is received. In certain embodiments, the lighting sequence 20 can be transmitted over the Internet. The controller 30 may also include an interface for communicating with the plurality of lighting units 40.
[0039]
The controller 30 begins executing the lighting sequence 20 when the sequence is loaded, when a command or signal is received from a user, device, or sensor, at a specified time, or under any other suitable condition. The starting conditions can be included in the lighting sequence 20 or determined by the configuration of the controller 30. Further, in certain embodiments, the controller may begin executing the lighting sequence 20 at some point in the middle of the sequence. For example, when the controller 30 receives a request from the user, the lighting sequence 20 can be executed starting from a point three minutes after the beginning of the sequence, or from any other specified point, such as the fifth effect. The controller 30 may pause playback when it receives a signal from a user, device, or sensor, and resume playback from the pause point when an appropriate signal is received. The controller may continue to execute the lighting sequence 20 until the sequence is complete, until a command or signal is received from a user, device, or sensor, for a specified time, or until any other suitable condition is met.
[0040]
The controller 30 may include a storage device, database, or other suitable module 620 for storing a plurality of predetermined stock effects and instructions for converting them into a data format, such as DMX, RS-485, or RS-232, suitable for controlling the plurality of lighting units. The memory module 620 can be preconfigured with a set of stock effects, it can receive effects and instructions from the lighting sequence 20, or it can contain a preconfigured set of stock effects supplemented by additional effects supplied by the lighting sequence 20. By preconfiguring the memory module 620 with a set of stock effects, the memory required to store the lighting sequence 20 can be reduced, because the lighting sequence 20 can omit conversion instructions for effects that are already configured in the controller 30. In embodiments in which the lighting sequence 20 includes stock effects designed by the author, appropriate instructions are included in the lighting sequence 20 and can be stored in the memory module 620, for example when the lighting sequence 20 is loaded or executed.
[0041]
The controller 30 can include an external interface 650 through which the controller 30 can receive external signals useful for modifying the execution of the lighting sequence 20. For example, the external interface 650 can include a user interface, which can include switches, buttons, dials, sliders, consoles, keypads, sensors, or any other device by which a user can provide commands or signals to the controller 30 or otherwise affect the execution or output of the lighting sequence 20. The external interface 650 may receive time information from one or more chronometers, such as a local time module 660, which may act as a counter that measures time from a predetermined starting point, such as when the controller 30 is turned on or the counter is reset, or such as a date and time module 665 that tracks the current date and time. In addition, the controller 30 may receive commands or signals from one or more external devices via an external input 6680. Such a device can be coupled directly to the controller 30, or the signal can be received by the controller via an IR sensor or other suitable interface. The signals received by the controller 30 can be compared with, or interpreted using, a cue table 630, which may include information associating various inputs with outputs designed by the person who created the lighting sequence 20 to affect its execution or output. Thus, if the controller 30 compares an input to the cue table 630 and determines that a certain condition has been met or that an indicated signal has been received, the controller 30 may vary the execution or output of the lighting sequence 20 as instructed by the program.
[0042]
In certain embodiments, the controller may respond to external signals in ways that are not determined by the content and instructions of the lighting sequence 20. For example, the external interface 650 may include a dial, slider, or other mechanism that allows the user to change the rate at which the lighting sequence 20 progresses, for example by changing the speed of the local time counter 660 or by changing the controller 30's interpretation of that counter. Similarly, the external interface 650 may include a mechanism by which the user can adjust the brightness, color, or other characteristics of the output. In certain embodiments, the lighting sequence 20 may include instructions for receiving the parameters of an effect from a mechanism on the external interface 650 or another user interface, allowing the user to control the output of a lighting unit, or of the system as a whole, through that particular effect during playback.
[0043]
The controller 30 may also include a transient memory 640. The transient memory 640 may contain temporary information, such as the current state of each lighting unit under its control, which may be useful as a basis for the execution of the lighting sequence 20. For example, as described above, some effects may use the output of another effect to define a parameter, and such an effect can look up the output of the other effect stored in the transient memory 640. Those skilled in the art will recognize other situations in which the transient memory 640 may be useful, and such uses are intended to be encompassed by this disclosure.
[0044]
The controller 30 may send the data generated by the execution of the lighting sequence 20 to the lighting units by providing the data to a network output 680, optionally through an output buffer 670. Signals to additional devices may be sent via the network output 680 or, when convenient or desirable, through a separate external output 662. Data can be sent via data connection members such as wires or cables, by IR or RF transmission, by other suitable methods of data transfer, or by any combination of methods capable of controlling the lighting units and/or other devices.
[0045]
In certain embodiments, the controller 30 may not communicate directly with the lighting units; instead, it may communicate with one or more sub-controllers, which in turn communicate with the lighting units or with sub-controllers at a further level. The use of sub-controllers allows the computational requirements to be distributed. An example of a system using this type of distributed scheme is described in U.S. Pat. No. 5,769,525, where it is disclosed as a "master/slave" control system. For the systems and methods described herein, communication between the various levels can be unidirectional, with the controller 30 providing instructions or subroutines to be executed by the sub-controllers, or bidirectional, with the sub-controllers relaying information back to the controller 30, for example for synchronization, for effects that depend on the output of other effects as described above, or for any other purpose.
[0046]
While the above description sets out one particular configuration of the controller 30, other configurations for accomplishing the same or similar functions will be apparent to those skilled in the art, and such variations and modifications are intended to be encompassed by the present invention. The following example describes one embodiment of the controller 30, as described above, in more detail.
[0047]
The following describes one embodiment of a controller according to the systems and methods described herein, as illustrated in FIG. 6, including show design and format, management of external inputs and outputs, show interpretation and execution, and generation of DMX-compliant output. The controller architecture of this embodiment uses a Java-based object-oriented design, but other object-oriented, structured, or other programming languages may be used with the present invention.
[0048]
The controller architecture allows effects to be based on external environmental conditions or other inputs. An effect is a predetermined output involving one or more lighting units. For example, fixed colors, color washes, and rainbow washes are all kinds of effects. An effect can be further defined by one or more parameters, which specify, for example, the lights to be controlled, the colors to be used, the speed of the effect, or other aspects of the effect. The environment refers to any external information that can be used as an input to modify or control an effect, such as the current time, external inputs such as switches, buttons, or other transducers that can generate control signals, or events generated by other software or effects. Finally, an effect can include one or more states, so that the effect can retain information over time. The combination of state, environment, and parameters fully defines the output of an effect at any moment and over time.
[0049]
In addition, the controller can implement effect priorities. For example, different effects can be assigned to the same light. With a priority scheme, only the highest-priority effect determines the light's output. When multiple effects control a light with the same priority, the final output may be an average or other combination of the effect outputs.
[0050]
The lighting sequences described above can be deployed as program fragments. Such a fragment can be compiled into an intermediate format, for example by compiling the program into byte code using an available Java compiler. In this byte code form, a fragment may be referred to as a sequence. A sequence can be interpreted or executed by the controller 30. Sequences are not self-contained programs; they adhere to a defined format, such as the instantiation of objects from classes, that the controller 30 can use to generate effects. When a sequence is downloaded into the controller 30 (via a serial port, infrared port, smart card, or some other interface), the controller 30 interprets the sequence and executes it based on time or input stimuli.
[0051]
The building blocks from which shows are generated are effect objects. An effect object combines one specific effect, such as a color wash, cross fade, or fixed color, with initial parameters (such as which lights to control, the starting color, the wash period, etc.) and with instructions for generating output based on inputs (such as time, environmental conditions, or results from other effect objects). The sequence contains all the information needed to generate every effect object for the show. The controller 30 instantiates all the effect objects once, when the show is started, and then activates each one periodically and in turn. Based on the state of the entire system, each effect object can determine programmatically whether there are lights it is controlling and how they should change.
[0052]
The runtime environment software that executes on the controller 30 may be referred to as the conductor. The conductor downloads sequences, creates and maintains the list of effect object instances, manages the interfaces to external inputs and outputs (including DMX), manages the time clocks, and is responsible for periodically calling each effect object. The conductor also maintains a scratch memory that the objects can use to communicate with each other.
[0053]
The controller 30 may maintain two different but synchronized representations of time. The first is LocalTime (local time), which is the number of milliseconds since the controller 30 was turned on. LocalTime can be represented as a 32-bit integer that rolls over after reaching its maximum value. The other time representation is DateTime (date and time), which is a defined structure holding the time of day (at one-second resolution) and the day, month, and year.
[0054]
LocalTime can be used by effects to calculate relative changes, such as the color change since the last run in a color wash effect. A LocalTime roll-over should not lead to failure or malfunction. The conductor may provide utility functions for common operations such as computing time deltas.
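Because LocalTime is a 32-bit millisecond counter that rolls over, a delta computed with ordinary two's-complement subtraction remains correct across the wrap for intervals shorter than about 24.8 days. A minimal sketch of such a utility in Java (the conductor's own DeltaTime(int last), described later, presumably performs an equivalent calculation against the current LocalTime):

    // Sketch of a roll-over-safe time delta for a 32-bit millisecond counter.
    public final class TimeUtil {
        // Milliseconds elapsed from 'last' to 'now', correct even if the counter wrapped,
        // because integer subtraction wraps the same way the counter does.
        public static int deltaTime(int now, int last) {
            return now - last;
        }
    }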
[0055]
An effect object can be an instance of the Effect class. Each effect object is created by subclassing Effect and can provide two public methods to produce the desired effect: a constructor and a run() method.
[0056]
The constructor is called by the sequence when an instance of the effect is created. It takes whatever number and types of parameters are necessary to produce the desired variation of the effect. The authoring software is responsible for generating the appropriate constructor parameters when it generates the sequence.
[0057]
The first argument to the constructor may be an integer identifier (ID). The ID can be assigned by the show authoring software and can be unique.
[0058]
The constructor can call super() to perform any conductor-specific initialization.
The Effect class can also contain next and prev members, which are used by sequences and the conductor to maintain a linked list of effects. These members should not be accessed internally by the effect methods.
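Under the conventions just described (an integer ID as the first constructor argument, a call to super(), and a run() method invoked periodically by the conductor), a fixed-color effect might look roughly like the sketch below. The run() signature, the super(id) call, and the way the effect reaches the DMX interface are assumptions; the description does not spell them out.

    // Sketch of an Effect subclass following the conventions above.
    // The run() signature and the DMX output calls are assumptions for illustration.
    public class FixedColorEffect extends Effect {
        private final int light;          // logical light number to control
        private final int r, g, b;        // fixed color
        private final int priority;       // priority used when writing channel data

        public FixedColorEffect(int id, int light, int r, int g, int b, int priority) {
            super(id);                    // assumed conductor-specific initialization
            this.light = light;
            this.r = r; this.g = g; this.b = b;
            this.priority = priority;
        }

        // Called periodically by the conductor.
        public void run(Conductor conductor) {
            DMXInterface dmx = conductor.GetUniverse(0);
            int channel = conductor.LightToDMX(light);   // first channel of this light
            dmx.SetChannels(channel, 3, new int[] {r, g, b}, priority);
        }
    }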
[0059]
Typical effects can be used over and over again. These typical effects can be provided by the conductor to minimize the size of sequence storage and downloads. Typical effects can be further subclassed if desired.
[0060]
A sequence is the usual means of bundling together all the information necessary to create a show. A sequence has only one required public method, init(), which is called once by the conductor before the show is performed. The init() method can instantiate all the effects used by the show, passing the ID and any parameters as constructor arguments. The init() method can then link the effect objects together in a linked list and return that list to the conductor.
[0061]
The linked list of effect objects is maintained through the next and prev members. The prev member of the first object is nil (zero) and the next member of the last object is nil. The first effect is returned as the value of init().
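A corresponding Sequence might instantiate its effects in init(), chain them with LinkEffect(), and return the head of the list, along the lines of this sketch. Whether init() receives a conductor reference, and the effect classes and parameters used here, are assumptions; FixedColorEffect is the class sketched above.

    // Sketch of a Sequence whose init() builds and returns the linked list of effects.
    public class DemoSequence extends Sequence {
        public Effect init(Conductor conductor) {
            Effect first  = new FixedColorEffect(1, /*light*/ 0, 0, 0, 255, /*priority*/ 1);
            Effect second = new FixedColorEffect(2, /*light*/ 1, 255, 0, 0, /*priority*/ 1);

            conductor.LinkEffect(first, second);  // first.next = second, second.prev = first
            // first.prev and second.next remain nil, marking the ends of the list.
            return first;                         // the conductor iterates from here
        }

        public String getSequenceInfo() {
            return "DemoSequence 1.0 (illustrative only)";
        }
    }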
[0062]
An optional dispose() method will be called when the sequence is deactivated. This method can be used to clean up any resources allocated by the sequence. Automatic garbage collection can be relied on to handle any allocated memory. The base class dispose() will walk the linked list and free the effect objects, so if dispose() is subclassed it may be necessary to call super().
[0063]
An optional public method, String getSequenceInfo(), can be used to return version and copyright information. It may be desirable to implement some additional getSequence*() routines to return information that could be useful to the controller or user interface.
[0064]
A sequence may require additional supporting classes. These can be included with the sequence object in a file such as a JAR (Java Archive) file. The JAR file can then be downloaded to the conductor. Tools for JAR files are part of the standard Java development kit.
[0065]
All DMX communication can be handled by the DMXInterface (DMX interface) class. Each instance of DMXInterface controls one DMX region (universe). The DMXInterface base class can be subclassed to communicate via a specific type of hardware interface (serial, parallel, USB).
[0066]
A "channel" is a single data byte at a specific location in the DMX region. A "frame" is all of the channels in that region. The number of channels in the DMX region is specified when the class is instantiated.
[0067]
Internally, DMXInterface holds three buffers, each the length of the number of channels: the last frame of channel data sent, the next frame of channel data waiting to be sent, and the most recent priority of the data for each channel. Effect modules can modify the channel data waiting to be sent via the SetChannel() method, and the conductor causes the frame to be sent via SendFrame().
[0068]
When an effect object sets data for a particular channel, it can also assign a priority to that data. If the priority is higher than the priority of the data last set for that channel, the new data replaces the old data. If the priority is lower, the old value is left in place. If the priorities are equal, the new data value is added to the current sum and the counter for that channel is incremented. When a frame is sent, the sum of the data values for each channel is divided by the channel counter to produce an average value for the highest-priority data.
[0069]
After each frame is sent, the channel priorities can be reset to zero. If no new data is written to a given channel, the channel retains its last value, which is also copied to the last-sent buffer in case any effect object is interested in it.
[0070]
A typical DMXInterface can provide the following methods:
DMXInterface(int num_channels) is a constructor that sets up a DMX region of num_channels (24..512) channels. When subclassed, the constructor can take additional arguments specifying hardware port information.
[0071]
void SetChannel(int channel, int data, int priority) sets the data (0..255) to be sent for that channel if the priority is higher than the current data priority. The method can throw error-handling exceptions such as ChannelOutOfRange and DataOutOfRange.
[0072]
void SetChannels(int first_channel, int num_channels, int data[], int priority) sets num_channels of data to be sent, starting at first_channel, from the array data. The method can throw error-handling exceptions such as ChannelOutOfRange, DataOutOfRange, and ArrayIndexOutOfBounds.
[0073]
int GetChannelLast(int channel) returns the last data sent for the channel. The method can throw error-handling exceptions such as ChannelOutOfRange or NoDataSent.
[0074]
void SendFrame(void) causes the current frame to be sent. This is accomplished via a separate thread so that processing by the conductor does not pause. If a frame is already in progress, it is terminated and a new frame is started.
[0075]
int FrameInProgress(void) returns zero if no frame is currently being sent. If a frame is in progress, it returns the number of the last channel sent.
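Pulling the channel, priority, and averaging rules of paragraphs [0067] to [0069] together, a subclassable DMXInterface skeleton might look like the sketch below. The buffer layout and the transmit() hook are assumptions, and SendFrame() is shown synchronously for brevity even though the description calls for a separate thread.

    // Sketch of the channel-priority and averaging behavior described above.
    public abstract class DMXInterface {
        protected final int numChannels;
        protected final int[] lastFrame;   // last frame of channel data sent
        protected final int[] nextFrame;   // data waiting to be sent (sums when priorities tie)
        protected final int[] priority;    // most recent priority written per channel
        protected final int[] count;       // how many equal-priority writes were summed

        protected DMXInterface(int numChannels) {
            this.numChannels = numChannels;
            lastFrame = new int[numChannels];
            nextFrame = new int[numChannels];
            priority  = new int[numChannels];
            count     = new int[numChannels];
        }

        public void SetChannel(int channel, int data, int prio) {
            if (channel < 0 || channel >= numChannels) throw new IllegalArgumentException("ChannelOutOfRange");
            if (data < 0 || data > 255) throw new IllegalArgumentException("DataOutOfRange");
            if (prio > priority[channel]) {            // higher priority replaces old data
                nextFrame[channel] = data;
                priority[channel] = prio;
                count[channel] = 1;
            } else if (prio == priority[channel]) {    // equal priority accumulates for averaging
                nextFrame[channel] += data;
                count[channel]++;
            }                                          // lower priority is ignored
        }

        public void SendFrame() {
            for (int c = 0; c < numChannels; c++) {
                if (count[c] > 0) lastFrame[c] = nextFrame[c] / count[c];  // average of ties
                // channels with no new data retain their last value
                priority[c] = 0;
                count[c] = 0;
                nextFrame[c] = 0;
            }
            transmit(lastFrame);                       // hardware-specific, done in a subclass
        }

        public int GetChannelLast(int channel) { return lastFrame[channel]; }

        protected abstract void transmit(int[] frame); // serial, parallel, USB, etc.
    }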
[0076]
The conductor is the runtime component of the controller that ties together the various data and input elements. The conductor downloads sequences, manages the user interface, manages sequences via the time clocks and other external inputs, and activates the effect objects.
[0077]
The technique for downloading a sequence JAR file into the conductor can vary depending on the hardware and transport mechanism. Various Java tools can be used to interpret the JAR format. In one embodiment, the sequence object and the various required classes can be loaded into memory along with a reference to the sequence object.
[0078]
In one embodiment, more than one sequence object can be loaded into the conductor, but only one sequence can be active at a time. The conductor can activate a sequence based on an external input, such as the user interface or the date and time.
[0079]
If a sequence is already active, the dispose() method is called for the already active sequence before the new sequence is activated.
[0080]
To make a sequence active, its init() method is called and executed to completion.
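A minimal sketch of this activation handshake, assuming a Sequence type with init() and dispose() methods as described, might look as follows; the Conductor class and its field names are illustrative.

  // Illustrative activation logic: dispose of the active sequence, then initialize the new one.
  import java.util.List;

  interface Effect { void run(); }

  interface Sequence {
      List<Effect> init();    // runs to completion and returns the effects list
      void dispose();         // releases the sequence's resources
  }

  class Conductor {
      private Sequence activeSequence;
      private List<Effect> effects = List.of();

      void activate(Sequence next) {
          if (activeSequence != null) {
              activeSequence.dispose();   // dispose() is called for the already active sequence
          }
          activeSequence = next;
          effects = next.init();          // init() is called and executed to completion
      }
  }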
The controller may provide methods that measure time. The time value can be accessed via the GetLocalTime() and GetDateTime() methods. Other inputs can be enumerated and accessed by reference integers. All input values can also be mapped to integers. The GetInput(int ref) method returns the value of input ref and can throw exceptions such as NoSuchInput.
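These time and input accessors can be summarized in an interface sketch such as the following; the interface name and the exception spelling are assumptions based on the description.

  // Illustrative summary of the controller's time and input services described above.
  interface ControllerServices {
      int GetLocalTime();                                   // local time value
      int GetDateTime();                                    // date-and-time value
      int GetInput(int ref) throws NoSuchInputException;    // value of input "ref"
  }

  class NoSuchInputException extends Exception {}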
[0081]
The effects list can be generated and returned by the init() method. At fixed intervals, the conductor can call the run() methods of the effects sequentially.
[0082]
The interval can be specific to the particular controller hardware and can be changed, for example, by an external interface. If execution of the effects list does not finish within one interval period, the next iteration can be delayed until the next interval time. An effect object need not operate at every interval to calculate its change, but can instead use the difference between the current time and the previous time.
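Continuing the illustrative conductor sketch, the loop below shows one way to run the effects list at a fixed interval and to delay the next iteration when an iteration overruns; the interval value and the method name are assumptions.

  // Illustrative fixed-interval loop for the conductor.
  void runLoop(java.util.List<Effect> effects, long intervalMillis) throws InterruptedException {
      long next = System.currentTimeMillis();
      while (true) {
          for (Effect e : effects) {
              e.run();                               // each effect derives its change from elapsed time
          }
          next += intervalMillis;
          long wait = next - System.currentTimeMillis();
          if (wait > 0) {
              Thread.sleep(wait);                    // finished early: wait for the next interval
          } else {
              next = System.currentTimeMillis();     // overran: delay the next iteration to the next interval time
          }
      }
  }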
[0083]
Effects can be designed to minimize the use of processing power so that the entire effects list can be run quickly. If an effect requires a very large amount of computation, it can start a low-priority thread to do the task. While the thread is running, the run() method can return immediately, so the lights remain unchanged. When the run() method detects that the thread has terminated, it can use the result to update the light output.
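One possible realization of this pattern, shown only as a sketch, is for the effect to hold a low-priority worker thread and poll it from run(); all names below are assumptions.

  // Illustrative heavy-computation effect: run() returns immediately while a
  // low-priority worker thread computes, then applies the result once the thread finishes.
  class HeavyEffect {
      private Thread worker;
      private volatile int[] computedLevels;      // result handed back by the worker thread

      // Called by the conductor on every iteration of the effects list.
      public void run() {
          if (worker == null) {
              worker = new Thread(() -> computedLevels = expensiveComputation());
              worker.setPriority(Thread.MIN_PRIORITY);   // low-priority background task
              worker.start();
              return;                                    // leave immediately; lights stay unchanged
          }
          if (!worker.isAlive() && computedLevels != null) {
              updateLightOutput(computedLevels);         // thread finished: use the result
              computedLevels = null;
              worker = null;                             // ready for the next computation
          }
      }

      private int[] expensiveComputation() { return new int[] {255, 128, 0}; }   // placeholder
      private void updateLightOutput(int[] rgb) { /* e.g. set channel data on the DMX interface */ }
  }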
[0084]
The scratch memory allows various effects to communicate with each other. Like an external input, a memory element can be an integer. A memory element is referenced by two pieces of information: the ID of the effect that generated the information, and a reference integer that is unique within that effect. The accessor methods are:
void SetScratch (int effect_id, int ref_num, int value)
int GetScratch (int effect_id, int ref_num)
[0085]
Both methods can throw error handling exceptions such as NoSuchEffect and NoSuchReference exceptions.
The effects can operate in any order. Effects that use results from other effects may expect to receive results from previous iterations.
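A minimal sketch of such a scratch memory, assuming integer values stored in a map keyed on the effect ID and reference number, is shown below; the storage strategy is an assumption, and only the NoSuchReference case is modeled.

  // Illustrative scratch memory: integer values keyed by (effect ID, reference number).
  class ScratchMemory {
      private final java.util.Map<Long, Integer> values = new java.util.HashMap<>();

      private long key(int effectId, int refNum) {
          return ((long) effectId << 32) | (refNum & 0xFFFFFFFFL);   // pack both ints into one key
      }

      void setScratch(int effectId, int refNum, int value) {
          values.put(key(effectId, refNum), value);
      }

      int getScratch(int effectId, int refNum) throws NoSuchReferenceException {
          Integer v = values.get(key(effectId, refNum));
          if (v == null) throw new NoSuchReferenceException();   // unknown effect/reference pair
          return v;
      }
  }

  class NoSuchReferenceException extends Exception {}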
[0086]
Additional routines may include:
The int DeltaTime(int last) method calculates the time change between the current time and the time last.
[0087]
The DMX_Interface GetUniverse(int num) method returns the DMX_Interface object associated with universe number num. This value should not change while the sequence is running, so it can be cached. The method can throw error handling exceptions such as NoSuchUniverse exceptions.
[0088]
The int[] HSBtoRGB(int hue, int sat, int bright) method converts hue (0-1535), saturation (0-255), and brightness (0-255) to red/green/blue values, which are written to the first three components of the resulting array. The method can throw error handling exceptions such as ValueOutOfRange exceptions.
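For example, the conversion can proceed along the lines of the following sketch, which treats the 0-1535 hue range as six 256-step sectors; this particular integer arithmetic is an assumption about one reasonable implementation, and a standard IllegalArgumentException stands in for the ValueOutOfRange exception named above.

  // Illustrative HSB-to-RGB conversion: hue 0..1535, saturation and brightness 0..255.
  static int[] HSBtoRGB(int hue, int sat, int bright) {
      if (hue < 0 || hue > 1535 || sat < 0 || sat > 255 || bright < 0 || bright > 255)
          throw new IllegalArgumentException("value out of range");   // stands in for ValueOutOfRange
      int sector = hue / 256;        // which of the six color sectors
      int f = hue % 256;             // position within the sector, 0..255
      int p = bright * (255 - sat) / 255;
      int q = bright * (255 - sat * f / 255) / 255;
      int t = bright * (255 - sat * (255 - f) / 255) / 255;
      switch (sector) {
          case 0:  return new int[] {bright, t, p};   // red to yellow
          case 1:  return new int[] {q, bright, p};   // yellow to green
          case 2:  return new int[] {p, bright, t};   // green to cyan
          case 3:  return new int[] {p, q, bright};   // cyan to blue
          case 4:  return new int[] {t, p, bright};   // blue to magenta
          default: return new int[] {bright, p, q};   // magenta to red
      }
  }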
[0089]
The int LightToDMX(int light) method returns the DMX address of a light given its logical light number. The method can throw error handling exceptions such as DMXAddressOutOfRange exceptions.
[0090]
The void LinkEffect(Effect a, Effect b) method sets a.next = b and b.prev = a.
Each controller may have a configuration file that is used by the show authoring software. The configuration file contains mappings between input reference integers and more useful descriptions of their functions and values, e.g., Input2 = "Slider" range = (0-99). The configuration file can also contain other useful information such as the number of DMX universes.
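By way of example only, such a configuration file might take a simple key/value form like the following; the exact syntax is not specified in the description, and this layout, including the Universes and Input1 entries, is an assumption.

  # Hypothetical controller configuration file (layout assumed, not specified)
  Universes = 2                        # number of DMX universes provided by this controller
  Input1 = "Button" range = (0-1)      # illustrative additional input mapping
  Input2 = "Slider" range = (0-99)     # the example mapping given in the description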
[0091]
The following is an example of code representing a lighting sequence authored according to the principles of the present invention. It will be understood that the following examples are in no way limiting.
[0092]
Example 1
[0093]
[Table 1]
[0094]
[Table 2]
[0095]
[Table 3]
All the articles, patents and other references mentioned above are hereby incorporated by reference. Although the invention has been disclosed in connection with the embodiments shown and described in detail, various equivalents, modifications and improvements will be apparent to those skilled in the art from the foregoing description. Such equivalents, modifications, and improvements are intended to be encompassed by the following claims.
[Brief description of the drawings]
FIG. 1 illustrates a system for generating a lighting sequence and performing that lighting sequence for a plurality of lighting units as described herein.
FIG. 2 represents an exemplary method for generating a lighting effect as described herein.
FIG. 3 illustrates an exemplary interface describing the arrangement of lighting units.
FIG. 4 represents an alternative interface for graphically reproducing a lighting sequence.
FIG. 5 represents an exemplary interface for generating an illumination sequence as described herein.
FIG. 6 illustrates one embodiment of a controller that performs an illumination sequence as described herein.

Claims (33)

  1. A system for generating a lighting sequence (20), comprising a processor (10) configured to design or generate the lighting sequence by providing:
    a display interface (15, 300, 400, 500) adapted to display color information representative of a plurality of lighting effects; and
    a sequence authoring interface (310, 320, 420, 520, 525) adapted to allow a user to select a lighting effect, at least one lighting unit that performs the lighting effect, a lighting effect start time, and a lighting effect stop time;
    wherein the display interface is adapted to display a grid, the lighting unit being represented along one axis of the grid and time being represented along a second axis of the grid, and wherein the display interface is adapted to visually represent the selected lighting effect by reproducing its color on the area of the grid defined by the lighting unit, the start time, and the stop time associated with the selected lighting effect.
  2.   The system of claim 1, wherein the sequence authoring interface is adapted to receive information representative of the arrangement of a plurality of lighting units, and the display interface is adapted to visually display a representation of the arrangement of the plurality of lighting units based on the received information.
  3.   The system according to claim 1 or 2, wherein the display interface is adapted to regenerate a lighting sequence generated by a user.
  4.   The system according to claim 1, 2 or 3, wherein the lighting unit is one of a plurality of lighting units and each lighting unit is associated with a unique address.
  5.   The system according to claim 1, wherein the lighting unit comprises an LED lighting unit capable of emitting light in a range of different colors.
  6.   The system according to any of claims 1-5, further comprising a storage medium adapted to store user selections.
  7.   The system according to any of claims 1-6, wherein the sequence authoring interface (500) is adapted to allow a user to select a color (552) for a selected lighting effect.
  8.   The system according to any of the preceding claims, wherein the sequence authoring interface (500) is adapted to allow a user to select a start color and a stop color for a selected lighting effect.
  9.   The system according to any of the preceding claims, wherein the sequence authoring interface is adapted to allow a user to select a transition effect for a transition between a first lighting effect and a second lighting effect.
  10.   The system according to any of the preceding claims, wherein the sequence authoring interface is adapted to allow a user to determine a priority for a first lighting effect that shares temporal overlap with a second lighting effect.
  11.   The system according to any of claims 1 to 10, wherein the sequence authoring interface is adapted to allow a user to determine brightness for a selected lighting effect.
  12.   The system according to any of claims 1 to 11, wherein the sequence authoring interface is adapted to allow a user to issue a command to initiate a selected lighting effect based on an external stimulus.
  13.   The system according to any of the preceding claims, wherein the sequence authoring interface is adapted to allow a user to determine the movement of the lighting unit.
  14.   The system according to any of claims 1 to 13, wherein the sequence authoring interface is adapted to allow a user to design at least one user-configured lighting effect.
  15.   The system according to any of the preceding claims, further comprising a controller (30) adapted to perform a lighting sequence so as to control at least one lighting unit.
  16.   The system of claim 15, wherein the controller includes at least one storage medium (620) that stores the lighting sequence in a data format representing a data stream that can directly control at least one lighting unit.
  17. A method of creating a lighting sequence (20) executable by a controller (30), comprising:
    displaying color information representing a plurality of lighting effects;
    selecting a lighting effect including a plurality of colors for the lighting sequence based on the displayed information;
    selecting at least one lighting unit for performing the lighting effect;
    selecting a start time for the selected lighting effect; and
    selecting a stop time for the selected lighting effect;
    the method further comprising displaying a grid, wherein the lighting unit is represented along one axis of the grid and time is represented along a second axis of the grid, and reproducing and visually representing the color of the selected lighting effect on a region of the grid defined by the lighting unit, the start time and the stop time associated with the selected lighting effect.
  18. The method of claim 17, further comprising receiving information representative of an arrangement of a plurality of lighting units, and displaying a representation of the arrangement of the plurality of lighting units based on the received information.
  19. The method according to claim 17 or 18, further comprising selecting a second lighting unit and selecting a lighting effect to be performed by the second lighting unit.
  20. The method of any of claims 17 to 19, further comprising storing a user selection on an electronic storage medium.
  21. The method according to any of claims 17 to 20, further comprising selecting at least one color for the selected lighting effect.
  22. The method according to any of claims 17 to 21, further comprising: selecting a second lighting effect of the lighting sequence based on the displayed information; selecting a start time for the second lighting effect; and selecting a stop time for the second lighting effect.
  23. The method of claim 22, further comprising selecting a transition effect between the selected lighting effect and the second lighting effect.
  24. The method of claim 19, 22 or 23, further comprising determining a priority for a plurality of selected lighting effects.
  25. The method according to any of claims 17 to 24, further comprising selecting a brightness for the selected lighting effect.
  26. The method according to any of claims 17 to 25, wherein selecting a lighting unit comprises selecting a plurality of lighting units for performing a selected lighting effect.
  27. The method according to any of claims 17 to 26, wherein selecting a lighting unit comprises selecting an LED lighting unit capable of emitting light of any of a range of colors.
  28. The method according to any of claims 17 to 27, wherein selecting a start time includes issuing an instruction to start a selected lighting effect based on an external stimulus.
  29. The method according to any of claims 18 to 28, wherein each lighting unit is associated with a unique address.
  30. The method of any of claims 17 to 29, further comprising specifying the motion of the selected lighting unit.
  31. The method according to any of claims 17 to 30, further comprising performing the lighting sequence so as to control at least one lighting unit.
  32. The method according to any of claims 17 to 31, further comprising storing the lighting sequence in a data format representing a data stream that can directly control at least one lighting unit.
  33. A computer readable medium encoded with at least one program which, when executed, implements the method of any of claims 17 to 32.
JP2001510276A 1999-07-14 2000-07-14 System and method for authoring lighting sequences Active JP4230145B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14379099P true 1999-07-14 1999-07-14
US60/143,790 1999-07-14
PCT/US2000/019274 WO2001005195A1 (en) 1999-07-14 2000-07-14 Systems and methods for authoring lighting sequences

Publications (2)

Publication Number Publication Date
JP2003504829A JP2003504829A (en) 2003-02-04
JP4230145B2 true JP4230145B2 (en) 2009-02-25

Family

ID=22505653

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001510276A Active JP4230145B2 (en) 1999-07-14 2000-07-14 System and method for authoring lighting sequences

Country Status (7)

Country Link
EP (3) EP1224845B1 (en)
JP (1) JP4230145B2 (en)
AT (3) AT431065T (en)
AU (1) AU6347300A (en)
DE (3) DE60042177D1 (en)
ES (3) ES2326744T3 (en)
WO (1) WO2001005195A1 (en)

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252486A1 (en) * 2001-07-23 2004-12-16 Christian Krause Creating and sharing light shows
DE10261028A1 (en) 2002-12-24 2004-07-08 Robert Bosch Gmbh Process for the transmission of location-related information
US7145558B2 (en) 2003-09-03 2006-12-05 Motorola, Inc. Selective illumination of regions of an electronic display
EP1787498A4 (en) * 2004-08-17 2010-03-24 Jands Pty Ltd Lighting control
MX2007008199A (en) 2005-01-06 2007-09-07 Johnson & Son Inc S C Method and apparatus for storing and defining light shows.
JP2009519489A (en) 2005-12-15 2009-05-14 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ System and method for creating an artificial atmosphere
DE102006019145A1 (en) * 2006-04-21 2007-10-25 Erco Leuchten Gmbh Light Control System
EP3406969A1 (en) 2006-11-28 2018-11-28 Hayward Industries, Inc. Programmable underwater lighting system
JP4872129B2 (en) * 2007-01-23 2012-02-08 レシップホールディングス株式会社 Dimming data creation method, dimming data creation program, and recording medium recording the dimming data creation program
TW200935972A (en) 2007-11-06 2009-08-16 Koninkl Philips Electronics Nv Light management system with automatic identification of light effects available for a home entertainment system
US8118447B2 (en) 2007-12-20 2012-02-21 Altair Engineering, Inc. LED lighting apparatus with swivel connection
US8360599B2 (en) 2008-05-23 2013-01-29 Ilumisys, Inc. Electric shock resistant L.E.D. based light
US7946729B2 (en) 2008-07-31 2011-05-24 Altair Engineering, Inc. Fluorescent tube replacement having longitudinally oriented LEDs
US8653984B2 (en) 2008-10-24 2014-02-18 Ilumisys, Inc. Integration of LED lighting control with emergency notification systems
US8324817B2 (en) 2008-10-24 2012-12-04 Ilumisys, Inc. Light and light sensor
US8901823B2 (en) 2008-10-24 2014-12-02 Ilumisys, Inc. Light and light sensor
US7938562B2 (en) 2008-10-24 2011-05-10 Altair Engineering, Inc. Lighting including integral communication apparatus
US8214084B2 (en) 2008-10-24 2012-07-03 Ilumisys, Inc. Integration of LED lighting with building controls
DE102008055938B4 (en) * 2008-11-05 2013-10-17 Insta Elektro Gmbh Procedure for running actions of actuators connected to a control module, plug-in and lighting control system with several such plug-ins
US8664880B2 (en) 2009-01-21 2014-03-04 Ilumisys, Inc. Ballast/line detection circuit for fluorescent replacement lamps
DE102009007525A1 (en) * 2009-02-05 2010-08-19 E:Cue Control Gmbh Control device for a plurality of light sources and lighting unit comprising a control device
US8330381B2 (en) 2009-05-14 2012-12-11 Ilumisys, Inc. Electronic circuit for DC conversion of fluorescent lighting ballast
US8299695B2 (en) 2009-06-02 2012-10-30 Ilumisys, Inc. Screw-in LED bulb comprising a base having outwardly projecting nodes
US8421366B2 (en) 2009-06-23 2013-04-16 Ilumisys, Inc. Illumination device including LEDs and a switching power control system
CA2794541C (en) 2010-03-26 2018-05-01 David L. Simon Inside-out led bulb
CA2792940A1 (en) 2010-03-26 2011-09-19 Ilumisys, Inc. Led light with thermoelectric generator
PL2554024T3 (en) * 2010-03-26 2016-09-30 Method of imposing a dynamic color scheme on light of a lighting unit
CA2794512A1 (en) 2010-03-26 2011-09-29 David L. Simon Led light tube with dual sided light distribution
US8454193B2 (en) 2010-07-08 2013-06-04 Ilumisys, Inc. Independent modules for LED fluorescent light tube replacement
JP2013531350A (en) 2010-07-12 2013-08-01 イルミシス,インコーポレイテッドiLumisys,Inc. Circuit board mount for LED arc tube
EP2633227B1 (en) 2010-10-29 2018-08-29 iLumisys, Inc. Mechanisms for reducing risk of shock during installation of light tube
US8870415B2 (en) 2010-12-09 2014-10-28 Ilumisys, Inc. LED fluorescent tube replacement light with reduced shock hazard
DE102011007416A1 (en) * 2011-04-14 2012-10-18 Trilux Gmbh & Co. Kg Luminaire and adapter for controlling the luminaire
US9072171B2 (en) 2011-08-24 2015-06-30 Ilumisys, Inc. Circuit board mount for LED light
JP2013131384A (en) * 2011-12-21 2013-07-04 Fujikom Corp Lighting apparatus control system
WO2013131002A1 (en) 2012-03-02 2013-09-06 Ilumisys, Inc. Electrical connector header for an led-based light
WO2014008463A1 (en) 2012-07-06 2014-01-09 Ilumisys, Inc. Power supply assembly for led-based light tube
US9271367B2 (en) 2012-07-09 2016-02-23 Ilumisys, Inc. System and method for controlling operation of an LED-based light
US9285084B2 (en) 2013-03-14 2016-03-15 Ilumisys, Inc. Diffusers for LED-based lights
US9267650B2 (en) 2013-10-09 2016-02-23 Ilumisys, Inc. Lens for an LED-based light
DE102013112127A1 (en) 2013-11-05 2015-05-07 Eaton Electrical Ip Gmbh & Co. Kg Multicolor signal arrangement, method for defining modes of a multi-color signal arrangement and system, comprising a multicolor signal arrangement and an RFID transmitter
EP3072364A1 (en) * 2013-11-18 2016-09-28 Philips Lighting Holding B.V. Method and system for providing a dynamic lighting effect to specular and refractive objects
KR20160111975A (en) 2014-01-22 2016-09-27 일루미시스, 인크. Led-based light with addressed leds
DE102014205301A1 (en) * 2014-03-21 2015-09-24 Zumtobel Lighting Gmbh Method for operating a lamp with a plurality of lamps or groups of lamps
WO2015169632A1 (en) * 2014-05-05 2015-11-12 Koninklijke Philips N.V. Lighting system and method
US9510400B2 (en) 2014-05-13 2016-11-29 Ilumisys, Inc. User input systems for an LED-based light
US10161568B2 (en) 2015-06-01 2018-12-25 Ilumisys, Inc. LED-based light with canted outer walls
US9807855B2 (en) 2015-12-07 2017-10-31 Pentair Water Pool And Spa, Inc. Systems and methods for controlling aquatic lighting using power line communication
US10219975B2 (en) 2016-01-22 2019-03-05 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
GB2550086B (en) 2016-04-08 2018-12-26 Rotolight Ltd Lighting system and control thereof
WO2018027297A1 (en) * 2016-08-12 2018-02-15 9255-7248 Québec Inc. Method and system for synchronizing lighting to music
US20190380188A1 (en) * 2016-11-25 2019-12-12 Signify Holding B.V. Lighting control

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5225667B1 (en) * 1971-04-18 1977-07-08
US5769527A (en) 1986-07-17 1998-06-23 Vari-Lite, Inc. Computer controlled lighting system with distributed control resources
US4980806A (en) * 1986-07-17 1990-12-25 Vari-Lite, Inc. Computer controlled lighting system with distributed processing
GB8727605D0 (en) * 1987-11-25 1987-12-31 Advanced Lighting Systems Scot Programmable control system
FR2628335B1 (en) * 1988-03-09 1991-02-15 Univ Alsace Installation for ensuring the rule of sound, light and / or other physical effects of a spectacle
US5307295A (en) * 1991-01-14 1994-04-26 Vari-Lite, Inc. Creating and controlling lighting designs
US5406176A (en) * 1994-01-12 1995-04-11 Aurora Robotics Limited Computer controlled stage lighting system
US5629587A (en) * 1995-09-26 1997-05-13 Devtek Development Corporation Programmable lighting control system for controlling illumination duration and intensity levels of lamps in multiple lighting strings
EP1040398B1 (en) * 1997-12-17 2018-02-21 Philips Lighting North America Corporation Digitally controlled illumination methods and systems

Also Published As

Publication number Publication date
ES2361969T3 (en) 2011-06-24
AT500714T (en) 2011-03-15
AT431065T (en) 2009-05-15
DE60023730T2 (en) 2006-07-06
DE60023730D1 (en) 2005-12-08
EP1224845A1 (en) 2002-07-24
EP1624728B1 (en) 2009-05-06
WO2001005195A1 (en) 2001-01-18
EP1624728A1 (en) 2006-02-08
EP1224845B1 (en) 2005-11-02
EP2139299A3 (en) 2010-01-20
JP2003504829A (en) 2003-02-04
AU6347300A (en) 2001-01-30
EP2139299B1 (en) 2011-03-02
EP2139299A2 (en) 2009-12-30
ES2251396T3 (en) 2006-05-01
DE60045697D1 (en) 2011-04-14
ES2326744T3 (en) 2009-10-19
DE60042177D1 (en) 2009-06-18
AT308869T (en) 2005-11-15

Similar Documents

Publication Publication Date Title
US10146398B2 (en) Generating a virtual-room of a virtual room-based user interface
US8853972B2 (en) Methods and apparatuses for operating groups of high-power LEDs
JP5529861B2 (en) Creation, recording and reproduction of lighting
EP1459600B1 (en) Controlled lighting methods and apparatus
ES2615130T3 (en) Integrated lighting control module and power switch
EP0673520B1 (en) Programmable lighting control system with normalized dimming for different light sources
JP5825561B2 (en) Interactive lighting control system and method
KR101143095B1 (en) Coordinating animations and media in computer display output
US8080819B2 (en) LED package methods and systems
US5790124A (en) System and method for allowing a performer to control and interact with an on-stage display device
US6466234B1 (en) Method and system for controlling environmental conditions
US7031920B2 (en) Lighting control using speech recognition
US7550931B2 (en) Controlled lighting methods and apparatus
US20130234598A1 (en) Wireless Lighting Control System
CN101341799B (en) User interface and method for control of light systems
JP2014056670A (en) Lighting control system
ES2463716T3 (en) Remote lighting control
US20040141321A1 (en) Lighting and other perceivable effects for toys and other consumer products
US5209560A (en) Computer controlled lighting system with intelligent data distribution network
US6801003B2 (en) Systems and methods for synchronizing lighting effects
EP0534710A1 (en) Computer controlled lighting system with intelligent data distribution networks
US6842650B2 (en) Method and system of programming at least one appliance to change state upon the occurrence of a trigger event
US20060002110A1 (en) Methods and systems for providing lighting systems
US8766556B2 (en) Remotely controllable track lighting system
US20060076908A1 (en) Lighting zone control methods and apparatus

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20051221

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20051221

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080318

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080613

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080715

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080922

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20081111

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20081203

R150 Certificate of patent or registration of utility model

Ref document number: 4230145

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111212

Year of fee payment: 3

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: R3D02

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: R3D04

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121212

Year of fee payment: 4

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131212

Year of fee payment: 5

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350