WO2010070517A1 - System for simulation of a lighting distribution - Google Patents

System for simulation of a lighting distribution

Info

Publication number
WO2010070517A1
WO2010070517A1 PCT/IB2009/055500
Authority
WO
WIPO (PCT)
Prior art keywords
data
light sources
illuminated environment
simulation
influence
Prior art date
Application number
PCT/IB2009/055500
Other languages
French (fr)
Inventor
Matthias Wendt
Robert Van Herk
Original Assignee
Philips Intellectual Property & Standards Gmbh
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Philips Intellectual Property & Standards Gmbh, Koninklijke Philips Electronics N.V. filed Critical Philips Intellectual Property & Standards Gmbh
Publication of WO2010070517A1 publication Critical patent/WO2010070517A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/06Special arrangements of screening, diffusing, or reflecting devices, e.g. in studio
    • G03B15/07Arrangements of lamps in studios
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • G06T15/506Illumination models
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/175Controlling the light source by remote control

Definitions

  • the invention relates to a system for simulation of a lighting distribution of one or more controllable light sources in an illuminated environment and a corresponding method.
  • Controllable light sources are used in lighting systems to create defined lighting distributions and are gaining significance in a variety of office and commercial lighting applications. Such applications include, for example, room lighting in department stores, restaurants, hotels or other environments where a defined lighting distribution needs to be created. Controllable light sources typically allow setting one or more parameters, e.g. brightness or colour, so that a defined lighting distribution can be created by setting one or more parameters without a change in the lighting infrastructure.
  • In known lighting systems, a user, e.g. a lighting designer, applies a set of control parameters to the light sources and then visually verifies the illumination in the environment. If necessary, the lighting designer further fine-tunes the parameters and checks the lighting distribution until the desired illumination is given.
  • Such a procedure is rather time-consuming and requires the lighting designer to be present at the site of the lighting system to obtain visual feedback and eventually fine-tune the control parameters.
  • simulation methods have been developed, which allow obtaining a preview of the lighting distribution, for example on a computer screen. Since a precise preview is needed, typically ray-tracing techniques are used for simulation of the lighting distribution. However, simulations using ray-tracing techniques require significant computational effort and time and are thus usually very slow, so that it is not possible for the lighting designer to apply changes to the parameters and to directly obtain real-time visual feedback for fine-tuning of the lighting distribution.
  • the basic idea of the invention is to provide a system and method for simulation of a lighting distribution of one or more controllable light sources, which uses influence data, representing the effect of said light sources on the illuminated environment, e. g. measurement data of the light sources, for generating a fast and accurate preview of the lighting distribution according to at least one control parameter.
  • the term "illuminated environment” refers to any physical (“real-world”) environment or surrounding or any section or spatial part of such environment, which is at least partly illuminated by at least one of said controllable light sources.
  • the illuminated environment is the part of the environment of the light sources, which is of interest to the user for the simulation, such as for example a specific point or a small area in the environment, the stage area of a theatre or even a special sales area, for example in a department store.
  • it is not necessarily required that said illuminated environment is entirely illuminated by each of said light sources, or that said light sources only illuminate said illuminated environment or section.
  • light distribution refers to the illumination in the illuminated environment in general and is sometimes also referred to as "lighting scene”, “lighting scenario” or “lighting atmosphere”.
  • the inventive system is advantageously not dependent on a certain type of light source and may be employed for the simulation of any kind of light source, provided that at least one parameter of each light source is controllable by sending a corresponding control command to the respective light source.
  • the inventive system may be used for simulation of the illumination of commercially available halogen, gas discharge or LED lighting units.
  • the control parameter may be the on/off state of the respective light source, but the parameter may also refer to further operational states, such as brightness, i.e. dimming state, color or position of the output beam.
  • the inventive system comprises at least an input interface and an image processing unit, which may, for example, be provided by a data processing system comprising one or more, preferably networked, computers, microcontrollers or any other type of suitable electronic circuitry.
  • the input interface is configured to obtain at least one or more simulation control parameters of said one or more light sources.
  • the inventive system uses these parameters to determine a graphical representation of said lighting distribution, i.e. the system determines how the illuminated environment would look if the controllable light sources were supplied with the simulation control parameters.
  • the one or more parameters do not need to refer to all of the controllable light sources; it is sufficient that a parameter relates to at least one light source.
  • the input interface may comprise or be connected to a suitable input device, such as a keyboard, mouse and touch screen or any other device for user input.
  • the input interface may also be configured to obtain the simulation control parameters over a computer network using a network interface, for example from a database server, or from a file on a suitable storage medium, such as an optical or magnetic medium or random access memory.
  • the input interface unit is further configured to obtain influence data, which data represent the effect of said one or more light sources on the illumination of said illuminated environment.
  • the term "effect" of the light sources may refer to any measurable value describing the impact of light sources on objects (e. g. reflecting walls, counters, etc.) within the observed space, i.e. within said illuminated environment. In a simple embodiment, this may be a geometric brightness distribution, describing the intensity of illumination of a certain object or area by a light source. Also, there may be spectral information, preferably relating to color, but not necessarily limited to the visible range. Generally, the effect may be written as p(x, y, z, lambda), where p is the power distribution measured at a geometric location x, y, z and lambda is the wavelength.
  • the influence data is given in a linear color space, such as linear RGB, RGBE or CIE XYZ.
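Because light adds linearly, influence data must be combined in a linear colour space; gamma-encoded display values must first be linearised. A minimal sketch of the standard sRGB decoding follows (an illustrative assumption, since the patent names linear RGB, RGBE and CIE XYZ but prescribes no specific transfer function):

```python
def srgb_to_linear(c):
    """Convert one gamma-encoded sRGB channel value in [0, 1] to linear light
    using the standard IEC 61966-2-1 decoding."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Additivity of light contributions holds only in the linear space:
a, b = srgb_to_linear(0.5), srgb_to_linear(0.25)
combined = a + b  # physically meaningful sum of two light contributions
```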
  • the influence data may thus be formed by any type of information, which renders possible a mapping between at least one control parameter and the effect of the control parameter on the illumination of the illuminated environment.
  • Said influence data may preferably be obtained directly from one or more corresponding detector units, e.g. digital cameras, observing the illuminated environment.
  • the influence data may also be obtained from a database, file or from a network system, especially in case the inventive system is located remote from the light sources.
  • the input interface may also be configured to allow entering the influence data manually using the afore-mentioned input devices. Details on the acquisition of influence data are disclosed in WO 2008/001259, which is incorporated herein by reference.
  • the input interface may further provide storage, such as random access memory, magnetic or optical devices, or any other suitable device for storing the obtained data and/or simulation control parameters.
  • the image processing unit determines a graphical representation of the simulated lighting distribution using said influence data and said one or more simulation control parameters.
  • the term "graphical representation" refers to any representation of the lighting distribution which enables a preview of the lighting distribution to be displayed to the user, e.g. using a computer device.
  • the graphical representation may for example comprise one or more images, photometric information or a model of the illuminated environment.
  • the processing unit determines the influence data, which describes the impact of the obtained simulation control parameter on the illuminated environment. If multiple control parameters are obtained, all associated influence data are determined and the overall influence is computed.
  • an approximation may be calculated using the "closest match" comprised in the influence data.
  • if, for example, the control parameter sets lamp 1 to 55% brightness, but the influence data only comprise information on the influence of the control commands "50% brightness" and "60% brightness", an approximation, preferably using linear interpolation, is computed by overlay of the corresponding information.
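The interpolation between two measured settings can be sketched as follows; the nested-list image layout and the function name are hypothetical, chosen only for illustration:

```python
def interpolate_influence(img_lo, img_hi, p_lo, p_hi, p):
    """Approximate the influence image for an unmeasured parameter value p
    by linear interpolation between the two closest measured settings
    p_lo and p_hi. Images are nested lists of linear-RGB pixel tuples."""
    t = (p - p_lo) / (p_hi - p_lo)
    return [[tuple(lo_c + t * (hi_c - lo_c) for lo_c, hi_c in zip(lo_px, hi_px))
             for lo_px, hi_px in zip(row_lo, row_hi)]
            for row_lo, row_hi in zip(img_lo, img_hi)]

# Lamp 1 at 55 % brightness, with measured data only at 50 % and 60 %:
img50 = [[(0.10, 0.10, 0.10)]]
img60 = [[(0.20, 0.20, 0.20)]]
img55 = interpolate_influence(img50, img60, 50, 60, 55)  # ~0.15 per channel
```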
  • the graphical representation may be transferred to a user terminal for display to the user or e. g. stored in a database for further processing.
  • the corresponding control parameters may be sent to the one or more controllable light sources to set the illumination in the "real- world" illuminated environment.
  • the inventive system advantageously provides a very fast and accurate preview of the illumination according to the desired simulation control parameters.
  • the invention thus allows the user to "fine-tune” the parameters and to directly obtain real-time visual feedback without the need to actually apply the control parameters to the light sources.
  • since the inventive system uses data referring to the influence on the environment, it advantageously operates independently of the type of light sources used and can thus be employed for the simulation of a variety of different types of light sources. Information on the specific radiation pattern of each light source is advantageously not required.
  • the input interface may preferably be adapted to receive an image of said illuminated environment without the light sources being operated, which is then merged with said influence data.
  • the influence data comprises image data, e.g. one or more images, representing the effect of the at least one control parameter on the illumination of the illuminated environment.
  • the image data may be for example obtained from one or more detectors, e. g. cameras, observing the illuminated environment.
  • the input interface is further configured to obtain three-dimensional object data of one or more objects of said illuminated environment.
  • the image processing unit then generates the graphical representation using the additional object data to further improve the graphical representation.
  • the term "object” may refer to any object in the illuminated environment, e.g. a reflecting surface, such as a wall, ceiling, floor, but also shelves, counters or mannequins.
  • the object data comprises at least positional information of the related object or surface in a three-dimensional space, e.g. geometric coordinates (x, y, z), but may comprise further information, such as shape, texture, color or reflectance of the associated object to further improve the simulation.
  • the object data may preferably be in the form of one or more polygons, e.g. describing reflecting surfaces, to reduce computational effort for generating the graphical representation.
  • the object data is then transferred to the image processing unit to generate the graphical representation of the illuminated environment and the simulated lighting distribution.
  • the image processing unit preferably generates a three-dimensional model of the illuminated environment using the object data, e.g. in the form of a wireframe model or polygon mesh model.
  • the generated model is rendered using the obtained influence data, which is in the field of 3D-graphics also termed as "wrapping" or texture mapping.
  • the graphical representation is then calculated from the rendered three-dimensional model.
  • the present embodiment thus makes it possible to obtain a further improved preview of the illuminated environment.
  • the input interface is configured to receive viewpoint selection data, which is then used by the image processing unit to calculate the graphical representation from the specified viewpoint.
  • viewpoint selection data may for example be obtained from an input device, as mentioned before.
  • the object data may be obtained by the input interface for example from a suitable database.
  • the object data may be obtained from manual input by the user or derived from a CAD-file.
  • the system may comprise an object processing unit, configured to determine said three-dimensional object data from image data of said illuminated environment and to provide said object data to said input interface.
  • object data may be obtained by triangulation from multiple images of the illuminated environment, taken from known, distant positions. The difference in position then establishes the baseline for the images and allows the derivation of a third coordinate. In a subsequent segmentation step, the detected objects are segmented into multiple polygons.
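For rectified camera pairs, triangulation from two distant viewpoints reduces to the classic depth-from-disparity relation Z = f·B/d. A minimal sketch (the text only requires a common triangulation method; this specific formula is an assumed concrete instance):

```python
def depth_from_disparity(f_px, baseline_m, x_left_px, x_right_px):
    """Two-view triangulation for a rectified camera pair: a point seen at
    horizontal pixel positions x_left_px and x_right_px in images taken a
    known baseline apart lies at depth Z = f * B / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return f_px * baseline_m / disparity

# Two cameras 0.5 m apart, focal length 800 px, 40 px disparity:
z = depth_from_disparity(800, 0.5, 420, 380)  # -> 10.0 m
```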
  • the three-dimensional object data may be obtained by matching the image data with a library of predefined 3D forms to provide said object data.
  • the object data is segmented into triangles.
  • alternatively, polygons with a variable number of corners may be used to reduce the overall number of polygons.
  • the system comprises at least a detector unit, connected to said object processing unit to provide image data of said illuminated environment.
  • the at least one detector unit therefore is arranged to observe the illuminated environment at least partly.
  • the system comprises at least two detector units, arranged distant to each other to observe said illuminated environment.
  • the at least one detector unit is configured to further obtain said influence data.
  • the detector unit may be for example a simple digital camera, but may also be of laser scanner type or may use patterned light to determine 3D shapes and positions of objects.
  • Several such detectors are known in the art; an example of a suitable detector is disclosed in "A 3D multi-aperture image sensor architecture", IEEE Custom Integrated Circuits Conference 2006, pages 281-284.
  • the system further comprises a display unit and said image processing unit is configured to provide said graphical representation to said display unit.
  • the display unit may be of any suitable type to provide the user with a preview of the simulated illuminated environment.
  • the display unit may be a computer monitor, LCD, TFT or plasma screen.
  • the display unit may be integrally formed with the further components of the inventive system; however, the display unit may also be formed separately and connected to the processing unit by means of a suitable computer network.
  • a database unit may preferably be provided for storing the generated graphical representation together with the at least one associated simulation parameter.
  • the graphical representation may in this case be obtained from the database when needed.
  • the inventive method may at least partly be carried out by a computer program when executed on a computer.
  • fig. 1 shows a symbolic representation of a first embodiment of a system for simulation of a lighting distribution
  • fig. 2 shows an exemplary embodiment of an illuminated environment in a perspective view
  • fig. 3 shows an embodiment of the data structure of influence data
  • fig. 4 shows an embodiment of a method for operating the inventive system in a schematic diagram
  • fig. 5 shows a symbolic representation of a second embodiment of a system for simulation of a lighting distribution
  • fig. 6 shows a further embodiment of the data structure of influence data according to the embodiment of fig. 5
  • fig. 7 shows a symbolic representation of a third embodiment of a system for simulation of a lighting distribution.
  • Figure 1 shows a symbolic representation of the main components of a system for simulating a lighting distribution of one or more controllable light sources 22 in an illuminated environment 23 according to a first embodiment.
  • the system comprises a client computer 1, having a connected monitor 2 and an input terminal 3.
  • the client computer 1 is connected to a server computer 4 over a communication network 5 using a suitable network interface (not shown).
  • the network 5 may be of LAN, WAN, WLAN or any other suitable type of computer network.
  • the communication network 5 is connected to the internet, using the TCP/IP communication protocol. Further components may be connected to the communication network 5, for example, further computer devices.
  • the client computer 1 serves as an input/output-terminal for interaction with a user, e.g. a light designer.
  • the server 4 computes a preview of the lighting distribution, i.e. at least an image of the simulated lighting scenario of the one or more controllable light sources 22 in the illuminated environment 23.
  • the illuminated environment 23 according to the present embodiment is a section of a department store, shown in a perspective view in fig. 2.
  • the server 4 may be configured for multi-user operation, so that the system may comprise more than one client computer 1, connected to the server 4 at a time.
  • All components of the system shown in fig. 1 may be located at a single installation site, e.g. in close proximity to the light sources 22, but may also be located distant from each other.
  • the client computer 1 may be located at the office of a lighting designer, while server 4 may be located at a suitable data processing centre.
  • the server 4 comprises at least an input interface 8 for sending and receiving information over the communication network 5, having a corresponding network interface (not shown).
  • the input interface 8 is further connected to a database 6 and an image processing unit 9.
  • the server 4 may comprise further components, such as for example additional memory, processors or interfaces.
  • the database 6 stores influence data 7, representing the effect of one or more controllable light sources 22 on the illuminated environment 23.
  • the influence data 7 comprise multiple datasets, allowing a mapping from at least one control parameter of a light source 22 to the effect on the illumination in the illuminated environment 23.
  • Database 6 may be of any suitable type for storing the influence data 7 and may e.g. comprise magnetic, optical or random access memory.
  • to obtain the influence data 7, an exemplary method includes taking images of the illuminated environment 23.
  • the system therefore comprises at least one digital camera, e.g. a CCD camera 21, observing the illuminated environment 23.
  • the CCD camera 21 is connected to the server system 4 over the network 5.
  • a lighting control unit 24 is provided to control the light sources 22, which light sources 22 are according to the present example arranged for illumination purposes and for ambient lighting, as can be seen from fig. 2.
  • the control unit 24 is connected to the network 5 for communication with the further components of the system.
  • a calibration step is conducted by the server system 4 using the connected CCD camera 21 and the control unit 24 to operate the light sources 22.
  • the CCD camera 21 takes an image of the illuminated environment 23 with all light sources 22 being switched off. Then a specific one of the light sources 22 is driven in accordance with a defined parameter setting and a further image is taken by CCD camera 21. The impact of the defined parameter setting is then determined by the server 4 from a comparison between the two images (before/after) and a corresponding set of photometric influence data 7 is generated, representing the effect of said parameter setting on the illuminated environment 23.
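The before/after comparison of this calibration step can be sketched as a per-pixel difference image; the clipping of negative values (sensor noise) and the nested-list layout are illustrative assumptions:

```python
def influence_from_calibration(img_off, img_on):
    """Derive an influence image for one parameter setting as the per-pixel
    difference between an image with the light source driven at that setting
    and a dark reference image taken with all sources switched off.
    Values are linear-RGB; negative differences are clipped to zero."""
    return [[tuple(max(on_c - off_c, 0.0) for on_c, off_c in zip(on_px, off_px))
             for on_px, off_px in zip(row_on, row_off)]
            for row_on, row_off in zip(img_on, img_off)]

img_off = [[(0.02, 0.02, 0.02), (0.01, 0.01, 0.01)]]  # all sources off
img_on  = [[(0.30, 0.25, 0.20), (0.05, 0.00, 0.02)]]  # one setting applied
influence = influence_from_calibration(img_off, img_on)
```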
  • a heuristic method is applied to all light sources 22 and for every parameter setting of each respective light source 22.
  • Each set of photometric data then represents one specific setting, i. e. a set of values for controllable parameters for each light source 22, for example colour, dimming level, light pattern, etc.
  • the influence data 7 is obtained in or converted to a linear colour space, for example linear sRGB.
  • sensors 25 may be connected to the server 4, such as daylight or scattered light sensors, to compensate for any effect of daylight or further light sources, e. g. non-controllable light sources.
  • the influence data 7 is transferred to the database 6 and stored for further processing.
  • the database 6 then comprises a representation of the effect of the single parameters of the controllable light sources 22 on the illuminated environment 23.
  • An embodiment of the data structure of the influence data 7, stored in said database 6, is shown in figure 3.
  • a set of photometric data I(k) is stored for each parameter setting k, representing the influence of the respective parameter k on each pixel (x, y) with respective RGB values.
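A possible in-memory counterpart of this data structure is a simple mapping from parameter settings to photometric images. The key encoding below is hypothetical, since fig. 3 does not fix a concrete storage layout:

```python
influence_db = {
    # (lamp id, parameter name, value) -> photometric image I(k), stored as
    # rows of linear-RGB pixel tuples; this key encoding is an assumption.
    (1, "brightness", 50): [[(0.10, 0.10, 0.10)]],
    (1, "brightness", 60): [[(0.20, 0.20, 0.20)]],
    (2, "colour", "red"):  [[(0.15, 0.00, 0.00)]],
}

def lookup_influence(db, lamp, param, value):
    """Fetch the stored influence image for one parameter setting,
    mirroring the database query of step 43 (sketch only)."""
    return db[(lamp, param, value)]
```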
  • the simulation is then carried out by the image processing unit 9, for example using the sequence of operations according to fig. 4.
  • in a first step 41, the database 6 is queried by the client computer 1 to obtain information on the controllable light sources 22 and to inform the user of possible parameter settings of the light sources 22. These possible control parameter settings are then displayed on monitor 2.
  • the user then may set various control parameters for the simulation using the input terminal 3.
  • Client computer 1 transfers the set of simulation control parameters - or mathematically a simulation control vector c - to server 4 over the network 5 in step 42.
  • the parameters are received by input interface 8 of the server computer 4.
  • Input interface 8 polls the database 6 in step 43 to obtain the datasets of influence data 7 which correspond to the parameter settings of control vector c.
  • the obtained influence data 7 is then transferred to image processing unit 9 in step 44, which then generates the preview image. To obtain the latter, image processing unit 9 computes an overlay of the influence data 7 received for the control vector c.
  • in the following, k_m refers to the m-th tri-stimulus value in the respective linear colour space,
  • x, y are co-ordinates of the data point and
  • i refers to the i-th light source of the lighting system.
  • a vector or matrix I(k) is determined holding the k-th photometric image, which corresponds to a defined parameter k.
  • a spatial filtering (CVDM or S-CIELAB) may be applied to I(k).
  • I(k) is expressed in a device-independent colour space.
  • Such digital pictures are normally stored as Xr x Yr x 3 matrices holding Nb-bit values (where Nb is the colour depth).
  • the preview image P can then be computed according to the expression P(x, y, m) = Σ_i I(k_i)(x, y, m), i.e. as the superposition, over all light sources i, of the photometric images I(k_i) corresponding to control vector c.
  • the influence data 7 of intermediate parameter settings is interpolated by the server system 4, for example using linear interpolation.
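The overlay computed in step 44 can be sketched as a per-pixel, per-channel sum of the influence images selected for control vector c. The list-based image layout is an illustrative assumption, and the sketch presumes all images share one linear colour space:

```python
def preview_image(influence_images):
    """Compute the preview as the per-pixel, per-channel sum of the
    influence images selected for the control vector, i.e. the
    superposition of the contributions of all light sources."""
    rows = len(influence_images[0])
    cols = len(influence_images[0][0])
    return [[tuple(sum(img[y][x][m] for img in influence_images) for m in range(3))
             for x in range(cols)]
            for y in range(rows)]

# Two lamps, one red-tinted and one green-tinted, on a 1x1 test image:
lamp1 = [[(0.10, 0.00, 0.00)]]
lamp2 = [[(0.00, 0.20, 0.00)]]
p = preview_image([lamp1, lamp2])  # -> [[(0.1, 0.2, 0.0)]]
```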
  • the obtained preview image is then transferred to the client computer 1 in step 45 and displayed on monitor 2, allowing the user to obtain a preview of the lighting distribution in the illuminated environment 23.
  • in addition to step 42, the parameters may be sent to the control unit 24 in step 46, so that the simulated lighting distribution is applied to the light sources 22 in the illuminated environment 23.
  • Fig. 5 shows a second embodiment of a system for simulation of a lighting distribution in a symbolic representation.
  • the present embodiment corresponds to the embodiment of fig. 1, with the addition that multiple CCD cameras 21 are arranged to observe the illuminated environment 23 and are used to obtain influence data 7 from different viewpoints.
  • the influence data 7 comprise three sets of photometric image information I(k) for each parameter setting k, which correspond to the three viewpoints of the cameras 21.
  • An exemplary embodiment of a corresponding data structure is shown in fig. 6.
  • the image processing unit 9 calculates multiple preview images of the illuminated environment 23 according to the method, explained with reference to the embodiment of fig. 1.
  • the sequence of operations according to the present embodiment corresponds to the sequence explained with reference to fig. 4, with the addition that in step 45 the user chooses the viewpoint according to the given positions of the cameras 21 and the corresponding preview image is then displayed on monitor 2.
  • the present embodiment is not limited to three cameras 21; the number may be adapted according to the application.
  • if the illuminated environment 23 is rather large, it may be advantageous to provide a higher number of cameras 21 to enable the light designer to obtain a complete and detailed impression of the lighting scenario.
  • Fig. 7 shows a symbolic representation of a third embodiment of a system for simulation of a lighting distribution.
  • the system corresponds to the system according to fig. 5 with the addition of a 3D database 10 and an object processing unit 11, connected to the input interface 8 of the server computer 4.
  • the 3D database 10 stores three-dimensional object data of the illuminated environment 23, e.g. shelves, counters, walls, floor or ceiling in the form of a polygon mesh, i.e. multiple polygons with x, y, z coordinates for further improving the preview, generated by the image processing unit 9.
  • the 3D database 10 may be of any suitable type for storing the object data and may e.g. comprise magnetic, optical or random access memory.
  • the object processing unit 11 of the server 4 obtains images of the illuminated environment 23 from the cameras 21. Since the cameras 21 observe the illuminated environment 23 from different viewpoints, it is possible to generate the three-dimensional object data of the illuminated environment 23 by detecting common points and calculating back where these points are located in the illuminated environment 23, using a common triangulation method. To obtain object data with enhanced accuracy, the positions of the cameras 21 may optionally be provided, for example by manual input. After the objects, i.e. reflecting surfaces, that are visible from the positions of the cameras 21 have been registered, the objects or surfaces are segmented into multiple polygons. The object data is then stored in the 3D database 10, which then comprises a polygon mesh 3D model of the illuminated environment 23.
  • the sequence of operations to obtain the preview image according to the present embodiment corresponds to the embodiment of fig. 5, with the exception that in step 44, after the effect of the light sources 22 according to the control vector c is determined, the generated preview images are combined with the information from the 3D database 10, i.e. the surfaces of the polygons, obtained from the 3D database 10 are rendered with the calculated images according to the respective control vector c. To simplify the calculation, all surfaces are considered completely diffuse. Thus, a rendered 3D model of the illuminated environment 23 is generated.
  • for areas that are not covered by the cameras 21, an interpolation using diffuse light is used to show these areas with minimal detail.
  • the diffuse light level is calculated from all lamp control parameters, summed up over all pixels.
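One plausible reading of this diffuse-light computation (average each lamp's influence image over all pixels, then sum the per-lamp averages) can be sketched as follows; the exact averaging scheme is an assumption, not fixed by the text:

```python
def diffuse_level(influence_images):
    """Estimate a single diffuse light level per colour channel by averaging
    each lamp's influence image over all pixels and summing the per-lamp
    averages. Used to shade areas without measured influence data."""
    totals = [0.0, 0.0, 0.0]
    for img in influence_images:
        n = sum(len(row) for row in img)  # pixel count of this image
        for row in img:
            for px in row:
                for m in range(3):
                    totals[m] += px[m] / n
    return tuple(totals)

# One lamp lighting half of a 2-pixel test image:
level = diffuse_level([[[(0.2, 0.2, 0.2), (0.0, 0.0, 0.0)]]])  # mean 0.1 per channel
```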
  • such objects are rendered with grey colour to show that no measurement information on the effect of the light sources 22 is available.
  • the rendered 3D model is then transferred to client 1, allowing the user to choose a viewpoint using input terminal 3, so that a preview image of the illuminated environment 23 according to the chosen viewpoint is displayed on the monitor 2.
  • the preview image will certainly be most realistic when the viewpoint corresponds to the viewpoint of one of the cameras 21.
  • the view is interpolated, for instance with the assumption that the surfaces are entirely diffuse.
  • the invention may also be carried out in an embodiment wherein:
    • at least parts of the functionality of the system are implemented in software,
    • instead of applying the control parameters to the light sources 22 directly in step 46, the control parameters are stored together with the according graphical representation in a further database, for example on the internet, for future reference,
    • the CCD cameras 21 and light sources 22 are directly connected to the server 4,
    • the databases 6 and/or 10 are directly connected to the communication network 5 or connected to a further server computer,
    • the object data stored in the 3D database 10 comprise information on width, height, colour, transparency and/or reflectance of the associated object,
    • the influence data 7 and the object data are stored in a common database,
    • the influence data 7 stored in database 6 is associated with respective object data,
    • the functionality of server 4 and client 1 is integrated into a single computing unit, and/or
    • the object data comprises influence data 7, which influence data 7 represents the effect of the one or more light sources 22 on the illumination of the object.
  • All or some of the exemplary components of the system may be implemented in software.
  • a corresponding computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.


Abstract

A system for simulation of a lighting distribution of one or more controllable light sources (22) in an illuminated environment (23) is disclosed, allowing generation of a graphical representation of the simulated lighting distribution from one or more simulation control parameters and influence data (7), which influence data (7) represent the effect of said one or more controllable light sources (22) on the illumination of the illuminated environment (23).

Description

SYSTEM FOR SIMULATION OF A LIGHTING DISTRIBUTION
FIELD OF THE INVENTION
The invention relates to a system for simulation of a lighting distribution of one or more controllable light sources in an illuminated environment and a corresponding method.
BACKGROUND OF THE INVENTION
Controllable light sources are used in lighting systems to create defined lighting distributions and are gaining significance in a variety of office and commercial lighting applications. Such applications include for example room lighting in department stores, restaurants, hotels or other environments, where a defined lighting distribution needs to be created. Controllable light sources typically allow one or more parameters, e.g. brightness or colour, to be set, so that a defined lighting distribution can be created by setting one or more parameters without a change in the lighting infrastructure.
To create a lighting distribution, a user, e.g. a lighting designer, typically applies a set of control parameters to the light sources and then visually verifies the illumination in the environment. If necessary, the lighting designer further fine-tunes the parameters and checks the lighting distribution until the desired illumination is given. However, such a procedure is rather time-consuming and requires the lighting designer to be present at the site of the lighting system to obtain visual feedback and, where needed, fine-tune the control parameters. Depending on the environment, it may not even be possible to conduct the afore-mentioned procedure at all, because an illumination is needed at all times and the procedure typically requires shutting off the lighting system.
Therefore, simulation methods have been developed, which allow obtaining a preview of the lighting distribution, for example on a computer screen. Since a precise preview is needed, typically ray-tracing techniques are used for simulation of the lighting distribution. However, simulations using ray-tracing techniques require significant computational effort and time and are thus usually very slow, so that it is not possible for the lighting designer to apply changes to the parameters and to directly obtain real-time visual feedback for fine-tuning of the lighting distribution.
It is therefore an object of the present invention to provide a system and a method for simulation of a lighting distribution, which provide a fast and precise preview of the illumination.
SUMMARY OF THE INVENTION
This object is achieved according to the invention by a system for simulation of a lighting distribution according to claim 1 and a corresponding method according to claim 8. Dependent claims relate to preferred embodiments of the invention.
The basic idea of the invention is to provide a system and method for simulation of a lighting distribution of one or more controllable light sources, which uses influence data, representing the effect of said light sources on the illuminated environment, e. g. measurement data of the light sources, for generating a fast and accurate preview of the lighting distribution according to at least one control parameter.
Within the context of the present invention, the term "illuminated environment" refers to any physical ("real-world") environment or surrounding or any section or spatial part of such environment, which is at least partly illuminated by at least one of said controllable light sources.
Typically, the illuminated environment is the part of the environment of the light sources, which is of interest to the user for the simulation, such as for example a specific point or a small area in the environment, the stage area of a theatre or even a special sales area, for example in a department store. Certainly, it is not necessary that said illuminated environment is entirely illuminated by each of said light sources or that said light sources only illuminate said illuminated environment or section.
The term "lighting distribution" refers to the illumination in the illuminated environment in general and is sometimes also referred to as "lighting scene", "lighting scenario" or "lighting atmosphere".
The inventive system is advantageously not dependent on a certain type of light source and may be employed for the simulation of any kind of light source, as far as at least one parameter of each light source is controllable by sending a corresponding control command to the respective light source. For example, the inventive system may be used for simulation of the illumination of commercially available halogen, gas discharge or LED lighting units. In the simplest case, the control parameter may be the on/off state of the respective light source, but the parameter may also refer to further operational states, such as brightness, i.e. dimming state, color or position of the output beam.
The inventive system comprises at least an input interface and an image processing unit, which may, for example, be provided by a data processing system comprising one or more preferably networked computers, microcontrollers or any other type of suitable electronic circuitry.
The input interface is configured to obtain at least one or more simulation control parameters of said one or more light sources. The inventive system uses these parameters to determine a graphical representation of said lighting distribution, i.e. the system determines what the illuminated environment will look like if the controllable light sources were supplied with the simulation control parameters. Certainly, the one or more parameters do not need to refer to all of the controllable light sources. It is sufficient that the parameter relates to at least one light source.
To obtain the simulation control parameters, the input interface may comprise or be connected to a suitable input device, such as a keyboard, mouse and touch screen or any other device for user input. The input interface may also be configured to obtain the simulation control parameters over a computer network using a network interface, for example from a database server, or from a file on a suitable storage medium, such as an optical or magnetic medium or random access memory. The input interface unit is further configured to obtain influence data, which data represent the effect of said one or more light sources on the illumination of said illuminated environment.
Within the context of influence data, the term "effect" of the light sources may refer to any measurable value describing the impact of light sources on objects (e. g. reflecting walls, counters, etc.) within the observed space, i.e. within said illuminated environment. In a simple embodiment, this may be a geometric brightness distribution, describing the intensity of illumination of a certain object or area by a light source. Also, there may be spectral information, preferably relating to color, but not necessarily limited to the visible range. Generally, the effect may be written as p(x, y, z, lambda), where p is the power distribution measured at a geometric location x, y, z and lambda is the wavelength. Preferably, the influence data is given in a linear color space, such as linear RGB, RGBE or CIE XYZ.
The influence data may thus be formed by any type of information, which renders possible a mapping between at least one control parameter and the effect of the control parameter on the illumination of the illuminated environment.
Said influence data may preferably be obtained directly from one or more corresponding detector units, e.g. digital cameras, observing the illuminated environment. Alternatively, the influence data may also be obtained from a database, file or network system, especially in case the inventive system is located remote from the light sources. The input interface may also be configured to allow entering the influence data manually using the afore-mentioned input devices. Details on the acquisition of influence data are disclosed in WO 2008/001259, which is incorporated herein by reference.
The input interface may further provide storage, such as random access memory, magnetic or optical devices, or any other suitable device for storing the obtained data and/or simulation control parameters. The image processing unit then determines a graphical representation of the simulated lighting distribution using said influence data and said one or more simulation control parameters. In the context of the present invention, the term "graphical representation" refers to any representation of the lighting distribution, which enables a preview of the lighting distribution to be displayed to the user, e.g. using a computer device. The graphical representation may for example comprise one or more images, photometric information or a model of the illuminated environment.
To achieve the latter, the processing unit determines the influence data, which describes the impact of the obtained simulation control parameter on the illuminated environment. If multiple control parameters are obtained, all associated influence data are determined and the overall influence is computed.
In case the influence data does not comprise information on the obtained simulation control parameter, an approximation may be calculated using the "closest match" comprised in the influence data. For example, in case the control parameter sets lamp 1 to 55% brightness, but the influence data only comprise information on the influence of the control commands "50% brightness" and "60% brightness", an approximation, preferably using linear interpolation, is computed by overlay of the corresponding information.
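The closest-match interpolation described above can be sketched as follows (an illustrative Python sketch; the function and data names are hypothetical, and influence images are assumed to be stored per brightness setting as arrays in a linear colour space):

```python
import numpy as np

def interpolate_influence(influence, requested):
    """Linearly interpolate influence images between the two
    nearest stored parameter settings (e.g. brightness levels)."""
    levels = sorted(influence)
    # exact match: return the stored influence image directly
    if requested in influence:
        return influence[requested]
    # nearest stored settings below and above the requested value
    lo = max(l for l in levels if l < requested)
    hi = min(l for l in levels if l > requested)
    w = (requested - lo) / (hi - lo)  # interpolation weight
    return (1.0 - w) * influence[lo] + w * influence[hi]

# influence data for 50% and 60% brightness (2x2 RGB images, linear RGB)
influence = {
    50: np.full((2, 2, 3), 0.10),
    60: np.full((2, 2, 3), 0.20),
}
img55 = interpolate_influence(influence, 55)
```

For the 55% example above, this yields the pixel-wise average of the stored 50% and 60% influence images.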
Once the graphical representation is generated, it may be transferred to a user terminal for display to the user or e.g. stored in a database for further processing. When the user is satisfied with the thus designed lighting distribution, the corresponding control parameters may be sent to the one or more controllable light sources to set the illumination in the "real-world" illuminated environment.
Because of the use of the influence data, i.e. measurement data describing the impact of the control parameters on the "real-world" environment, the inventive system advantageously provides a very fast and accurate preview of the illumination according to the desired simulation control parameters. The invention thus allows the user to "fine-tune" the parameters and to directly obtain real-time visual feedback without the need to actually apply the control parameters to the light sources. Since the inventive system uses data referring to the influence on the environment, it advantageously operates independently of the type of light sources used and can thus be employed for the simulation of a variety of different types of light sources. Information on the specific radiation pattern of each of the light sources is advantageously not needed.
Certainly, the accuracy of the graphical representation largely depends on the detail of the obtained influence data. To obtain a further improved graphical representation, the input interface may preferably be adapted to receive an image of said illuminated environment without the light sources being operated, which is then merged with said influence data. Most preferably, the influence data comprises image data, e.g. one or more images, representing the effect of the at least one control parameter on the illumination of the illuminated environment. The image data may for example be obtained from one or more detectors, e.g. cameras, observing the illuminated environment.
In a preferred embodiment, the input interface is further configured to obtain three-dimensional object data of one or more objects of said illuminated environment. The image processing unit then generates the graphical representation using the additional object data to further improve the graphical representation.
In the context of the present invention, the term "object" may refer to any object in the illuminated environment, e.g. a reflecting surface, such as a wall, ceiling, floor, but also shelves, counters or mannequins.
The object data comprises at least positional information of the related object or surface in a three-dimensional space, e.g. geometric coordinates (x, y, z), but may comprise further information, such as shape, texture, color or reflectance of the associated object to further improve the simulation. The object data may preferably be in the form of one or more polygons, e.g. describing reflecting surfaces, to reduce computational effort for generating the graphical representation.
The object data is then transferred to the image processing unit to generate the graphical representation of the illuminated environment and the simulated lighting distribution. To achieve the latter, the image processing unit preferably generates a three-dimensional model of the illuminated environment using the object data, e.g. in the form of a wireframe model or polygon mesh model. Then, the generated model is rendered using the obtained influence data, which in the field of 3D graphics is also termed "wrapping" or texture mapping. The graphical representation is then calculated from the rendered three-dimensional model. The present embodiment thus enables the user to obtain a further improved preview of the illuminated environment.
Details on surface rendering on the basis of three-dimensional object data are disclosed, by way of example, in "Texturing & Modeling: A Procedural Approach", 3rd edition, Morgan Kaufmann, December 2, 2002, ISBN-10: 1558608486; "3D Computer Graphics", 3rd edition, Addison Wesley, December 16, 1999, ISBN-10: 0201398559; and "Real-Time Rendering", 2nd edition, A K Peters Ltd., July 2002, ISBN-10: 1568811289.
Preferably, the input interface is configured to receive viewpoint selection data, which is then used by the image processing unit to calculate the graphical representation from the specified viewpoint. The viewpoint selection data may for example be obtained from an input device, as mentioned before.
The object data may be obtained by the input interface for example from a suitable database. Alternatively, the object data may be obtained from manual input by the user or derived from a CAD-file.
Preferably, the system may comprise an object processing unit, configured to determine said three-dimensional object data from image data of said illuminated environment and to provide said object data to said input interface.
Various methods exist in the art, which allow obtaining three-dimensional object information from two-dimensional image data. For example, object data may be obtained by triangulation from multiple images of the illuminated environment, taken from known, distant positions. The difference in position then establishes the baseline for the images and allows the derivation of a third coordinate. In a subsequent segmentation step, the detected objects are segmented into multiple polygons. Alternatively, the three-dimensional object data may be obtained by matching the image data with a library of predefined 3D forms to provide said object data.
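The derivation of the third coordinate from the baseline can be illustrated for the simplest case, a rectified two-camera setup (a simplified sketch; parallel, rectified cameras are assumed, and all names and values are illustrative):

```python
def triangulate_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by two rectified cameras: the shift
    (disparity) of a common point between the two images, together
    with the known camera baseline, yields the third coordinate."""
    if disparity_px <= 0:
        raise ValueError("point must be visible in both images")
    return focal_px * baseline_m / disparity_px

# a point shifted by 40 px between two cameras 0.5 m apart,
# imaged with an 800 px focal length, lies 10 m away
depth = triangulate_depth(focal_px=800, baseline_m=0.5, disparity_px=40)
```

Real multi-camera setups additionally require camera calibration and non-parallel geometry handling, which this sketch omits.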
For algorithmic simplicity, it is preferred that the object data is segmented into triangles. However, it may also be possible to create polygons with a variable number of corners to reduce the overall number of polygons.
In a preferred embodiment, the system comprises at least a detector unit, connected to said object processing unit to provide image data of said illuminated environment. The at least one detector unit is therefore arranged to observe the illuminated environment at least partly. Preferably, the system comprises at least two detector units, arranged at a distance from each other to observe said illuminated environment. Most preferably, the at least one detector unit is configured to further obtain said influence data.
The detector unit may be for example a simple digital camera, but may also be of laser scanner type or may use patterned light to determine 3D shapes and positions of objects. Several of such detectors are known in the art, an example of a suitable detector is disclosed in "A 3D multi-aperture image sensor architecture"; IEEE Custom Integrated Circuits Conference 2006, pages 281-284.
According to a preferred embodiment of the invention, the system further comprises a display unit and said image processing unit is configured to provide said graphical representation to said display unit. The display unit may be of any suitable type to provide the user with a preview of the simulated illuminated environment. For example, the display unit may be a computer monitor, LCD, TFT or plasma screen.
The display unit may be integrally formed with the further components of the inventive system, however, it may also be possible, that the display unit is formed separately and is connected to the processing unit by means of a suitable computer network.
Alternatively or additionally, a database unit may preferably be provided for storing the generated graphical representation together with the at least one associated simulation parameter. The graphical representation may in this case be obtained from the database when needed.
According to the inventive method of simulating a lighting distribution of one or more controllable light sources in an illuminated environment, influence data are obtained, which data represent the effect of said one or more light sources on the illumination of said illuminated environment, at least one simulation control parameter of said one or more controllable light sources is obtained and a graphical representation of the simulated lighting distribution is generated from said at least one simulation control parameter and said influence data.
The inventive method may at least partly be carried out by a computer program when executed on a computer.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter and the corresponding figures, in which
BRIEF DESCRIPTION OF THE DRAWINGS
fig. 1 shows a symbolic representation of a first embodiment of a system for simulation of a lighting distribution,
fig. 2 shows an exemplary embodiment of an illuminated environment in a perspective view,
fig. 3 shows an embodiment of the data structure of influence data,
fig. 4 shows an embodiment of a method for operating the inventive system in a schematic diagram,
fig. 5 shows a symbolic representation of a second embodiment of a system for simulation of a lighting distribution,
fig. 6 shows a further embodiment of the data structure of influence data according to the embodiment of fig. 5 and
fig. 7 shows a symbolic representation of a third embodiment of a system for simulation of a lighting distribution.
DETAILED DESCRIPTION OF EMBODIMENTS
Figure 1 shows a symbolic representation of the main components of a system for simulating a lighting distribution of one or more controllable light sources 22 in an illuminated environment 23 according to a first embodiment. The system comprises a client computer 1, having a connected monitor 2 and an input terminal 3. The client computer 1 is connected to a server computer 4 over a communication network 5 using a suitable network interface (not shown). The network 5 may be of LAN, WAN, WLAN or any other suitable type of computer network. Preferably, the communication network 5 is connected to the internet, using the TCP/IP communication protocol. Further components may be connected to the communication network 5, for example, further computer devices.
The client computer 1 serves as an input/output-terminal for interaction with a user, e.g. a light designer. The server 4 computes a preview of the lighting distribution, i.e. at least an image of the simulated lighting scenario of the one or more controllable light sources 22 in the illuminated environment 23. The illuminated environment 23 according to the present embodiment is a section of a department store, shown in a perspective view in fig. 2. The server 4 may be configured for multi-user operation, so that the system may comprise more than one client computer 1, connected to the server 4 at a time.
All components of the system shown in fig. 1 may be located at a single installation site, e.g. in close proximity to the light sources 22, but may also be located distant from each other. For example, the client computer 1 may be located at the office of a lighting designer, while server 4 may be located at a suitable data processing centre.
The server 4 comprises at least an input interface 8 for sending and receiving information over the communication network 5, having a corresponding network interface (not shown). The input interface 8 is further connected to a database 6 and an image processing unit 9. Certainly, the server 4 may comprise further components, such as for example additional memory, processors or interfaces.
The database 6 stores influence data 7, representing the effect of one or more controllable light sources 22 on the illuminated environment 23. The influence data 7 comprise multiple datasets, allowing a mapping between at least one control parameter of a light source 22 and its effect on the illumination in the illuminated environment 23. Database 6 may be of any suitable type for storing the influence data 7 and may e.g. comprise magnetic, optical or random access memory. In an exemplary method of obtaining the influence data 7, images of the illuminated environment 23 are taken. The system therefore comprises at least one digital camera, e.g. a CCD camera 21, observing the illuminated environment 23.
The CCD camera 21 is connected to the server system 4 over the network 5. A lighting control unit 24 is provided to control the light sources 22, which light sources 22 are according to the present example arranged for illumination purposes and for ambient lighting, as can be seen from fig. 2. The control unit 24 is connected to the network 5 for communication with the further components of the system.
To obtain the influence data 7 according to an exemplary method, a calibration step is conducted by the server system 4 using the connected CCD camera 21 and the control unit 24 to operate the light sources 22.
First, the CCD camera 21 takes an image of the illuminated environment 23 with all light sources 22 switched off. Then a specific one of the light sources 22 is driven in accordance with a defined parameter setting and a further image is taken by the CCD camera 21. The impact of the defined parameter setting is then determined by the server 4 from a comparison between the two images (before/after) and a corresponding set of photometric influence data 7 is generated, representing the effect of said parameter setting on the illuminated environment 23. Such a heuristic method is applied to all light sources 22 and for every parameter setting of each respective light source 22. Each set of photometric data then represents one specific setting, i.e. a set of values for the controllable parameters of each light source 22, for example colour, dimming level, light pattern, etc. To allow an easy addition of the light of different light sources 22, the influence data 7 is obtained in or converted to a linear colour space, for example linear sRGB.
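The before/after comparison of this calibration step can be sketched as a per-pixel image difference (an illustrative sketch; the function name is hypothetical and both images are assumed to be in a linear colour space):

```python
import numpy as np

def influence_from_calibration(dark_image, lit_image):
    """Influence of one parameter setting: the per-pixel difference
    between the image taken with the light source driven at that
    setting and the baseline image with all light sources switched
    off. Negative differences (sensor noise) are clipped to zero."""
    diff = lit_image.astype(float) - dark_image.astype(float)
    return np.clip(diff, 0.0, None)

# baseline (all lamps off) and one lamp at a defined setting
dark = np.zeros((2, 2, 3))
lit = np.full((2, 2, 3), 0.3)
influence_image = influence_from_calibration(dark, lit)
```

Because the subtraction is done in a linear colour space, the resulting influence images of different lamps can later simply be added.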
Furthermore, other sensors 25 may be connected to the server 4, such as daylight or scattered light sensors, to compensate for any effect of daylight or further light sources, e. g. non-controllable light sources. Once the influence data 7 is determined for all parameter settings, it is transferred to the database 6 and stored for further processing. The database 6 then comprises a representation of the effect of the single parameters of the controllable light sources 22 on the illuminated environment 23.
An embodiment of the data structure of the influence data 7, stored in said database 6 is shown in figure 3. As can be seen, a set of photometric data I(k) is stored for each parameter setting k, representing the influence of the respective parameter k on each pixel (x, y) with respective RGB values.
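The data structure of fig. 3 can, for illustration, be modelled as a mapping from each parameter setting k to a photometric image I(k) holding RGB values per pixel (the key layout shown here is a hypothetical example, not the patent's actual storage format):

```python
import numpy as np

# I(k): one photometric RGB image per parameter setting k,
# e.g. k = (lamp_id, parameter, value), stored in linear RGB
influence_data = {
    ("lamp1", "brightness", 50): np.zeros((480, 640, 3)),
    ("lamp1", "brightness", 100): np.zeros((480, 640, 3)),
}

def lookup(influence_data, k, x, y):
    """RGB values at pixel (x, y) for parameter setting k."""
    return influence_data[k][y, x]
```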
Having the influence data 7, it is possible to generate the preview image, which shows the simulated lighting distribution. This is done by the image processing unit 9, for example using the sequence of operations according to fig. 4.
In a first step 41, the database 6 is queried by the client computer 1 to obtain information on the controllable light sources 22 and to inform the user of possible parameter settings of the light sources 22. These possible control parameter settings are then displayed on monitor 2. The user may then set various control parameters for the simulation using the input terminal 3. Client computer 1 transfers the set of simulation control parameters - or mathematically a simulation control vector c - to server 4 over the network 5 in step 42. The parameters are received by input interface 8 of the server computer 4. Input interface 8 then polls the database 6 in step 43 to obtain the datasets of influence data 7 which correspond to the parameter settings of control vector c. The obtained influence data 7 is then transferred to image processing unit 9 in step 44, which then generates the preview image. To obtain the latter, image processing unit 9 computes an overlay of the influence data 7 received for the control vector c.
Owing to the near-linearity of human colour perception, as summarised by Grassmann's law of additive colour mixing for linear colour spaces, the colour resulting from combining several coloured light sources can be calculated as the sum of the tristimulus values of the involved light sources, taken separately:

K_m(x, y) = Σ_i K_{m,i}(x, y)

wherein K_m refers to the m-th tristimulus value in the respective linear colour space, x, y are the coordinates of the data point and i refers to the i-th light source of the lighting system.
Thus, it is possible to calculate the impact of multiple light sources 22 on sections of the illuminated environment 23 by summing the tristimulus values of each involved light source 22, i.e. of all light sources 22 affected by the control vector c. Accordingly, once information on the impact of each parameter of the light sources 22 on the illuminated environment 23 has been obtained, it is possible to determine the distribution that will apply when multiple light sources are operated simultaneously (i.e. predict what the illumination will look like).
In the above calibration procedure, a vector or matrix I(k) is determined holding the k-th photometric image, which corresponds to a defined parameter setting k. A spatial filtering (CVDM or S-CIELAB) may be applied to I(k). I(k) is expressed in a device-independent colour space. Such digital pictures are normally stored as Xr × Yr × 3 matrices holding Nb-bit values (where Nb is the colour depth).
According to Grassmann's law, the preview image I can then be computed according to the expression

I = I_pred({I(k)}, c) = Σ_i I_i

where the sum runs over the influence images selected by the control vector c, thus, according to the present embodiment, by a pixel-wise overlay of the obtained influence data 7.
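The pixel-wise overlay for a given control vector c can be sketched as follows (an illustrative sketch in a linear colour space; the names are hypothetical, and clipping to the displayable range is added purely for illustration):

```python
import numpy as np

def preview_image(influence_data, control_vector):
    """Pixel-wise overlay (Grassmann additive mixing in a linear
    colour space) of the influence images of all parameter
    settings activated by the control vector."""
    images = [influence_data[k] for k in control_vector]
    return np.clip(np.sum(images, axis=0), 0.0, 1.0)

# two lamps, each contributing a uniform influence image
influence_data = {
    ("lamp1", 100): np.full((2, 2, 3), 0.4),
    ("lamp2", 100): np.full((2, 2, 3), 0.5),
}
preview = preview_image(influence_data, [("lamp1", 100), ("lamp2", 100)])
```

The additivity only holds in a linear colour space; gamma-encoded images would have to be linearised before summation.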
Depending on the number of installed light sources 22, it may not be feasible to obtain influence data 7 on all possible parameter settings. In this case, the influence data 7 of intermediate parameter settings is interpolated by the server system 4, for example using linear interpolation. The obtained preview image is then transferred to the client computer 1 in step 45 and displayed on monitor 2, allowing the user to obtain a preview of the lighting distribution in the illuminated environment 23.
In case the user is not satisfied with the illumination, he can change the settings and the above method is executed again, beginning from step 42. If the user is satisfied, the parameters are sent to the control unit 24 in step 46 and the simulated lighting distribution is applied to the light sources 22 in the illuminated environment 23.
Fig. 5 shows a second embodiment of a system for simulation of a lighting distribution in a symbolic representation. The present embodiment corresponds to the embodiment of fig. 1, with the addition that multiple CCD cameras 21 are arranged to observe the illuminated environment 23 and are used to obtain influence data 7 from different viewpoints.
According to the present embodiment, the influence data 7 comprise three sets of photometric image information I(k) for each parameter setting k, which correspond to the three viewpoints of the cameras 21. An exemplary embodiment of a corresponding data structure is shown in fig. 6.
The image processing unit 9 according to the present embodiment calculates multiple preview images of the illuminated environment 23 according to the method explained with reference to the embodiment of fig. 1. The sequence of operations according to the present embodiment corresponds to the sequence explained with reference to fig. 4, with the addition that in step 45 the user chooses the viewpoint according to the given positions of the cameras 21 and the corresponding preview image is then displayed on monitor 2.
Certainly, the present embodiment is not limited to three cameras 21; their number may be adapted according to the application. For example, in case the illuminated environment 23 is rather large, it may be advantageous to provide a higher number of cameras 21 to enable the light designer to obtain a complete and detailed impression of the lighting scenario.
Fig. 7 shows a symbolic representation of a third embodiment of a system for simulation of a lighting distribution. The system corresponds to the system according to fig. 5 with the addition of a 3D database 10 and an object processing unit 11, connected to the input interface 8 of the server computer 4. The 3D database 10 stores three-dimensional object data of the illuminated environment 23, e.g. shelves, counters, walls, floor or ceiling in the form of a polygon mesh, i.e. multiple polygons with x, y, z coordinates for further improving the preview, generated by the image processing unit 9. The 3D database 10 may be of any suitable type for storing the object data and may e.g. comprise magnetic, optical or random access memory.
To generate the three-dimensional object data, the object processing unit 11 of the server 4 obtains images of the illuminated environment 23 from the cameras 21. Since the cameras 21 observe the illuminated environment 23 from different viewpoints, it is possible to generate the three-dimensional object data of the illuminated environment 23 by detecting common points and back-calculating where these points are located in the illuminated environment 23, using a common triangulation method. To obtain object data with enhanced accuracy, the positions of the cameras 21 may optionally be provided, for example by manual input. After the objects, i.e. reflecting surfaces, that are visible from the positions of the cameras 21 have been registered, the objects or surfaces are segmented into multiple polygons. The object data is then stored in the 3D database 10, which then comprises a polygon mesh 3D model of the illuminated environment 23.
The sequence of operations to obtain the preview image according to the present embodiment corresponds to the embodiment of fig. 5, with the exception that in step 44, after the effect of the light sources 22 according to the control vector c is determined, the generated preview images are combined with the information from the 3D database 10, i.e. the surfaces of the polygons obtained from the 3D database 10 are rendered with the calculated images according to the respective control vector c. To simplify the calculation, all surfaces are considered completely diffuse. Thus, a rendered 3D model of the illuminated environment 23 is generated.
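Under the completely-diffuse assumption, the contributions of the individual light sources 22 add linearly, so determining their combined effect for a control vector c amounts to a weighted sum of per-lamp influence images. The sketch below illustrates this combination step; the variable names are illustrative, and real influence data 7 would be per-pixel camera images rather than tiny lists.

```python
# Sketch of combining per-lamp influence data with a control vector c,
# assuming light contributions add linearly on completely diffuse
# surfaces. Illustrative only; names are not from the disclosure.

def preview(influence_images, c):
    """Weighted sum of per-lamp influence images by control vector c."""
    height = len(influence_images[0])
    width = len(influence_images[0][0])
    out = [[0.0] * width for _ in range(height)]
    for weight, image in zip(c, influence_images):
        for y in range(height):
            for x in range(width):
                out[y][x] += weight * image[y][x]
    return out

# Two lamps, each with a 2x2-pixel image of its individual effect:
lamp_a = [[1.0, 0.5], [0.0, 0.0]]
lamp_b = [[0.0, 0.0], [0.5, 1.0]]
img = preview([lamp_a, lamp_b], c=[0.8, 0.4])
```

The resulting image can then be applied as a texture to the polygon surfaces from the 3D database 10, yielding the rendered 3D model.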
For hidden surfaces, i.e. surfaces of objects which are not visible from the positions of the cameras 21, an interpolation using diffuse light is used to show these areas with minimal detail. The diffuse light level is calculated from all lamp control parameters, summed up over all pixels. Alternatively, these objects are rendered in grey to show that no measurement information on the effect of the light sources 22 is available. The rendered 3D model is then transferred to the client 1, allowing the user to choose a viewpoint using the input terminal 3, so that a preview image of the illuminated environment 23 according to the chosen viewpoint is displayed on the monitor 2.
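The two fallbacks for hidden surfaces can be sketched as follows. The scale factor and the grey value are illustrative assumptions; the disclosure only specifies that the diffuse level is derived from the summed lamp control parameters, or that grey flags missing measurement data.

```python
# Sketch of the two fallbacks for surfaces not seen by any camera 21:
# a uniform diffuse level derived from the summed lamp control
# parameters, or a grey marker signalling missing measurement data.
# The scale factor and GREY value are illustrative assumptions.

GREY = 0.5  # placeholder signalling "no measurement available"

def hidden_surface_level(c, use_grey=False, scale=0.1):
    if use_grey:
        return GREY
    return scale * sum(c)  # diffuse level from all lamp parameters

level = hidden_surface_level([0.8, 0.4, 0.6])              # diffuse estimate
flag = hidden_surface_level([0.8, 0.4, 0.6], use_grey=True)  # grey marker
```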
The preview image will certainly be most realistic when the chosen viewpoint corresponds to the viewpoint of one of the cameras 21. In case the user chooses a viewpoint different from the viewpoints of the cameras 21, the view is interpolated, for instance under the assumption that the surfaces are entirely diffuse.
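Because diffuse surfaces look the same from every direction, such a view interpolation can simply blend the images of the nearest cameras, weighting the nearer camera higher. The inverse-distance weighting below is one illustrative choice, not a scheme specified by the disclosure.

```python
# Sketch of interpolating a preview for a user-chosen viewpoint lying
# between two camera viewpoints. Entirely diffuse surfaces are assumed,
# so pixel values can be blended directly; the inverse-distance
# weighting is an illustrative assumption.

def blend_views(view_a, view_b, dist_a, dist_b):
    """Blend two camera images, weighting the nearer camera higher."""
    wa = dist_b / (dist_a + dist_b)
    wb = dist_a / (dist_a + dist_b)
    return [[wa * a + wb * b for a, b in zip(ra, rb)]
            for ra, rb in zip(view_a, view_b)]

# Viewpoint three times closer to camera A than to camera B:
mixed = blend_views([[1.0, 1.0]], [[0.0, 0.0]], dist_a=1.0, dist_b=3.0)
```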
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
For example, it is possible to operate the invention in an embodiment wherein:
- at least parts of the functionality of the system are implemented in software,
- instead of applying the control parameters to the light sources 22 directly in step 46, the control parameters are stored together with the according graphical representation in a further database, for example on the internet, for future reference,
- the CCD cameras 21 and light sources 22 are directly connected to the server 4,
- the databases 6 and/or 10 are directly connected to the communication network 5 or connected to a further server computer,
- the object data stored in the 3D database 10 comprise information on width, height, colour, transparency and/or reflectance of the associated object,
- the influence data 7 and the object data are stored in a common database,
- the influence data 7 stored in database 6 is associated with respective object data,
- the functionality of server 4 and client 1 is integrated into a single computing unit, and/or
- the object data comprises influence data 7, which influence data 7 represents the effect of the one or more light sources 22 on the illumination of the object.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
All or some of the exemplary components of the system may be implemented in software. A corresponding computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. System for simulation of a lighting distribution of one or more controllable light sources (22) in an illuminated environment (23), comprising an input interface (8) configured for obtaining at least one or more simulation control parameters of said one or more controllable light sources (22) and influence data (7), which data (7) represent the effect of said one or more light sources (22) on the illumination of said illuminated environment (23), said system further comprising an image processing unit (9) configured to generate a graphical representation of the simulated lighting distribution from said influence data (7) and said at least one simulation control parameter.
2. System according to claim 1, wherein said influence data (7) comprise image data of said illuminated environment (23).
3. System according to any of the preceding claims, wherein said input interface (8) is further configured to obtain three-dimensional object data of one or more objects of said illuminated environment (23) and said image processing unit (9) is configured to generate said graphical representation from said influence data (7), said at least one simulation control parameter and said object data.
4. System according to claim 3, wherein said system further comprises an object processing unit (11), configured to determine said three-dimensional object data from image data of said illuminated environment (23) and to provide said object data to said input interface (8).
5. System according to claim 4, wherein said system further comprises at least a detector unit, connected to said object processing unit (11).
6. System according to any of the preceding claims, wherein said system further comprises a display unit (2) and said image processing unit (9) is configured to provide said graphical representation to said display unit (2).
7. System according to one of the preceding claims, wherein a database unit is provided for storing one or more graphical representations together with at least one associated simulation control parameter.
8. Method of simulating a lighting distribution of one or more controllable light sources (22) in an illuminated environment (23), in which influence data (7) are obtained, which data represent the effect of said one or more light sources (22) on the illumination of said illuminated environment (23), at least one simulation control parameter of said one or more controllable light sources (22) is obtained and a graphical representation of the simulated lighting distribution is generated from said at least one simulation control parameter and said influence data (7).
9. A computer program enabling a computer to carry out the method according to claim 8 when executed by the computer.
10. A data-carrier comprising the computer program according to claim 9.
PCT/IB2009/055500 2008-12-15 2009-12-04 System for simulation of a lighting distribution WO2010070517A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08171634 2008-12-15
EP08171634.2 2008-12-15

Publications (1)

Publication Number Publication Date
WO2010070517A1 true WO2010070517A1 (en) 2010-06-24

Family

ID=41582126

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/055500 WO2010070517A1 (en) 2008-12-15 2009-12-04 System for simulation of a lighting distribution

Country Status (1)

Country Link
WO (1) WO2010070517A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008001259A2 (en) 2006-06-28 2008-01-03 Philips Intellectual Property & Standards Gmbh Method of controlling a lighting system based on a target light distribution

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"A 3D multi-aperture image sensor architecture", IEEE CUSTOM INTEGRATED CIRCUITS CONFERENCE, 2006, pages 281 - 284
"Real-Time Rendering", July 2002, AK PETERS LTD.
ADDISON WESLEY: "3D computer graphics (3rd edition)", 16 December 1999
ELIAS HUGO: "Radiosity", 1 October 2007 (2007-10-01), pages 1 - 18, XP007911573, Retrieved from the Internet <URL:http://web.archive.org/web/20071001024020/http://freespace.virgin.net/hugo.elias/radiosity/radiosity.htm> [retrieved on 20100208] *
MORGAN KAUFMANN: "Texturing & Modeling: A Procedural Approach", 2 December 2002
WIKIPEDIA: "Radiosity (3D computer graphics)", EN.WIKIPEDIA.ORG, 29 October 2008 (2008-10-29), pages 1 - 6, XP007911572, Retrieved from the Internet <URL:http://en.wikipedia.org/w/index.php?title=Radiosity_(3D_computer_graphics)&oldid=248499610> [retrieved on 20100208] *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2503854A1 (en) * 2011-03-21 2012-09-26 LG Electronics Inc. Lighting system and method for controlling the same
US8841865B2 (en) 2011-03-21 2014-09-23 Lg Electronics Inc. Lighting system and method for controlling the same
CN102298887A (en) * 2011-09-21 2011-12-28 上海大三和弦城市环境艺术有限公司 Method, device and system for simulative preview of LED program
EP2587439A1 (en) * 2011-10-26 2013-05-01 Toshiba Lighting & Technology Corporation Terminal apparatus and system with the terminal apparatus
RU2631335C2 (en) * 2011-12-31 2017-09-21 Филипс Лайтинг Холдинг Б.В. Personalized lighting of open site
WO2013098759A1 (en) * 2011-12-31 2013-07-04 Koninklijke Philips Electronics N.V. Personalized lighting for open area
CN104012180A (en) * 2011-12-31 2014-08-27 皇家飞利浦有限公司 Personalized Lighting For Open Area
US9426865B2 (en) 2011-12-31 2016-08-23 Koninklijke Philips N.V. Personalized lighting for open area
GB2537666A (en) * 2015-04-23 2016-10-26 P4 Ltd Positioning of lighting units or luminaires
WO2017036071A1 (en) * 2015-08-28 2017-03-09 京东方科技集团股份有限公司 Lighting method and lighting apparatus
US10021752B2 (en) 2015-08-28 2018-07-10 Boe Technology Group Co., Ltd. Lighting method and lighting device
EP3364725A4 (en) * 2015-10-12 2019-05-29 Delight Innovative Technologies Limited Method and system for automatically realizing lamp control scenario
US10736202B2 (en) 2017-01-04 2020-08-04 Signify Holding B.V. Lighting control
WO2019134805A1 (en) * 2018-01-05 2019-07-11 Signify Holding B.V. A controller for controlling a lighting device and a method thereof
US11310888B2 (en) 2018-01-05 2022-04-19 Signify Holding B.V. Controller for controlling a lighting device and a method thereof
GB2581249A (en) * 2018-12-10 2020-08-12 Electronic Theatre Controls Inc Systems and methods for generating a lighting design
GB2581249B (en) * 2018-12-10 2021-10-20 Electronic Theatre Controls Inc Systems and methods for generating a lighting design

Similar Documents

Publication Publication Date Title
WO2010070517A1 (en) System for simulation of a lighting distribution
CN111713181B (en) Illumination and internet of things design using augmented reality
EP2628363B1 (en) A method, a user interaction system and a portable electronic devicefor controlling a lighting system
US10937245B2 (en) Lighting and internet of things design using augmented reality
RU2549185C2 (en) Method and pc-based device for control of lighting infrastructure
CA2832238C (en) Systems and methods for display of controls and related data within a structure
CN111723902A (en) Dynamically estimating lighting parameters for a location in an augmented reality scene using a neural network
Fender et al. Optispace: Automated placement of interactive 3d projection mapping content
JP6610065B2 (en) Cosmetic material simulation system, method, and program
KR20210086837A (en) Interior simulation method using augmented reality(AR)
JP2018092503A (en) Indoor light simulation system, indoor light simulation method and program
US11341716B1 (en) Augmented-reality system and method
JP7031365B2 (en) Light environment evaluation device and program
Anrys et al. Image-based lighting design
CN114764840A (en) Image rendering method, device, equipment and storage medium
EP3794910B1 (en) A method of measuring illumination, corresponding system, computer program product and use
JP6679966B2 (en) Three-dimensional virtual space presentation system, three-dimensional virtual space presentation method and program
JP5999802B1 (en) Image processing apparatus and method
TWI671711B (en) Apparatus and method for simulating light distribution in environmental space
TWI603287B (en) Image synthesis method of a virtual object and the apparatus thereof
KR20160006087A (en) Device and method to display object with visual effect
TWI787853B (en) Augmented-reality system and method
JP6610064B2 (en) Architectural material uneven pattern image processing system, method, and program
EP4102468A1 (en) Augmented-reality system and method
WO2024132188A1 (en) Computer-implemented lighting planning of an interior room

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09787380

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09787380

Country of ref document: EP

Kind code of ref document: A1