WO2021058191A1 - Methods for illuminating a painting (Procédés d'éclairage d'un tableau) - Google Patents

Methods for illuminating a painting (Procédés d'éclairage d'un tableau)

Info

Publication number
WO2021058191A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
artwork
global
local
function
Application number
PCT/EP2020/072406
Other languages
English (en)
Inventor
Alberto Alfier
Norbert Haas
Marco Angelini
Renato Frison
Carlo VENTURATI
Andrea Morra
Benjamin BRUDNJAK
Inna Susin
Guido Angenendt
Original Assignee
Osram Gmbh
Clay Paky S.P.A.
Application filed by Osram Gmbh and Clay Paky S.P.A.
Publication of WO2021058191A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/11Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/0266Field-of-view determination; Aiming or pointing of a photometer; Adjusting alignment; Encoding angular position; Size of the measurement area; Position tracking; Photodetection involving different fields of view for a single detector
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/42Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4204Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20Controlling the colour of the light
    • H05B45/22Controlling the colour of the light using optical feedback
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/42Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/44Electric circuits
    • G01J2001/444Compensating; Calibrating, e.g. dark current, temperature drift, noise reduction or baseline correction; Adjusting
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • illumination/lighting systems such as illumination systems configured to illuminate artworks in an exposition area.
  • an illumination system comprises at least one light fixture comprising one or more light sources, such as light-emitting diodes (LEDs).
  • Smart illumination can be adapted to specific requirements of persons or objects, or it can be adapted to sensor data.
  • the light sources can be switched on in the presence of persons and/or the light intensity may be varied based on the ambient light.
  • smart illumination may also be used for more complex applications.
  • smart illumination may be used in office buildings and factories, but also in a museum (including an art gallery, i.e. for the illumination of artworks) or in the entertainment sector, e.g. for effect lighting purposes.
  • United States Patent No. US 7,796,034 B2.
  • Figure 1 shows an example of a typical exposition area 160 comprising one or more artworks 140, such as a painting, a picture, a sculpture, an assortment of various pieces of art, people and the like.
  • An artwork may also encompass (at least in part) self-lit objects.
  • the exposition area 160 may be located in an art museum or an art gallery or an exhibition or somewhere else, either inside a building or outside.
  • the artworks 140 may be fixed to walls 163 of a room representing the exposition area 160.
  • a room may thus also comprise a ceiling 161, a floor 162, an entrance and/or exit 165, such as a door, and optionally one or more windows 164.
  • in the exposition area 160 may thus be arranged one or more light fixtures configured to illuminate the floor 162 and/or the walls 163, in particular the artworks 140.
  • the one or more light fixtures may be configured to illuminate a sculpture positioned at a given place in the room 160.
  • an artwork may be illuminated by natural light, e.g. entering through a window 164, and/or artificial light provided by the one or more light fixtures.
  • Illumination in such exposition areas may play a crucial role.
  • modern light fixtures for home or industrial applications often have variable/settable spectral characteristics.
  • such light fixtures may often be controlled remotely in order to set a brightness level and/or a color.
  • the light generated by the light fixtures often has to satisfy more stringent requirements.
  • for instance, the light should not damage pigments or other materials, e.g. textiles or canvas.
  • the illumination, i.e. the color temperature or the color, plays an important role in the presentation of the artwork, as the right illumination can highlight certain aspects of the artwork, whereas the wrong illumination can ruin the whole impression of the artwork.
  • one or more of the above objectives is achieved by means of methods of illuminating an artwork.
  • Embodiments moreover concern related lighting systems, and related light fixtures and light sensors, as well as computer-program products, loadable into the memory of at least one processor and comprising portions of software code capable of implementing the steps of the method when the product is run on at least one processor.
  • reference to such a computer-program product is understood to be equivalent to a reference to a computer-readable, non-transitory medium containing instructions for controlling the processing system for coordinating implementation of the method according to the invention.
  • the reference to "at least one processor" is evidently intended to highlight the possibility that the present invention is implemented in a modular form and/or distributed.
  • an illumination system comprises a light fixture, an optional sensor and a control system. Possible embodiments of such lighting systems are detailed below under "Example 1".
  • a first aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the artwork for at least one wavelength or wavelength range.
  • the method comprises the steps of:
    o during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork for at least one wavelength or wavelength range and measuring via the light sensor the global and/or local light intensity values of the light reflected by the artwork;
    o during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork; and
    o during a normal operation phase, measuring via the light sensor the global and/or the plurality of local light intensity values of the light reflected by the artwork, and estimating via the mathematical function or the dataset the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork.
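The three phases above lend themselves to a short sketch. The code below is purely illustrative: it assumes a single sensor channel and a least-squares scalar gain as the "mathematical function", whereas the disclosure allows any mathematical function or dataset; all names are hypothetical.

```python
# Illustrative sketch of the calibration / training / normal operation phases.
# The scalar least-squares gain is an assumption; the method only requires
# "a mathematical function or a dataset" linking the two quantities.

def train_gain(reflected, at_artwork):
    """Training phase: fit k such that at_artwork ~ k * reflected
    (least squares over the calibration samples)."""
    num = sum(r * a for r, a in zip(reflected, at_artwork))
    den = sum(r * r for r in reflected)
    return num / den

def estimate_at_artwork(reflected_value, k):
    """Normal operation phase: estimate the light intensity at the artwork
    from the sensor's measurement of the light reflected by the artwork."""
    return k * reflected_value

# Calibration phase: paired samples (sensor reading, reference at the artwork,
# e.g. obtained with a handheld luxmeter).
reflected_samples = [1.0, 2.0, 4.0]
artwork_samples = [3.0, 6.0, 12.0]

k = train_gain(reflected_samples, artwork_samples)  # here exactly 3.0
estimate = estimate_at_artwork(2.5, k)              # 7.5
```

In practice one such function (or a multi-channel generalization) would be determined per wavelength or wavelength range, and per local region if a plurality of local intensities is used.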
  • a second aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a reference luminance target is installed in proximity of the artwork, whereby the reference luminance target is illuminated with the light emitted by the one or more light fixtures, and wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the reference luminance target for at least one wavelength or wavelength range.
  • the method comprises the steps of:
    o during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork and/or at the reference luminance target for at least one wavelength or wavelength range, and measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target;
    o during a training phase, determining a mathematical function and/or a dataset adapted to estimate the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity values of the light reflected by the reference luminance target; and
    o during a normal operation phase, measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target and estimating the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity of the light reflected by the reference luminance target.
  • a third aspect of the present disclosure relates to a lighting system configured to monitor the irradiation of an object with light generated by a light fixture.
  • the lighting system comprises the light fixture comprising one or more light sources, which together are configured to emit light with a spatial radiation characteristic, a data processing unit connected to the light fixture and configured to obtain information on an intensity of the light emitted by the light sources, a first memory connected to the data processing unit, in which information about the spatial positioning of the light fixture with respect to a surface of the object is stored, and a second memory connected to the data processing unit, in which information about the spatial radiation characteristic of the one or more light sources or the light fixture is stored.
  • the data processing unit is configured to calculate and output a local intensity of the light incident at the respective position for a plurality of positions on the surface of the object as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture.
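As an illustration of this calculation, the sketch below assumes a point-like fixture with a cos^m spatial radiation characteristic and inverse-square falloff; in the system described above, the radiation characteristic and the spatial positioning would instead be read from the second and first memory, and the calculation would be repeated for a plurality of positions on the surface of the object.

```python
import math

def local_intensity(point, fixture_pos, fixture_dir, flux, m):
    """Local intensity of the light incident at one surface point, combining
    the light intensity (flux), an assumed cos^m spatial radiation
    characteristic, and the spatial positioning (distance and angle) of the
    fixture. fixture_dir must be a unit vector (the fixture's optical axis)."""
    # Vector from the fixture to the surface point and its length.
    v = [p - f for p, f in zip(point, fixture_pos)]
    d = math.sqrt(sum(c * c for c in v))
    # Angle between the optical axis and the direction to the point.
    cos_theta = sum(a * b for a, b in zip(v, fixture_dir)) / d
    if cos_theta <= 0.0:
        return 0.0  # point lies outside the fixture's emission hemisphere
    return flux * cos_theta ** m / d ** 2  # inverse-square falloff

# Fixture 1 m above the surface point, aimed straight at it: on-axis intensity.
E = local_intensity((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, -1.0), 100.0, 2)
```

Evaluating this over a grid of positions on the object's surface yields an intensity distribution of the kind the data processing unit is configured to output.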
  • a fourth aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a light fixture comprising a plurality of light sources, a driver circuit configured to provide an individually controllable power supply to each of the light sources as a function of one or more control signals, a data storage device having stored at least one preset configuration data item, and a data processing unit comprising a memory.
  • the method comprises: reading a preset configuration data item from the data storage device and storing the preset configuration data item into the memory; and generating the one or more control signals as a function of the configuration data stored in the memory.
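These two steps can be sketched as follows; the preset format (a dictionary holding per-light-source duty cycles) and all names are illustrative assumptions, since the disclosure does not specify the structure of the configuration data item.

```python
# Hypothetical sketch of the fourth aspect's two steps: read a preset
# configuration data item into memory, then generate the control signals.

def read_preset(data_storage, memory):
    """Read a preset configuration data item from the data storage device
    and store it into the data processing unit's memory."""
    memory["preset"] = data_storage["preset_0"]

def control_signals(memory):
    """Generate one control signal per light source as a function of the
    configuration data stored in the memory (here: PWM duty cycles)."""
    return list(memory["preset"]["duty_cycles"])

storage = {"preset_0": {"duty_cycles": [0.8, 0.5, 0.2]}}
mem = {}
read_preset(storage, mem)
signals = control_signals(mem)  # one duty cycle per individually driven source
```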
  • a fifth aspect of the present disclosure relates to a method of operating a light fixture comprising a light module comprising one or more light sources, a power supply circuit configured to provide a DC voltage, a regulated current generator configured to provide an output current to the one or more light sources as a function of a reference signal, a current sensor configured to provide a first measurement signal indicative of the output current, and a data processing unit operatively connected to the regulated current generator and the current sensor.
  • the method comprises executing the following steps via the data processing unit:
    o setting the reference signal as a function of data identifying a requested illumination to be generated by the one or more light sources;
    o determining an upper and a lower current threshold as a function of the reference signal;
    o obtaining the first measurement signal;
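The threshold handling in these steps can be sketched as below; the symmetric ±10 % band is an assumption, as the disclosure only states that the thresholds are determined as a function of the reference signal.

```python
# Illustrative sketch of the current-threshold logic; the +/-10% tolerance
# band is an assumed policy, not mandated by the disclosure.

def current_thresholds(i_ref, tol=0.10):
    """Upper and lower current thresholds as a function of the reference
    signal (symmetric band for illustration)."""
    return i_ref * (1.0 + tol), i_ref * (1.0 - tol)

def current_in_range(i_meas, i_ref, tol=0.10):
    """Compare the first measurement signal (output current) against the
    thresholds; False would indicate a regulation fault."""
    upper, lower = current_thresholds(i_ref, tol)
    return lower <= i_meas <= upper

ok = current_in_range(0.95, 1.0)     # within the band
fault = current_in_range(1.25, 1.0)  # outside the band
```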
  • a sixth aspect of the present disclosure relates to a method of producing a translucent optical element for a light fixture, wherein the translucent optical element is implemented with a translucent material comprising a first surface for receiving a light radiation and an opposite second surface for providing an attenuated second light radiation, wherein the second surface is arranged at a given variable thickness from the first surface.
  • the method comprises the steps of:
    o obtaining a first matrix of first light intensity values, wherein each first light intensity value is associated with a respective area of the first surface and identifies the intensity of light expected to enter the respective area of the first surface;
    o obtaining a second matrix of second light intensity values having the same dimension as the first matrix, wherein each second light intensity value is associated with a respective area of the second surface and identifies the intensity of light requested to exit the respective area of the second surface when the expected intensity of light enters the first surface;
    o calculating a matrix of light transmission ratios having the same dimension as the first matrix and the second matrix, wherein each light transmission ratio is calculated as a function of a respective first light intensity value and a respective second light intensity value;
    o obtaining an attenuation factor of the translucent material;
    o calculating a matrix of thickness values having the same dimension as the matrix of light transmission ratios, wherein each thickness value is calculated as a function of a respective light transmission ratio and the attenuation factor of the translucent material, and wherein the matrix of thickness
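Assuming an exponential (Beer-Lambert-type) attenuation I_out = I_in · exp(−α·t), which the attenuation factor suggests but the disclosure does not mandate, the per-area thickness calculation can be sketched as:

```python
import math

def thickness_matrix(I_in, I_out, alpha):
    """Thickness value per area from the transmission ratio T = I_out / I_in,
    inverting the assumed attenuation law I_out = I_in * exp(-alpha * t),
    i.e. t = -ln(T) / alpha."""
    return [[-math.log(out / inp) / alpha
             for inp, out in zip(row_in, row_out)]
            for row_in, row_out in zip(I_in, I_out)]

# Uniform expected input, but half/quarter requested output: the element must
# be twice as thick where the light has to be attenuated twice as strongly.
alpha = math.log(2.0)  # attenuation factor of the translucent material
t = thickness_matrix([[100.0, 100.0]], [[50.0, 25.0]], alpha)
```

The resulting matrix of thickness values then defines the variable distance between the first and second surfaces of the optical element.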
  • a seventh aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with at least one light fixture. Specifically, in various embodiments, the method comprises:
  • each dataset comprising:
    o data identifying a list of pigments of the respective artwork;
    o data identifying the illumination of each pigment of the list of pigments during a given time period;
    o data identifying the ageing of each pigment of the list of pigments during the given time period;
  • an eighth aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command.
  • the method comprises the steps of: obtaining data identifying requested spectral characteristics, obtaining data identifying a viewer’s eye characteristics,
  • a ninth aspect of the present disclosure relates to a method of selecting at least one light fixture by: obtaining data identifying characteristics of an artwork, obtaining data identifying characteristics of an exposition area, and determining a set of light fixtures and/or operating settings for a set of light fixtures as a function of the data identifying characteristics of the artwork and the data identifying characteristics of the exposition area.
  • a tenth aspect of the present disclosure relates to a method of selecting at least one light sensor for a lighting system used to illuminate at least one artwork in an exposition area via one or more light fixtures configured to emit light with variable characteristics as a function of a control command.
  • the method comprises the steps of: obtaining a digital model of the exposition area, the digital model including:
    o exposition area data comprising data identifying the dimension of the exposition area;
    o artwork data comprising data identifying the position of the at least one artwork within the exposition area;
    o light fixture data comprising data identifying the position, orientation and illumination characteristics of the one or more light fixtures; and
    o background illumination data comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within the exposition area;
    executing a plurality of illumination simulations of the digital model of the exposition area by varying the illumination characteristics of the one or more light fixtures and/or the illumination characteristics of the other natural and/or artificial light sources, and determining for each illumination simulation data identifying
  • FIG. 1 shows an example of an exposition area comprising one or more artworks
  • FIG. 2 shows an embodiment of an illumination system comprising at least one light fixture and a control system
  • FIG. 3 shows an embodiment of an integrated illumination system comprising at least one light fixture and a control system
  • FIG. 4 shows an embodiment of the illumination system of Figure 2 or Figure 3;
  • FIG. 5 shows an embodiment of a light fixture comprising a driver circuit and a lighting module
  • FIG. 6 shows an embodiment of a lighting module
  • FIG. 7 shows a first embodiment of a control system for an illumination system configured to control one or more light fixtures as a function of a viewer’s eye characteristics
  • FIG. 8 shows a second embodiment of a control system for an illumination system configured to control one or more light fixtures as a function of a viewer’s eye characteristics
  • FIG. 9 shows an embodiment of a method of selecting a set of light fixtures and displaying a preview of the illumination
  • FIG. 10 and 11 show embodiments for selecting a set of light fixtures in the method of Figure 9;
  • FIG. 12 shows an example of a selected set of light fixtures
  • FIG. 13 shows an embodiment of a method of selecting a set of light sensors for a given exposition area
  • FIG. 14 shows an example of a selected set of light sensors;
  • FIG. 15 shows a first embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture;
  • FIG. 16 shows an example of an intensity distribution on a surface of an object, as determined by the system of Figure 15;
  • FIG. 17 shows an example of a spatial position between a light fixture and an object
  • FIG. 18 shows a flow chart of the operation of the system of Figure 15
  • FIG. 19 shows a second embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture
  • FIG. 22 and 23 show a third embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture
  • FIG. 24 shows a flow chart of the operation of the system of Figure 23;
  • FIG. 25 shows an embodiment of a light fixture with individually controllable light sources
  • FIG. 27 shows an embodiment of optics adapted to be used with the light modules of Figures 26, 28 and 29;
  • FIG. 30 shows a block diagram and Figure 31 a flowchart of the operation of the light fixture of Figure 25;
  • FIG. 32 shows a first embodiment of a light fixture comprising a regulated current generator and a data processing unit
  • FIG. 33 shows a second embodiment of a light fixture comprising a regulated current generator and a data processing unit
  • FIG. 34 shows a third embodiment of a light fixture comprising a regulated current generator and a data processing unit
  • FIG. 35 shows a fourth embodiment of a light fixture comprising a regulated current generator and a data processing unit
  • FIG. 36 shows a fifth embodiment of a light fixture comprising a regulated current generator and a data processing unit
  • FIG. 37 is a flowchart showing embodiments of the operation of the data processing unit of Figures 32 to 36;
  • FIG. 38 shows a lighting system configured to monitor ageing of an artwork
  • FIG. 39 shows an embodiment of the operation of the lighting system of Figure 38.
  • FIG. 40A, 40B and 40C show embodiments of the optics of a light fixture
  • FIG. 42 shows an embodiment of the optics of a light fixture comprising a translucent optical element;
  • FIG. 43 shows an embodiment of a method of producing the translucent optical element of Figure 42;
  • various embodiments of the present disclosure relate to illumination/lighting systems adapted to be used for illuminating artworks in an exposition area, such as a room of a museum or a gallery, or other rooms, e.g. of historical or religious buildings, or even outdoor environments.
  • the disclosed lighting systems may also be used to illuminate home or industrial areas and may also be used to illuminate other objects and/or persons.
  • the term room refers to any area having a floor and one or more walls. Accordingly, the term room includes both closed environments (e.g. having four walls and an entrance/exit) and open environments, such as an open stage. Moreover, the term does not necessarily imply a room of a building, e.g. having brick or cement walls, but also refers to temporary installations, such as at a fair. Accordingly, a room may be either inside a building or outdoors. Finally, when referring to the room height, the respective height may correspond to the height of a ceiling of the room, or to infinity in the case of a room without a ceiling, such as an open stage, wherein light fixtures may be installed on separate support structures.
  • Figure 2 shows a first embodiment of a lighting system 100.
  • the illumination system 100 comprises a control system 130 and one or more (i.e. a given number n of) light fixtures 110₁..110ₙ.
  • the illumination system 100 also comprises one or more (i.e. a given number m of) sensors 120₁..120ₘ.
  • the sensors 120 may be configured to measure parameters related to the light fixture, the illuminated object 140 (e.g. damages or certain properties of the object), the environment, or the person 150 visiting the object 140.
  • the control system 130 is configured to control one or more parameters of the light fixtures 110₁..110ₙ.
  • the control system 130 may be configured to switch on or switch off one or more of the light fixtures 110₁..110ₙ.
  • the control system 130 may be configured to modify the illumination, such as the intensity, color, color temperature, illumination pattern, etc., based on pre-defined rules, signals received from the sensors 120₁..120ₘ or configuration data received via a user interface and/or a communication interface.
  • the control system 130 may perform an automated calibration of the light emitted by the light fixtures 110₁..110ₙ, or assist a user to configure the light emitted by the light fixtures 110₁..110ₙ, or may even perform an automated mesh-like configuration of a network of light fixtures 110₁..110ₙ.
  • the control system 130 may be configured to vary the light emitted by the light fixtures 110₁..110ₙ as a function of data received via the user interface and/or the communication interface and/or the sensors 120 in order to permit an interaction with a user.
  • Figure 3 shows an embodiment of an integrated lighting system 100, wherein the control system 130 may be integrated with a light fixture 110.
  • one or more sensors, e.g. a sensor 120₁, may be integrated with the light fixture 110 and/or one or more sensors, e.g. a sensor 120₂, may be external with respect to the light fixture.
  • Figure 4 shows an embodiment of the various blocks of the illumination systems 100 shown in Figures 2 and 3.
  • the illumination system 100 comprises at least one light fixture 110, a control system 130 (possibly integrated with the light fixture) and optionally one or more sensors 120.
  • the light fixture 110, the control system 130 and the optional sensor 120 comprise a respective communication interface 111, 131 and 121.
  • any wired or wireless interface may be used.
  • any digital or analog communication may be used for exchanging data between the various circuits.
  • the control system 130 may measure an analog signal provided by a sensor and/or the control system 130 may vary an analog control signal of the light fixture 110, such as a voltage or current signal indicative of a requested brightness.
  • Possible digital wired communication interfaces 111, 121 and 131 may include a Digital Addressable Lighting Interface (DALI), an Ethernet interface, and/or a Controller Area Network (CAN) bus.
  • Possible wireless communication interfaces 111, 121 and 131 include Bluetooth®, Zigbee®, Wi-Fi (i.e. wireless communication according to one of the IEEE 802.11 standards), a mobile communication interface, such as a General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS) or Long-Term Evolution (LTE) communication module, and/or optical wireless communication, e.g. via a modulation of visible and/or infrared (IR) radiation.
  • each of the communication interfaces 111, 121 and 131 may also comprise a plurality of communication interfaces.
  • control system 130 may be connected to the light fixtures 110 and optionally to the sensors 120 via any suitable dedicated or shared communication channel.
  • a communication network is formed between the interfaces 111, 121 and 131, wherein each of the communication interfaces 111, 121 and 131 is associated with a respective address.
  • the network may comprise a Local Area Network (LAN) and/or a Wide Area Network (WAN), such as the Internet.
  • the interfaces 111, 121, 131 are configured to connect light fixtures 110 with other light fixtures 110, sensors 120 and control systems 130.
  • the interfaces 111, 121, 131 may be configured to connect sensors 120 with light fixtures 110, other sensors 120 and control systems 130.
  • the interfaces 111, 121, 131 may be configured to connect the control systems 130 with light fixtures 110, sensors 120 and other control systems 130.
  • the communication interfaces 111, 121, 131 may be used only to exchange data, or the communication interfaces may also provide energy, e.g. to the light fixture 110 and/or the sensor 120.
  • a light fixture 110 comprises at least one lighting module 118 comprising one or more light sources 117 for the illumination of an object 140 or a person 150.
  • the illumination provided by the light fixture 110 may be static or dynamic, i.e. the intensity or the color or the beam spread can change over time.
  • the light fixture 110 is configured to provide an illumination with a CRI (Color Rendering Index) of 93 or higher.
  • the light fixture 110 may comprise a mounting feature to mount the light fixture to a wall 163 or ceiling 161, a track or a frame. It can be fixedly installed or portable. In various embodiments, the light fixture 110 may be produced so that its parts can easily be replaced; for example, it can be structurally configured in a modular arrangement.
  • the one or more light sources 117 may be selected from the following group of light sources or a combination thereof: light emitting diode (LED) including a phosphor conversion LED (pc-LED) using a fluorescent and/or phosphorescence substance for conversion, laser diode (LD), laser activated remote phosphor (LARP) light sources, organic light source such as OLED, or a quantum dot based light source.
  • the one or more light sources 117 are configured to emit radiation in the wavelength range between UV and infrared, but preferably in the visible range, i.e. between 400 and 800 nm.
  • the one or more light sources 117 are of one color, e.g. white.
  • the light module comprises a plurality of light sources having different colors, such as red, green and blue to create white light.
  • the light module 118 can also comprise white light sources of different color temperatures.
  • the light emitted by the light sources 117 may be combined by optics 115, e.g. to form a homogenous beam of one color.
  • the optics 115 may include one or more of the following elements: a lens and/or reflector, means for reducing glare, a diffuser or diffusive layer, optical filters for color changing, and/or a framer or shutter.
  • the diffuser has an inhomogeneous distribution of diffusing strength along the diffusive area.
  • the diffusion may be homogenous on a certain area of the diffuser and stronger, i.e. more highly diffused, on the side where the illumination of a first light fixture overlaps with the illumination of another light fixture.
  • the other light fixture may have a diffuser with stronger diffusion on the overlapping edge as well, thus resulting in a more homogenous illumination in the overlapping area.
  • the different diffusion strength can be realized, e.g., by the form of the diffuser and/or the distribution of scattering materials in the diffuser.
  • the light fixture 110 may also include one or more actuators 114.
  • the actuator 114 may comprise a motor which can modify the orientation of the light fixture 110 or the position of parts of the light fixture 110 like the optics 115 or the light source 117 to achieve a flexible beam angle.
  • the actuator(s) 114 are configured to change their position or the position of parts of the light fixture 110, such as the optics 115.
  • the optics may have a fixed position relative to the light module or the position can be changed using an actuator.
  • the optics 115 may be configured to perform a beam shaping and the actuators 114, such as motors, may be configured to modify the orientation of the light fixture 110 and/or the light sources 117 or the position, shape or properties of the optics 115. This can comprise the angle relative to a reference plane or moving the light fixture and/or the optics in a certain direction or spatial position. It can even be possible to change the form of the optics 115 and/or the light fixture 110 using an actuator 114.
  • an actuator 114 is configured to modify the form of a framer.
  • a framer is understood in the art as a shutter system that lets the light pass which fits the contour of the artwork but shuts out the rest of the light.
  • art lighting is conventionally a spotlight and emits some form of circular illumination.
  • certain artworks, especially two-dimensional artworks such as paintings are commonly rectangular.
  • shutters can be added at the edges of the luminaire that shut out the round edges of the illumination, resulting in a projected rectangular beam. The position of the shutters is not fixed, so they are adaptable to the size of the painting sought to be illuminated.
  • a framer can have any form so that the contour of the artwork is met, such as, e.g., a gobo.
  • An exemplary gobo is a stenciled circular disc used in lighting fixtures to create a projected image or pattern.
  • Conventional gobos are made of a variety of materials depending on their purpose: sheet metal, referred to as “black and white”; glass with a thin etched layer of aluminum, which can be “colored” to function as a color filter; or plastic, typically for use with LED fixtures.
  • the light module may be configured to generate a pixelized light (“digital gobo”) and e.g. comprise an array of light sources.
  • each pixel of the array can include a single LED, more than one LED, or a single or several mini-LEDs or micro-LEDs.
  • Each pixel and/or each LED can be controlled independently.
  • a liquid-crystal based system e.g. LCoS (liquid crystal on silicon) can be used to generate a pixelized light.
  • the liquid crystal system can be used in transmission or reflection.
  • a device referred to in the lighting arts as a Digital Micromirror Device (DMD) could be used as well to create a pixelized light.
  • with a pixelized light it is possible to project a defined pattern on the object without using mechanical means such as a mechanical framer to limit the geometrical extent of the illumination pattern.
  • the boundaries of the illumination can be set using sensors or they can be set according to user definitions.
  • Using a digital gobo can provide highly specific illumination for 3D-objects.
  • the form of the object to be illuminated can be provided using a depth camera (such as a time-of-flight (TOF) camera).
  • one of the sensors 120 may be such a depth camera.
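A digital gobo as described above amounts to a per-pixel on/off mask fitted to the artwork contour. The following is a minimal sketch (not part of the original disclosure), assuming a hypothetical `contour` predicate, e.g. derived from a camera image or a TOF depth image:

```python
def digital_gobo_mask(width, height, contour):
    """Per-pixel on/off mask for a pixelized light source ("digital gobo").
    `contour` is a function returning True when pixel (x, y) lies inside
    the artwork outline. Pixels outside the contour are simply switched
    off instead of being blocked by a mechanical framer or shutter."""
    return [[1 if contour(x, y) else 0 for x in range(width)]
            for y in range(height)]

# Example: a rectangular painting occupying the central region of an
# 8 x 4 pixel array (illustrative geometry)
mask = digital_gobo_mask(8, 4, lambda x, y: 2 <= x <= 5 and 1 <= y <= 2)
```

In practice each mask entry would drive one pixel of the LED array, LCoS panel or DMD mentioned above, and could carry a dimming level rather than a binary value.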
  • the lighting module 118 may thus have associated a driver circuit 116 configured to control the power provided to the lighting module 118, and in general the operation of the lighting module 118.
  • the driver 116 may provide a DC or AC current to the light source(s) 117. It can drive the light sources 117 using pulse width modulation (PWM) or any other pulse pattern modulation, or current modulation or a combination thereof. It can dim the light sources 117 individually and adjust the color temperature and the color of the light module or any other parameter relevant for the light quality, such as the CRI or the spectrum, in particular the spectral power distribution.
  • Figure 5 shows an embodiment of a driver circuit 116 and a lighting module 118.
  • the driver circuit 116 is an AC/DC or DC/DC electronic converter. Therefore, the electronic converter 116 includes two input terminals 116a and 116b for the connection to an AC or DC power supply, such as the mains, and two output terminals 116c and 116d for connection to one or more lighting modules 118.
  • the electronic converter 116 may be either a voltage generator or a current generator.
  • the lighting module 118 may be configured to be supplied with a regulated voltage or current. Accordingly, the electronic converter 116 may receive at input, via terminals 116a and 116b, e.g. an AC voltage Vin,AC, such as 110 or 230 VAC, and provide at output a regulated voltage Vout, such as e.g. 12 or 24 VDC, or a regulated current iout.
  • Figure 6 shows an embodiment of a lighting module 118.
  • the lighting module 118 includes a positive input terminal 118a and a negative input terminal 118b, for the connection to the terminals 116c and 116d of the electronic converter 116.
  • the lighting module 118 may be connected, either directly or through a cable, to the electronic converter 116.
  • the lighting module 118 is a LED module including one or more LEDs (or laser diodes) 117, connected between the terminals 118a and 118b.
  • module 118 may include a LED chain or string 117, wherein a plurality of LEDs 117₁ and 117₂ (or similarly laser diodes) are connected in series.
  • when the lighting module 118 is supplied with a regulated voltage, the lighting module 118 typically includes a current regulator 118c, connected in series with the LED string 117.
  • the current regulator 118c may be a resistor or a linear current regulator.
  • the current regulator 118c may also be implemented by current mirrors or by a switched mode current source, typically including an inductor and an electronic switch.
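For the simplest case named above, a series resistor acting as the current regulator 118c, the resistor value follows directly from Ohm's law across the resistor. A minimal sketch with illustrative values (not from the disclosure):

```python
def series_resistor(v_supply, v_forward, n_leds, i_led):
    """Size the series resistor acting as current regulator 118c for an
    LED string supplied with a regulated voltage:
        R = (V_supply - n * V_f) / I_LED
    where V_f is the forward voltage of one LED."""
    v_drop = v_supply - n_leds * v_forward
    if v_drop <= 0:
        raise ValueError("supply voltage too low for this LED string")
    return v_drop / i_led

# e.g. 24 VDC supply, six LEDs at 3.0 V forward voltage, 20 mA target:
# 6 V drop across the resistor at 20 mA, i.e. about 300 ohm
r = series_resistor(24.0, 3.0, 6, 0.020)
```

A linear or switched-mode current regulator, as mentioned in the following bullets, avoids the power dissipated in this resistor.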
  • a plurality of lighting modules 118 may be connected to the electronic converter 116.
  • the lighting modules 118 may be connected in parallel to the terminals 116c and 116d.
  • the lighting modules 118 are typically connected in series between the terminals 116c and 116d.
  • Figure 7 shows an embodiment of a generic switched mode power supply/electronic converter 116 for a lighting module.
  • the electronic converter/driver 116 includes a switching stage 116h and in case of an AC/DC converter also rectification circuit 116f.
  • the input of the rectification circuit 116f, such as e.g. a diode bridge, is connected (e.g. directly) to the terminals 116a and 116b. Therefore, the rectification circuit 116f receives at input the input voltage Vin,AC and provides at output a DC voltage Vin,DC.
  • an input filter circuit 116e configured to filter the noise produced by the electronic converter 116.
  • a filter circuit 116g such as e.g. a capacitor connected in parallel with the output terminals of the rectification circuit 116f. Therefore, in this case, the filter circuit 116g receives (e.g. directly) the voltage Vin,DC and provides at output a filtered voltage, typically called a bus voltage, Vbus. In this case, therefore, the switching stage 116h receives at input the voltage Vbus.
  • an electronic converter with power factor correction (PFC)
  • the switching stage 116h includes one or more electronic converters, adapted to control the current flow through a reactive element R116.
  • a reactive element R116 is a resonant circuit, including one or more inductive elements L116, such as inductors, and one or more capacitive elements C116, such as capacitors.
  • the switching stage 116h is configured to apply an alternated voltage to the reactive circuit R116.
  • the switching frequency of stage 116h may be in a range between 1 kHz and 500 kHz, preferably between 20 kHz and 200 kHz.
  • a further filter circuit 116i may be provided at the output of converter 116.
  • the switching stage 116h is driven by a control circuit 116m, i.e. the control circuit 116m is configured to generate one or more drive signals DRV116 for driving the switching stage 116h, so as to regulate the output voltage Vout or the output current iout to a desired value.
  • the control circuit 116m may be any analog or digital circuit.
  • the driver 116 may comprise a feedback circuit 116k configured to provide a feedback signal FB116 which is determined as a function of the output voltage Vout (for a voltage generator) or of the output current iout (for a current generator).
  • control circuit 116m may be configured to generate one or more drive signals DRV116 until the feedback signal FB116 corresponds to a requested value, e.g. indicative of a requested power supply to be provided to the lighting module 118.
  • control circuit 116m may regulate the brightness of the light emitted by the lighting module 118.
  • control circuit 116m may be configured to vary the average power supply by:
  • varying the instantaneous power supply e.g. by varying the reference value for the feedback signal, thereby performing an amplitude modulation of the current provided to the lighting source 117; and/or enabling/disabling the output of the electronic converter 116, e.g. by using a pulse width modulation (PWM) or any other pulse pattern modulation, wherein the pulse pattern modulation has a frequency which is smaller than the switching frequency of the switching stage 116h.
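The two dimming mechanisms above, amplitude modulation of the regulated current and pulse pattern modulation such as PWM, combine multiplicatively in the average power delivered to the light source. A minimal sketch with illustrative values (not from the disclosure):

```python
def average_power(p_max, amplitude, duty_cycle):
    """Average power delivered to the light source 117 when the driver
    combines amplitude modulation (scaling of the instantaneous power,
    0..1) with pulse pattern modulation such as PWM (duty cycle 0..1,
    at a frequency well below the switching frequency of stage 116h)."""
    return p_max * amplitude * duty_cycle

# a 10 W source dimmed to 40 % amplitude with a 50 % PWM duty cycle
p = average_power(10.0, 0.4, 0.5)
```

Amplitude modulation alone can shift the LED color point slightly with current, which is one reason drivers often combine it with PWM.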
  • the driver circuit 116 may also comprise a plurality of switching stages (with respective feedback loop) for providing a different power supply to a plurality of light sources. For example, by individually varying the power supply provided to light sources having different colors, the driver circuit 116 is able to perform a color mixing operation.
  • the control circuit 116m may receive data identifying the requested power supply from the control system 130 (e.g. via the communication interfaces 111 and 131) or from a data processing unit (DPU) 113 of the light fixture 110, such as a digital processing unit, such as a microprocessor programmed via software instructions.
  • the data processing unit 113 may control the driver 116 in collaboration with the control system 130 or independently from the control system 130.
  • the control circuit 116m may also vary the requested power supply as a function of the data provided by a sensor 120, such as an ambient light sensor.
  • the illumination generated by the light fixture 110 may be controlled by the control system 130 via the driver 116, wherein preferably the individual light sources 117 may be controlled independently.
  • the light fixture 110 may also include a data processing unit 113 and/or a data storage device (DSD) 112, e.g. having stored the software to be executed by a microprocessor 113.
  • the processing unit 113 may also be configured to implement (at least in part) the operation of the control circuit 116m.
  • the processing unit 113 may implement (at least in part) the control system 130.
  • Figure 4 also shows an embodiment of a sensor.
  • the sensor 120 comprises an interface 121 for providing an analog signal or digital data to the control system 130 and/or the light fixture 110.
  • the sensor 120 may also be part of the light fixture 110, but it can also be separate from the light fixture 110 or it can be part of another device like a smartphone or a guiding-system used by a visitor in a museum.
  • An example of a guiding-system used by a visitor in a museum is a virtual reality headset with transparent lenses such as that sold by Microsoft Corp. under the trade designation or trademark “HoloLens”.
  • the sensor can be used to measure features of the light fixture.
  • the sensors can also be used to monitor an object or a person.
  • any sensor 120 may be used, such as a resistive, a capacitive, an inductive, a magnetic, an optical (e.g. a camera or spectral sensor), an acoustic and/or a chemical sensor.
  • the sensor 120 may be configured to process the information transmitted via the interface 121.
  • the sensor 120 may comprise a data processing unit 123, such as a digital processing unit, such as a microprocessor programmed via software instructions, and optionally a data storage device 122, e.g. having stored the software to be executed by a microprocessor 123.
  • the sensor 120 may comprise one or more actuators 124, such as motors, e.g. controlled by the processing unit 123.
  • an actuator 124 may be configured to change the position of the sensor 120 or of parts of the sensor 120. This can comprise the angle relative to a reference plane or moving the sensor in a certain direction or spatial position.
  • the control system 130 is configured to receive data from the sensor 120 and control the light fixture 110 and optionally the sensor 120. For example, the data can be analyzed by the data processing unit 133 and/or stored in the data storage device 132.
  • the sensors 120 may measure data indicative of an object 140 and/or a person 150. The control system 130 may then analyze the data and set the illumination of the light fixture 110 accordingly to illuminate the person 150 and/or the object 140.
  • the sensor 120 is configured to measure parameters of an object 140 and/or a person 150.
  • One parameter could be the form of an object 140, so that the light fixture can be set to only illuminate this form.
  • Another parameter could be the reflectance of the object 140 or parts of the object 140 and its temporal change to identify damages of the object.
  • the sensor 120 is configured to detect the presence, behavior, gestures, voice or action of a person 150.
  • the signals might be used to modify settings of the light fixture (e.g. increase the light intensity when a person approaches the object) or to provide a warning to security personnel (when a person comes too close to the object).
  • the sensor 120 is configured to react to external stimuli (e.g. sound or music) so that the light fixture sets its illumination accordingly.
  • the settings for the light fixture 110 may also be derived from data or settings stored in the data storage device 112 and/or 132. For example, in various embodiments, the data are compared to values/parameters stored in the data storage device 112 and/or 132.
  • the sensor 120 may comprise a photometric sensor or a camera to measure parameters like the intensity, the color temperature, or a 2-dimensional distribution of these parameters of the illumination system, the ambient light or the intensity of light reflected by the painting.
  • the measurement may include maximum values and time-integrated values.
  • the sensor may be used to detect critical illumination that might damage the object.
  • the sensor 120 is used to measure the on-time of the system, i.e. the overall illumination as well as the illumination at certain wavelengths.
  • the sensor may use the processing unit to analyze the data and the data storage device 122 for storing the processed data.
  • the sensor is used for predictive maintenance.
  • the sensor 120 may provide data which permit detection or prediction of a failing light module or a failing light source.
  • the term “failing” can be understood as an excessive excursion from the desired parameter. Failing in the context of present embodiments can mean that the light intensity is reduced, the light spectrum changed, and it can also mean that the light intensity can be undesirably increased which could damage an object.
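“Failing” as defined above, i.e. an excessive excursion from the desired parameter in either direction, can be sketched as a simple tolerance-band check. The 10 % band below is an illustrative assumption, not a value from the disclosure:

```python
def is_failing(measured, desired, tolerance=0.1):
    """Flag a light source whose measured parameter (e.g. intensity)
    shows an excessive excursion from the desired value in either
    direction: a dimmed-out source and an undesirably bright one
    (which could damage the object) are both treated as failing."""
    return abs(measured - desired) > tolerance * desired

assert is_failing(0.7, 1.0)       # intensity dropped 30 % -> failing
assert is_failing(1.2, 1.0)       # intensity 20 % too high -> failing
assert not is_failing(1.05, 1.0)  # within the 10 % tolerance band
```

For predictive maintenance, the same check could be applied to a trend extrapolated from stored sensor data rather than to a single measurement.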
  • the sensor 120 is configured to detect deteriorating, ageing or damaging of an object 140.
  • the sensor 120 is configured to measure temperature, humidity and/or chemical components like pollutants or other compounds that could damage the artwork.
  • the sensor can be connected to an HVAC-System to control temperature and humidity, either directly or via the control system.
  • FIG. 4 also shows an embodiment of the control system 130.
  • the control system may comprise a data processing unit 133, such as a digital processing unit, such as a microprocessor programmed via software instructions.
  • control system comprises a data storage device 132, e.g. having stored the software to be executed by a microprocessor 133, i.e. the control system can comprise a computer program configured for performing a method of controlling and/or calibrating the light source 117 and/or optics 115 and/or lighting fixture 110 and/or sensor 120, wherein the computer program can be stored in the data storage device.
  • control system 130 may be implemented with any processing system, possibly also including distributed processing systems, such as cloud-based systems, e.g. including a personal computer, a local and/or remote server, and/or a mobile device, such as a smartphone or tablet.
  • the control system may be available locally, e.g. on a server or a smartphone, it can also be installed non-locally, i.e. in the cloud. Parts of the control system can be distributed among several units (e.g. cloud and smartphone) and interact via the interface(s) 131.
  • the data storage device 132 may also be used to store and retrieve settings and parameters for the light fixture 110, settings and parameters for the sensor and data collected by the sensor.
  • the data storage device can also contain pre-defined scenes for the illumination (e.g. settings for a certain artist, or a certain object or a certain epoch).
  • the pre-defined scenes can include settings for the overall color temperature of the light fixture or settings to highlight certain aspects of an object, e.g. when using a light fixture with a pixelized light. Highlighting can be understood as to increase the intensity on the object in a certain area; or to use a marker, e.g. an arrow to point to a certain area on the object; or to provide explanatory commentary such as by text or symbols.
  • control system 130 is configured to control the light fixture 110 and/or the sensor 120 and collect data measured by the sensor 120.
  • the control system 130 may comprise for this purpose a communication interface 131.
  • the interface 131 may connect the control system 130 with the light fixture 110, the sensor 120 and other control systems (which can include other users or control systems like HVAC).
  • control system 130 comprises a user interface 134, such as one or more visual, acoustic or haptic indicators, buttons, etc.
  • a user can also be a visitor of a museum or a show.
  • the user interface 134 may be implemented with a touchscreen.
  • the user interface 134 may be a graphical user interface, it can also use sensor data, e.g. to interpret gestures or behavior of a person, or voice commands. It can comprise a display to communicate with the user.
  • the user interface may be implemented as an application for a mobile device, such as a smartphone, a tablet, a mobile computer or similar devices.
  • the control system 130 may control several light fixtures 110. These light fixtures 110 might illuminate different objects (i.e. each light fixture illuminates one object 140) or several light fixtures 110 illuminate one object 140 (e.g. from different sides or because the object is too large to be illuminated by one light fixture).
  • the light of the light fixtures 110 can be aligned so that the illuminated areas do not overlap but also do not show non-illuminated borders between the illuminated areas. Or the illumination areas can overlap. In this case, the intensity of the LEDs in the overlapping area can be reduced so that the overall light intensity on the objects is homogenous and equal.
  • the overlapping area can be detected by a sensor system, or the information can be provided by a user using a user interface.
  • control system 130 is configured to exchange settings between other control systems 130 e.g. via the interface 131.
  • the settings for the illumination of a certain painting can be transferred to another museum if the artwork will be lent to the museum to assure that it will be illuminated with the right parameters.
  • the control system 130 is configured to adjust parameters of the light fixture 110 and the sensor 120 as a function of data received via the user interface 134 or the communication interface 131.
  • the change can be done while the user is near the light fixture 110 or remotely such as over the internet.
  • the change can for instance be done using an application running on a smartphone.
  • a user can, for example, set the color temperature in Kelvin or the intensity in lumen or modify the light spectrum of the light fixture.
  • the user can also set the beam shape and the control system will control the actuators accordingly.
  • the control system 130 is configured to set one or more spectral characteristics of the light emitted by the light fixture(s) 110.
  • the control system 130 is configured to vary the spectrum considering an eye- sensitivity curve of a user and/or the age-dependent eye-sensitivity of a user.
  • control system 130 is configured to allow the user to define a schedule for the light fixture 110 and/or the sensor 120.
  • the light can be switched on at a pre-defined time, the color temperature can change over the day (considering that certain illuminations might damage an object, thus restricting the circadian illumination in intensity and/or with respect to certain wavelengths), or the light can be dimmed at certain times.
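A daily schedule as described above can be sketched as linear interpolation between set-points. The hours and color temperatures below are illustrative assumptions, not values from the disclosure:

```python
def scheduled_cct(hour, schedule=((8, 2700), (12, 4000), (18, 3000), (22, 2200))):
    """Color temperature (K) for a given hour of day, linearly
    interpolated between scheduled (hour, CCT) set-points. Outside the
    schedule the nearest end-point is held; restricting the set-point
    range also restricts the circadian variation for light-sensitive
    objects."""
    if hour <= schedule[0][0]:
        return schedule[0][1]
    if hour >= schedule[-1][0]:
        return schedule[-1][1]
    for (h0, c0), (h1, c1) in zip(schedule, schedule[1:]):
        if h0 <= hour <= h1:
            t = (hour - h0) / (h1 - h0)
            return c0 + t * (c1 - c0)

# mid-morning: halfway between the 8:00 and 12:00 set-points
cct = scheduled_cct(10)
```

The same scheme can drive intensity dimming, with per-object upper limits applied before the value is sent to the driver 116.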
  • control system 130 is configured to receive via the interfaces 131 and/or 134 data identifying the artwork 140, such as the epoch, the object or the artist, and the control system 130 may propose settings for the light fixture 110 and/or the sensor 120. It can store these pieces of information in the data storage device for later usage.
  • a camera can take a photograph of the object 140 and the control system 130 may analyze the image and identify the object, the artist, the epoch, and, on the basis of one or a combination of these sensed inputs, propose settings for the light fixture 110 and/or the sensor 120.
  • the proposed settings can be overridden by the user using the user interface.
  • the control system 130 is configured to take sensor data into account when setting the parameters for the light fixture 110.
  • the sensor 120 can measure the ambient light and the control system 130 can reduce the intensity of the light fixture so that the desired light intensity, considering either the overall intensity or the intensity at certain wavelengths, illuminates the object.
  • the sensor 120 measures the spectrum of the light and the control system 130 sets the drivers 116 of the light fixtures 110 so that a desired overall color or color temperature is reached (either measured directly at the luminaire or measured at the picture, such as through its luminance).
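The ambient-light compensation described above can be sketched as a subtraction from the target illuminance. Names and units are illustrative, not from the disclosure:

```python
def fixture_level(target_lux, ambient_lux):
    """Illuminance the light fixture must contribute so that ambient
    light plus fixture light reaches, but does not exceed, the desired
    illuminance on the object (all values in lux). When ambient light
    alone already exceeds the target, the fixture contributes nothing."""
    return max(0.0, target_lux - ambient_lux)

# 200 lux desired on the painting, 150 lux of measured ambient light
level = fixture_level(200.0, 150.0)
```

The same subtraction can be applied per wavelength band when the target is specified spectrally rather than as an overall intensity.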
  • control system is configured to provide methods to minimize the possible damage to an object 140 due to the illumination by measuring and analyzing the direct light emitted by the light fixture 110, the light reflected by the object 140 or the ambient light, either in absolute values or measuring values relative to a defined target value.
  • the control system 130 may e.g. analyze the sensor data (e.g. intensity overall or at certain wavelengths), it can store the critical parameters over time and integrate them, or it can calculate the intensity at the object using the light distribution of the light fixture and the geometry between light fixture and object.
  • the control system 130 may compare actual data with data stored some time ago (e.g., days, weeks, months, years) in the data storage device.
  • the light exposure is monitored cumulatively, somewhat analogously to the concept in health physics and radiation protection known as “dosimetry” to assess absorption of radiation over time.
  • the data relevant for the safety of the artwork may be stored in the data storage device 112, 122 or 132.
  • the data are stored in the data storage device 132 of the control system 130 and/or database which tracks the cumulative life-exposure of the object 140.
  • other parameters related to possible effects on an object such as humidity, temperature and/or vibrations or shocks during transportation can also be stored in the data storage system and/or database.
  • These data can be referenced by using one or a combination of the name of the object, some other reference provided by the user, or a signature derived from the object itself (e.g. colors and shapes measured by the camera), thus providing a digital fingerprint.
  • the stored data can be passed on if the object should be illuminated with a different light fixture or at a different place.
  • the digital fingerprint especially a digital fingerprint created under normalized illumination conditions, can be used to identify the object.
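A digital fingerprint derived from the object itself, e.g. colors measured by the camera, could be sketched as a coarse color histogram. This is one possible realization for illustration, not the patent's prescribed method:

```python
def digital_fingerprint(pixels, bins=4):
    """Coarse color-histogram signature derived from the object itself,
    e.g. camera pixels captured under normalized illumination, usable
    as a digital fingerprint to reference stored exposure data.
    `pixels` is a list of (r, g, b) tuples with 0..255 channels; the
    result is a normalized tuple so it can serve as a dictionary key."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels)
    return tuple(round(h / total, 3) for h in hist)

# two-pixel toy image: one black and one white pixel
fp = digital_fingerprint([(0, 0, 0), (255, 255, 255)])
```

Normalized illumination during capture, as the bullet above notes, is what makes two fingerprints of the same object comparable.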
  • control system 130 is configured to perform actions to protect objects 140 from being damaged by the illumination, e.g. by comparing the measured values and time-integrated values with threshold values for the respective object.
  • the actions can comprise one or a combination of reducing the illumination intensity, switching the illumination off, or providing an alarm signal.
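The cumulative, dosimetry-like exposure tracking and the protective actions described above can be sketched as follows. The lux-hour limit and the 80 % warning level are illustrative assumptions, not values from the disclosure:

```python
class ExposureDosimeter:
    """Cumulative light-exposure tracking for an artwork, analogous to
    dosimetry in radiation protection: illuminance samples are
    integrated over time and the accumulated dose is compared against
    a per-object threshold in lux-hours."""
    def __init__(self, limit_lux_hours):
        self.limit = limit_lux_hours
        self.dose = 0.0   # accumulated lux-hours

    def record(self, lux, hours):
        self.dose += lux * hours

    def action(self):
        if self.dose >= self.limit:
            return "switch_off"          # protect the object
        if self.dose >= 0.8 * self.limit:
            return "reduce_intensity"    # approaching the limit
        return "ok"

d = ExposureDosimeter(limit_lux_hours=1000.0)
d.record(200.0, 4.0)   # 800 lux-hours accumulated so far
```

Per-wavelength dosimeters (e.g. a separate accumulator for UV) fit the same pattern, since damage thresholds are wavelength dependent.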
  • control system 130 is configured to perform actions based on user behavior. For example, the control system 130 may change the illumination level in response to presence or absence of a user, such as when a person approaches the object, the intensity is increased quickly, or alternatively the intensity is reduced slowly when the person has left, or the control system 130 may provide a warning to the security personnel such as when a person comes too close to the object.
  • control system 130 is configured to track how a person 150 is moving through the exposition area and/or a sequence of exposition areas, e.g. rooms of a building, thus e.g. analyzing which objects 140 are of more interest to the visitor of a place.
  • control system is configured to localize a person in an exposition area and provide additional information about an object 140, e.g. by projecting it on the wall 163, or through a downloadable software application (“app”) on a smartphone, or by using augmented or virtual reality.
  • control system 130 is configured to store the settings of the light fixture 110 either locally or in the cloud to restore them quickly after a power failure.
  • control system 130 can support the commissioning of the light fixtures 110 and/or the sensors 120.
  • the user scans a bidimensional bar-code, such as QR-code, on the luminaire, assigns a name (instead of using an encryption code), and defines the position on a map of the building.
  • the light fixture 110 is installed and a camera, which could be attached to a light fixture 110 or stand-alone, takes a picture, then the object 140 is identified, and then the position of the light fixture 110 can be derived from the known position of the object 140.
  • control system 130 is configured to collect data about the lifetime of the light fixtures 110 and sensors 120 and/or detected damages. Artificial intelligence (“AI”) can be used to detect degradation of a light fixture based on the sensed data.
  • the control system 130 may provide this information graphically to the user e.g. by presenting the floor map and highlighting damaged fixtures or fixtures close to their end-of-life. This information could also be used by the supplier to provide a Light as a Service (LaaS) maintenance program to repair and/or replace lighting.
  • control system 130 is configured to distribute firmware updates to the light fixtures 110 and the sensors 120.
  • control system 130 provides an application programming interface (API) for controlling the illumination system by a third party and/or for third party data integration, e.g. settings for the light fixture or information about the artwork that can be presented to a user.
  • the API can also be used to control the illumination system using software (e.g. an app) on a third-party device such as a smartphone.
  • control system 130 uses machine learning (also referred to as artificial intelligence) and/or data mining to optimize the illumination settings for a given object 140; e.g. the control system can analyze pictures taken by users, possibly shared on social media together with commentary in text form describing their impressions, critiques or other feedback, to optimize the light settings.

Illumination of an artwork based on viewer’s eye characteristics
  • Figure 8 shows an embodiment of an illumination system 100 configured to change the illumination of an artwork based on a viewer’s eye characteristics.
  • a light fixture 110 may comprise pre-set or adjustable light sources 117, in particular with respect to light intensity and color, orientation of irradiation etc.
  • the light fixture 110 may comprise a variety of optical elements 115, such as lenses, diffuser, color filter, etc., and/or sensors, such as temperature, humidity, light intensity, color sensors, as well as sensors for people tracking, such as by using IR-radiation (emission and sensing).
  • Operation of the light fixture is controlled by a driver circuit 116 and optionally a data processing unit 113 and/or a control system 130, which e.g. may be configured to monitor the operation of the light sources 117 and/or control the intensity and color of the light emitted by the light source 117.
  • such components may have associated one or more actuators controllable by the control system 130 and/or the data processing unit 113.
  • one or more actuators 114 may be configured to control movement and orientation of the light sources 117 and/or the optical elements 115.
  • the light fixture 110 may have a control unit 113 and a driver 116 for a plurality of LED light sources 117 having different colors.
  • the control unit 113 may be configured to adjust the resulting color temperature of the irradiated light to a certain value.
  • the requested value may be received from the control system 130.
  • the light fixtures 110 may be configured to provide a tunable light output in particular a tunable white light output.
  • such combination of LED light sources 117, LED driver 116 and control unit 113 is configured to adjust the color temperature of the total light output so that the color point lies on or in the vicinity of the Planck Curve, also called Planckian locus, e.g. in a CIE color diagram, such as the CIE 1931 color diagram.
  • the light sources 117 are selected (and the driver 116 is configured), such that the processing unit 113 may adjust and keep the resulting color temperature approximately on and along the Planck Curve, for example, in the range between 1800 K and 6000 K.
  • the term approximately indicates that the deviation from the Planck Curve is at most 2 MacAdam Ellipses, preferably within 1 MacAdam Ellipse, further preferably within a circle with a diameter that equals the minor axis of such a MacAdam Ellipse and the circle center being located on the Planck Curve.
  • color mixing may be obtained by regulating the proportion between the intensity of light emitted by light sources having different colors. Such regulation may be obtained by controlling the average power supply provided to the light sources, e.g. by applying a PWM or other dimming methods to the light sources.
  • color mixing may be performed via feed-forward control, e.g. based on a lookup table, such as a PWM-LED-Color-Look-Up-Table, assigning given power-supply parameters to the light sources 117 based on a requested color temperature.
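One simple feed-forward mixing rule for a two-channel tunable-white fixture is linear interpolation in reciprocal-CCT (mired) space. This is a common first-order approximation that keeps the mixed color point near the Planckian locus, not necessarily the method of the disclosure; the channel CCTs below are taken from the 1800 K to 6000 K range mentioned above:

```python
def mix_ratio(cct_target, cct_warm=1800.0, cct_cool=6000.0):
    """Relative drive levels for the warm and cool white LED channels
    of a tunable-white fixture to approximate a target correlated
    color temperature, mixing linearly in mired (1e6 / CCT) space."""
    m_t = 1e6 / cct_target
    m_w = 1e6 / cct_warm
    m_c = 1e6 / cct_cool
    cool = (m_w - m_t) / (m_w - m_c)   # fraction of the cool channel
    return 1.0 - cool, cool            # (warm, cool), summing to 1.0

# a 3000 K target needs a bit more cool than warm channel
warm, cool = mix_ratio(3000.0)
```

In practice these fractions would be converted to PWM duty cycles via the lookup table, and a color sensor 120 would close the outer feedback loop described below.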
  • a sensor 120 (e.g. integrated in the light fixture or arranged in the vicinity of the object to be illuminated) may be used to measure the emitted color.
  • this color feedback loop is an outer/second feedback loop with respect to the inner/first feedback loop used to regulate the power supply to the light sources 117.
  • an artwork may be any object 140, such as a painting, a picture, a sculpture, an assortment of various pieces of art, people and the like.
  • An artwork may even encompass self-lit objects.
  • the object 140 may be located in any kind of exposition area 160, such as a room of an art museum or an art gallery or exhibition or somewhere else (inside and outside of a building).
  • a user/viewer is a person who wants to experience and see an artwork 140, be it a painting, a sculpture or something else. Such an experience can be on-site (i.e. live, in person) or remote (i.e. virtual). A user might want to see and experience an artwork as intended by the artist. However, fulfilling such a wish is not easily achievable, since many factors play a role.
  • a user usually perceives an artwork 140 with the eyes, although other sensory inputs, like hearing, touch and smell, may be used as well. It is known from scientific research that human eyes degenerate over time due to a variety of causes, like macular degeneration, cataracts, arcus senilis, corneal changes, and decreasing performance of photoreceptors. All this may result in decreased visual acuity, declining sensitivity of the visual field, decreasing contrast sensitivity, and an increased dark adaptation threshold. One effect could be that differentiating between a light blue color and a yellow-green color becomes more difficult for an elderly person. All this may affect the user experience when visiting an exposition area 160 or seeing a piece of artwork 140 remotely on a display.
  • an artwork 140 may be illuminated by natural and/or artificial light for viewing purposes.
  • an artwork 140 may be illuminated with various kinds of light fixtures 110 at the same time, under different irradiating angles and beam diameters.
  • illumination setting may be described by an Illumination Matrix (IM).
  • the control system 130 may have associated one or more databases 200, comprising a light fixture database 202 having stored data identifying the installed light fixtures 110 configured to illuminate a given artwork 140.
  • these data may include, for each light fixture in a given exposition area, data identifying light intensity, frequency/color, polarization, direction and/or beam spread.
  • an exposition area database 204 may have stored the characteristics of the exposition area 160, such as the position of the artworks, and/or an artwork database 206 may have stored the characteristics of the artwork 140, such as the dimensions, respective color and reflectivity data, etc.
  • the control system 130 may calculate/estimate the illumination of an artwork 140. Additionally or alternatively, at least part of the actual illumination conditions of an artwork 140 may also be measured via one or more sensors 120.
  • the light fixture database 202 may contain data specifying the range of each modifiable parameter of a light fixture 110, such as the possible light intensity and color range. These data may also be provided in addition to the data identifying the current illumination condition (e.g. in case of estimation).
  • the database 200 may be a single database or a distributed database. Moreover, the database 200 or portions of the database 200 may be stored within the control system 130 or within another computer, such as a remote database server, accessible by the control system 130. In various embodiments, data compression algorithms may be used to reduce the dimension of one or more of the databases 200.
  • a camera 230 acquires one or more images 240 of the artwork 140, which are transmitted to a remote device 250.
  • an image database 216 may be used to store the images 240 of the various artworks 140, which may be seen remotely.
  • data compression algorithms may be used to reduce the dimension of the images 240.
  • image taking may be done under various lighting situations and under various positions and angles with respect to the illuminated artwork.
  • the image measurement is functionally related to the artwork illumination (ambient and artificial), the reflectivity features of the artwork and the image measurement characteristics (see also the related description of the light fixture database 202 and artwork database 206).
  • the camera 230 (e.g. a CCD or CMOS camera) needs to have filter segments placed in front of the image-sensor chips, for example RGB filters in a Bayer configuration/setting, in order to allow for color perception and respective measurement.
  • filter segments, sensor chips and signal processing will influence the image acquisition.
  • a camera database 214 may have stored the characteristic of the camera 230 used to acquire the image 240.
  • a Display Device has the ability to transform digital image contents into a visual representation, i.e. it has the ability to transfer digital image data (however complex) into electronic commands for display or projection pixel control.
  • the display of such a display device may be of any kind of currently used (or anticipated of future use) display or displaying units, for example LCD-Displays, AMOLED-Displays, Laser Projection Devices, Plasma Screens, Augmented and Virtual Reality Glasses, and the like.
  • Each display has its own color representation possibilities, viewing angles, brightness range and related limitations. In this respect, a display device (e.g. a laser projector) often has some kind of control unit with data processing, data storing and data communication capabilities that is configured to select, calculate and apply Optical Transfer Functions (OTF) to a provided Digital Image Representation, so that a pixelated image can be properly displayed.
  • a display device database 212 may have stored the characteristic of the display device 250.
  • ITF: Image Transfer Function
  • optical transfer function describes how an incoming light distribution is changed when passing through or being deflected or reflected by the next optical element or passing from one display medium to another.
  • DDS: Image Representing Digital Data Set
  • a process chain may have several optical transfer functions.
  • the camera database 214 may have stored data identifying the optical transfer function of the camera and the display device database 212 may have stored data identifying the optical transfer function of the display device.
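The chain of optical transfer functions described in these bullets (camera, then display, then the viewer's eye) behaves like a function composition. The following sketch is not from the source; the per-pixel gamma functions are simplified stand-ins for real OTFs, used only to illustrate the composition idea.

```python
from functools import reduce

def compose(*stages):
    """Compose per-pixel transfer functions left to right:
    the first stage is applied first (e.g. camera OTF, then display OTF)."""
    return lambda v: reduce(lambda acc, f: f(acc), stages, v)

# Simplified, assumed stand-ins for real OTFs, acting on a
# normalized intensity value in [0, 1]:
camera_otf  = lambda v: v ** (1 / 2.2)   # camera gamma encoding
display_otf = lambda v: v ** 2.2         # display gamma decoding

# Complete process chain from scene intensity to displayed intensity:
pipeline = compose(camera_otf, display_otf)
```

Because the stages here are exact inverses, the pipeline reproduces the input; in practice each stage introduces distortions, which is why the databases 212 and 214 store the individual transfer functions so they can be corrected.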
  • the viewer’s eye has an optical transfer function. It may therefore be appreciated if the characteristics of a viewer’s eyes are known, either by previous measurements or by actual on-site measurements (at least in regard to some visual aspects), and taken into account when selecting the best illumination of an artwork or when viewing a virtual representation of an artwork.
  • Personal Eye Data may be based on a variety of measurements and testing procedures for visual perception. Of course, such an illumination may take into account the combined effect of natural (sunlight) and artificial lighting.
  • the database 200 may comprise a visitor’s eye database 210 having stored data identifying the optical transfer function of the visitor’s eye.
  • an artist (i.e. any person, or even an artificial-intelligence computer, who wants to present an artwork; the artist may or may not correspond to the creator of the artwork) may specify preferred lighting settings for the artwork, for example with respect to the illumination condition, which may correspond, e.g., to the spectral characteristics of candle light, sunrise, midday or evening lighting scenarios, or lighting under a certain angle and beam spread, or changing colors according to a pre-defined or ad-hoc generated time table or schedule.
  • the artist may thus provide data identifying these lighting settings, e.g. by specifying a color temperature or color location, or based on reference lighting settings, a requested Image Representing Digital Data Set (DDS), spectral reference data or color tables.
  • the database 200 may comprise an artist illumination database 208 having stored the data identifying requested lighting settings, such as an Artist Lighting Scenario Matrix (ALSM).
  • the operator of the exposition area receives these settings (from the database 208) and transforms them into operating commands for each of the installed light fixtures 110 so that such input data and related lighting settings are adjusted and represented as best as possible.
  • a user may interact with on-site lighting conditions or, being remotely, with remote display functions in combination with off-site lighting conditions, both affecting color perception and spectral accuracy of what is to be seen.
  • some of the visual-related conditions for on-site viewing of an artwork include: illumination of the artwork (natural and/or artificial; steady or changing), perception conditions of the viewer, the artist's preferred lighting settings, etc.
  • some of the visual-related conditions for off-site/remote viewing of an artwork include the ambient lighting at the viewer's site, the display type with its color setting, the viewer APP and GUI interfaces, the transfer function for digital image data into pixelated display data, etc.
  • a viewer may also specify preferred settings, e.g. via a user interface (such as a smartphone APP or any other GUI), which permits setting preferred lighting conditions (at least within specified boundaries, so that lighting does not affect an artwork in a harmful way).
  • these data may be stored in the database 210.
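The "specified boundaries" that keep user-preferred lighting from harming an artwork can be enforced by clamping the request against per-artwork conservation limits. The limit values below are assumptions for illustration (the 50 lx figure for highly light-sensitive works is a commonly cited conservation guideline, used here only as an example), not values from the source.

```python
# Illustrative conservation limits (lux) per assumed artwork sensitivity class.
MAX_ILLUMINANCE_LX = {
    "high_sensitivity": 50,     # e.g. watercolors, textiles (assumed guideline)
    "medium_sensitivity": 200,  # e.g. oil paintings (assumed)
    "low_sensitivity": 300,     # e.g. stone sculpture (assumed)
}

def clamp_user_request(requested_lx: float, sensitivity: str) -> float:
    """Clamp a viewer-requested illuminance to the artwork-safe bound."""
    limit = MAX_ILLUMINANCE_LX[sensitivity]
    return min(max(requested_lx, 0.0), limit)
```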
  • a user may be a person with a given User Visual Perception (UVP) and optionally given user preferences for lighting conditions and color perception (UVPP).
  • the user visual perception is specified via eye deficiency data, which are stored in the database 210.
  • a human eye is an organ of perception.
  • a human eye is a very complex biological product that finally transfers signals to the visual cortex area of the brain.
  • Color perception is based on many influencing factors (both physiological and psychological). See, for example, https://www.sciencedirect.com/topics/engineering/colour-perception.
  • a user may know the individual Eye Deficiencies Matrix (EDM) based on a variety of measuring methods. For this purpose, an eye-testing device and method may be used. Such a device/method measures visual eye characteristics, like color perception. Due to complexity, and certainly also due to an incomplete understanding of eye function and color recognition, only some aspects have so far been accessible for research and testing. Since every human is affected by ageing, certain eye deficiencies occur over time, like presbyopia (difficulty with near-vision focus), cataracts, glaucoma and macular degeneration. Some of these conditions may be measured with standard test procedures like Ishihara plate tests, Holmgren tests and Farnsworth tests.
  • such matrices may be stored in the database 210.
  • An on-site or off-site user may be willing to provide the personal UVP and EDM data to the operator of the exposition area or to a display device provider and allow use of such data for changed or improved image display and lighting setting.
  • the eye characteristics of a plurality of visitors are stored in the database 210, wherein respective visitor’s eye characteristics are associated with a univocal visitor code.
  • a visitor may provide his/her univocal code to the operator of the exposition area, and the operator may obtain the respective eye data from the database 210. The code may be stored in any suitable manner on a support 224, such as an alphanumeric string, a barcode, a bi-dimensional barcode (such as a QR code), a magnetic support, or a short-range wireless transmitter, such as a Radio-Frequency Identification (RFID) or Near-Field Communication (NFC) transponder or a Bluetooth® transceiver.
  • the visitor’s eye data may also be stored on a portable memory support, such as a memory card 220 (e.g. an SD card or chip card) or a smartphone 222.
  • an artist may store preselected lighting conditions, e.g. in the form of an Artist Lighting Scenario Matrix stored to the database 208, so that it can be used by: an operator of the exposition area for adapting the characteristics of artificial light emitted by one or more light fixtures 110, or a display device APP or GUI for proper transformation of digital image data (DDS) into pixelated display settings.
  • An artist may even want to include his own personal eye deficiency data in that database 208, thus permitting a viewer to perceive an artwork as the artist perceived the object.
  • an artist creates an artwork, like a painting or a sculpture.
  • An artist may work under various lighting conditions, like candle light (Michelangelo) or natural sunlight in the morning, during midday or in the evening, or with artificial light (fluorescent lamps, halogen lamps, LEDs).
  • An artist (either the creator of the artwork or another artist) may thus want to specify the illumination and color perception or various other settings of illumination (angle of incoming light, beam spread, spectral distribution etc.) that refer to how the artist wants the artwork to be viewed.
  • an artist can specify illumination settings and communicate these for an exhibited artwork.
  • these conditions can be stored in the database 208. Specifically, knowing a viewer’s eye characteristics in regard to color perception and other optical characteristics can be used to improve visual reception by adjusting the lighting setting of the light fixtures in the exposition area. Knowing a user’s preferred lighting setting may further improve user experience.
  • the control system 130 may be configured to vary the characteristics of the light emitted by the light fixtures, i.e. the control system 130 may send one or more control commands to the light fixtures 110.
  • the control system 130 may adjust/optimize the illumination of an artwork when the user is there in person, thereby taking into account the viewer’s eye characteristics so that the perception of the artwork is improved, e.g. closer to what the artist experienced, thereby, e.g., compensating for eye degradation.
  • an illumination unit may be adjusted to emit light with a higher red or green content (spectral intensity) or light with a shifted color location, for example along the Planck curve or freely in color space. Therefore, depending on the viewer's data (PED), a piece of art may be illuminated with adjusted light characteristics, here called Preferred Viewer Illumination (PVI).
  • with this approach, a person may store his or her personal eye data electronically, e.g. on a chip card 220 or in a smartphone app 222, so that these data can be transferred to a lighting control unit 130 and used for proper (improved) illumination.
  • a user may transmit the eye characteristics upfront to a museum or art gallery, which may then store the data in the database 210.
  • these data are used to provide personalized lighting (PVI) for an individualized viewing experience, i.e. the control system 130 may use a sensor 120 to determine which visitor is viewing a given artwork, obtain the respective eye data from the database 210, and then adapt the illumination of the artwork as a function of the obtained eye data.
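The source does not specify how the obtained eye data map onto light settings. One simple, assumed interpretation is that the EDM provides per-channel relative sensitivity losses, which the control system compensates by boosting the corresponding drive levels. The sketch below is illustrative only; function and field names are hypothetical.

```python
# Assumed interpretation of an Eye Deficiency Matrix: per-channel relative
# sensitivities (1.0 = full sensitivity), compensated by inverse scaling.
def preferred_viewer_illumination(base_rgb, edm_sensitivity):
    """Scale base RGB drive levels by the inverse of the viewer's
    per-channel sensitivity, then renormalize so no channel exceeds 1.0."""
    compensated = [b / s for b, s in zip(base_rgb, edm_sensitivity)]
    peak = max(compensated)
    if peak > 1.0:                      # keep drive levels within [0, 1]
        compensated = [c / peak for c in compensated]
    return compensated
```

For a viewer with halved blue sensitivity, the blue channel is driven twice as hard (within the fixture's limits), so the perceived color balance stays closer to the intended one.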
  • Such a personalized lighting setting might even take into account the actual viewing position, i.e. the lighting setting is adjusted to the actual user position, which can, for example, be determined by using camera detection or face or body recognition (e.g. by a 3D-face infrared imaging method as currently used in certain smartphones).
  • a user may also change in real-time the illumination conditions, e.g. via the user interface of the control system 134 or by using a smartphone APP. For example, in this way, a visitor may view an artwork under different lighting scenarios and select some (or all) of them according to preference.
  • the controlling unit of the chosen display device 250 may use its own display settings, as well as assumed or measured degradation characteristics and other factors which influence the optical transfer function of the display device 250, when transforming and applying digital data to a display, such as AMOLED or LCD based displays, displays using micro-LEDs (µLEDs), laser projection devices, virtual displays, augmented reality glasses, etc.
  • the optical transfer function of the display device 250 may be adapted in order to determine/calculate a corrected optical transfer function taking into account the display setting and optionally degradation characteristics, which e.g. may be estimated based on an operation time of the display device.
  • the image data 240 may be adapted either prior to transmission (e.g. via the control system 130) or within the display device 250 itself. For example, this is shown schematically in Figure 8, wherein the control system 130 and/or the display device 250 may access the display device database 212.
  • the display device 250 may receive the viewer’s eye data, such as an Eye Deficiency Matrix (EDM) data, which may be stored in the device 250, in the database 210 or a portable memory 220/222.
  • the display device 250 may receive preferred user settings via a user input and/or via an automated data exchange with the control system 130.
  • the display device 250 may comprise or have associated a sensor 120’ configured to measure the ambient lighting (e.g. level of illumination, color temperature).
  • the sensor 120’ may be a camera sensor of the device 250.
  • the image data 240 may be adapted in order to show a graphical representation similar to the artist’s preferred light setting. All this means that care should be taken that a representation of an artwork on a display comes as close as possible to the original perception (or the intended perception) of the artist. This also means that when there are severely limiting factors (like display degeneration) a best fit approach (image transfer function) may be employed.
  • the described enhanced-user-experience method permits an improved, adjusted or individualized image perception for on-site or off-site viewers based on personal visual perception, the viewer's personal preferences, ambient light settings (both natural and artificial), on-site lighting of an exposition area, off-site ambient lighting conditions, display or projection devices, other image or optical transfer functions, user interfaces, and the artist's preferred lighting settings.
  • various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command.
  • the method comprises the steps of: obtaining data identifying requested spectral characteristics, obtaining data identifying a viewer’s eye characteristics,
  • the lighting systems 100 described herein may be used to illuminate art objects 140, such as classic and modern paintings, sculptures, drawings, photographs, textile compositions, various sorts of canvases, etc., whereby the art objects 140 may be placed in an exposition area, such as an art gallery or museum, both indoor and outdoor, and in various compositions.
  • the light fixture 110 may include a spotlight with high intensity or low intensity, it can have a fixed beam angle or a variable beam angle, the color temperature or color coordinates can be variable/changeable or fixed, and the light fixture can be equipped with a framer or gobo.
  • the light fixtures 110 are equipped with a variety of light emitting diodes 117 that either work as direct emitting or phosphor-converted LEDs.
  • each of the lighting fixtures 110 may be equipped with a variety of LEDs 117, including, for example, phosphor converted LEDs that emit a whitish light with different color temperatures, or direct emitting red, green, blue, lime, or amber LEDs.
  • artworks 140 often should be illuminated with specific lighting conditions. This may thus influence the decision which light fixture 110 or combination of light fixtures 110 should be used to illuminate artworks 140 in an exposition area 160.
  • this decision should take into account the kind of artwork, e.g. paintings, photos, drawings, etc., its size and environmental factors, such as, in case of a room 160, the height of the room, wall color and/or reflectivity, and/or the brightness/illumination in the room.
  • the decision which illumination system 110 should be used is done manually, e.g. by the operator of the exposition area.
  • the responsible person takes such a decision usually based on recommendations given by light designers or light suppliers.
  • providing these recommendations means additional effort and cost which can be a significant cost factor especially for smaller light installations.
  • a method which (at least in part) automatically assesses and evaluates the exposition situation and provides a recommendation on which kind of illumination (fixture type, fixture combination, operating parameters, placement of fixtures relative to the art object) should be used.
  • such a method may be implemented with a software program to be executed by one or more computers.
  • a computer comprises a data processing unit, a data storage and a display with a graphical user interface (GUI).
  • the method may be implemented with a web application executed by a Webserver and/or an APP to be executed by a mobile device, such as a smartphone or tablet.
  • the computer may also comprise a photo-electric sensor, such as a camera, e.g. the camera of a smartphone or tablet.
  • a tool (a device with implemented software programs and user interface)
  • the tool is configured to determine a recommendation for suited light fixtures 110 for a specific exposition area 160 based on some data input provided by a user.
  • the parameters of light fixtures 110 already installed in the exposition area 160 and/or installable in the exposition area may be stored in a light fixture database, such as the database 202 of Figure 8.
  • the light fixtures 110 installed or to be installed may include (at least): a first spotlight corresponding to a high-intensity spotlight with warm color, e.g. down to 1800 K, a second spotlight corresponding to a low-intensity spotlight, a first luminaire corresponding to a luminaire with a framer or a gobo, and a second luminaire with a wide angle of illumination, like a wall-washer.
  • the light fixture database 202 may comprise for each of these light fixtures 110 one or more of the following data:
  - brightness data, such as data identifying a minimum brightness level and a maximum brightness level adapted to be emitted by the respective light fixture;
  - spectral data, identifying light colors adapted to be emitted by the respective light fixture, such as a color temperature or a color coordinate;
  - optics data, identifying a light transfer function of one or more optical elements of the respective light fixture, such as a reflector, diffusor, lens, shutters and/or framers.
  • one or more light fixtures may also support a plurality of configuration conditions having different characteristics (based on the configuration settings).
  • the data identifying the characteristics of these light fixtures preferably comprise data for the plurality of configuration conditions.
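A record of the light fixture database 202 as described above might be modeled as follows. The field names, types and sample values are illustrative assumptions, not prescribed by the source.

```python
from dataclasses import dataclass, field

@dataclass
class LightFixtureRecord:
    """One entry of the light fixture database 202 (field names are
    illustrative only)."""
    fixture_id: str
    min_brightness_lm: float                     # brightness data
    max_brightness_lm: float
    cct_range_k: tuple                           # spectral data, e.g. (1800, 6000)
    optics: list = field(default_factory=list)   # optics data: reflector, gobo, ...

# Hypothetical sample records for the fixture types mentioned above:
db_202 = [
    LightFixtureRecord("first_spotlight", 100.0, 5000.0, (1800, 6000), ["framer"]),
    LightFixtureRecord("second_luminaire", 50.0, 1500.0, (2700, 4000), ["diffusor"]),
]
```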
  • the tool is configured to show the lighting effects as they would appear in one or more various scenarios, such as in a standardized environment (size, height, wall, floor and ceiling properties), or in the specified location (room) with or without the art objects.
  • the tool may use pictures (graphical representations) of the art objects 140. Such graphical representations can be obtained from the taken picture (see below) or from a database of art objects 140.
  • the tool may use for this purpose the exposition area database 204 and/or the artwork database 206 and/or the image database 216.
  • the tool may thus show what the illuminated artworks 140 would look like using the recommended light fixture 110 or combination of light fixtures 110.
  • the tool is also configured to change this recommended configuration, e.g. in order to provide examples of what the artworks 140 would look like using different light fixtures or illumination settings, thus giving the responsible decider and/or designer the possibility to choose the requested illumination system 100 according to his liking. As mentioned before, this selection may include both already installed light fixtures 110 and light fixtures which need to be installed.
  • the tool is configured to then automatically determine the technical specification of the recommended and selected light fixtures and settings (operational, positional, orientation).
  • this technical specification is then provided to the control system 130, which may generate one or more control commands for the light fixtures 110 in order to generate the requested illumination.
  • the decider is not required to have any real technical understanding to create the technical specification and control commands for the chosen illumination system 100.
  • Figure 9 shows an embodiment of the operation of the tool.
  • the tool receives at a step 302 data identifying characteristics of the exposition area 160 and the artwork 140, and determines at a step 304 a recommended configuration of light fixtures 110.
  • the tool may show at the step 302 a graphical user interface (GUI) for acquiring the characteristics of the exposition area 160 and the artwork 140.
  • the tool shows questions (provided by the software program) e.g. requesting the insertion of the characteristics of the exposition area, such as room size, material of floor, walls, ceiling, and the kind of artwork(s) to be illuminated.
  • the tool acquires at least a room height.
  • this parameter may correspond to the actual height of a room representing the exposition area or the distance between the floor and a truss used to mount light fixtures.
  • these questions may be configured as a decision tree.
  • the tool may show in a step 320 a screen for selecting the type of artwork 140, such as:
  - selection S1: painting, photograph, textile,
  - selection S2: Old Master,
  - selection S3: Modern art,
  - selection S4: Statue,
  - selection S5: other 3D objects.
  • the type of artwork 140 may also be selected based on other characteristics, such as the material of canvas, color pigments, frame material, etc.
  • in case of a painting, photograph or textile (output “S1” of the step 320), the tool proceeds to a step 322 for selecting whether the room height is greater than a given threshold, e.g. 5 m.
  • the tool may select an installation configuration INST2, e.g. comprising (and preferably consisting in) the first luminaire mounted or to be mounted at the ceiling of the room.
  • the tool may select an installation configuration INST1, e.g. comprising (and preferably consisting in) the first luminaire mounted on a support to lower the luminaire to a distance of 3 m with respect to the floor.
  • the tool selects an installation configuration INST5, e.g. comprising (and preferably consisting in) the first spotlight.
  • the tool proceeds to a step 326 for selecting whether the room height is greater than a given threshold, e.g. 5 m.
  • the tool may select an installation configuration INST7, e.g. comprising (and preferably consisting in) the first spotlight.
  • the tool may proceed to a step 328 for selecting whether the room is a bright or dark room.
  • the tool may select an installation configuration INST8, e.g. comprising (and preferably consisting in) the second luminaire and a second spotlight.
  • the tool may select an installation configuration INST9, e.g. comprising (and preferably consisting in) the second spotlight.
  • the tool selects an installation configuration INST6, e.g. comprising (and preferably consisting in) the first luminaire with a gobo.
  • the tool proceeds to a step 324 for selecting whether the room height is greater than a given threshold, e.g. 5 m.
  • the tool may select an installation configuration INST4, e.g. comprising (and preferably consisting in) the second spotlight.
  • the tool may select an installation configuration INST3, e.g. comprising (and preferably consisting in) the first spotlight.
  • Figure 11 shows an alternative embodiment, wherein the selection of installation is organized in a look-up table 330.
  • the look-up table receives at input an artwork type “ATYPE” and an indication whether the room height is smaller or greater than a given threshold, e.g. 5 m.
  • the tool may determine the recommended installation as a function of the artwork type and the room height.
  • the look-up table may perform the following mapping:
  - drawing (ATYPE1) and height ≤ 5 m: first luminaire,
  - drawing (ATYPE1) and height > 5 m: first luminaire mounted at a height of 3 m with respect to the floor,
  - ATYPE5 and height > 5 m: first spotlight,
  - statue (ATYPE6) and height ≤ 5 m: first luminaire with gobo,
  - statue (ATYPE6) and height > 5 m: first luminaire with gobo,
  - other 3D objects (ATYPE7) and height ≤ 5 m: second spotlight,
  - other 3D objects (ATYPE7) and height > 5 m: first spotlight.
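The mapping of the look-up table 330 can be sketched as a dictionary keyed by artwork type and a height flag. Only the combinations spelled out in the text are included; the other artwork types and missing height combinations are simply absent, so the lookup returns no recommendation for them.

```python
# Look-up table 330: (artwork type, room height greater than 5 m?) -> installation.
# Only the combinations listed in the source are included.
LOOKUP_330 = {
    ("ATYPE1", False): "first luminaire",
    ("ATYPE1", True):  "first luminaire mounted at a height of 3 m",
    ("ATYPE5", True):  "first spotlight",
    ("ATYPE6", False): "first luminaire with gobo",
    ("ATYPE6", True):  "first luminaire with gobo",
    ("ATYPE7", False): "second spotlight",
    ("ATYPE7", True):  "first spotlight",
}

def recommend(atype: str, room_height_m: float):
    """Return the recommended installation, or None if the table has no entry
    for the given combination."""
    return LOOKUP_330.get((atype, room_height_m > 5.0))
```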
  • other parameters such as color temperature, soft or sharp edges, or the size of the artwork may be included in the decision tree or look-up table.
  • the matrix/look-up table may also take into account the brightness of the room.
  • the characteristics of the exposition area 160 and/or the artwork 140 may be obtained by acquiring, via a camera of the device (or another device), an image of the exposition area 160 and/or the artwork 140, and extracting the respective characteristics from the image.
  • images of the artworks 140 may also be stored already in an image database 216.
  • the characteristics of the exposition area 160 and/or the artwork 140 may be stored in an exposition area database 204 and artwork database 206.
  • the tool may permit selecting the image of the artwork, the characteristics of the exposition area stored in the database 204 and/or the characteristics of the artwork stored in the database 206.
  • the artwork database 206 and the image database 216 may also be linked, wherein selecting an artwork also permits obtaining a respective image of the artwork, and vice versa.
  • these databases may be stored within the device or remotely. For example, the selection may be done by using a univocal code.
  • such a univocal code indicative of an artwork or an exposition area may be stored in any suitable manner on a support, such as an alphanumeric string, a barcode, a bi-dimensional barcode, such as a QR code, a magnetic support or a short range wireless transmitter such as an RFID or NFC transponder or a Bluetooth® transceiver.
  • This support, which may be e.g. an adhesive applied to the artwork 140 or the exposition area 160, may then be inserted manually in the tool and/or read automatically, e.g. by using a camera, or an NFC or Bluetooth® transceiver of the device.
  • the univocal artwork code may also be obtained from a distributed blockchain ledger.
  • the above described methods for obtaining the required information may also be combined in any suitable manner.
  • the acquired information may include: the characteristics of the exposition area 160, such as the height of the ceiling or the color of a wall, e.g. whether it is a bright or dark color; the characteristics of the object 140 to be illuminated, e.g. whether it is an old master painting, a modern painting, a sculpture, a drawing, a photograph, or a textile; the size of the object; the effect needed, e.g. whether the edges of the artwork should be illuminated with sharp or soft edges, and whether just one artwork should be highlighted or several artworks should be illuminated at the same time; and optionally a graphical representation/image of the object 140.
  • “Sharp edges” means that outside the art object the light intensity is abruptly reduced to zero, while soft edges provide a smooth transition from maximum to zero illumination.
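The sharp/soft edge distinction can be illustrated with a simple intensity falloff function. The linear ramp and all names are assumptions for illustration, not part of the disclosure.

```python
def edge_intensity(d_m: float, falloff_m: float = 0.0) -> float:
    """Relative light intensity at distance d_m outside the artwork border.
    falloff_m == 0 models sharp edges (abrupt reduction to zero); a positive
    value models soft edges (smooth transition from maximum to zero)."""
    if falloff_m <= 0:
        return 1.0 if d_m <= 0 else 0.0  # sharp edge: abrupt cut at the border
    # soft edge: linear ramp from full intensity at the border to zero at falloff_m
    return max(0.0, min(1.0, 1.0 - d_m / falloff_m))
```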
  • the exposition area database 204 contains the following data: dimensional data of the exposition area, such as a room height, room width and room length, and optionally brightness data of the exposition area 160.
  • the brightness data of the exposition area 160 are purely optional, because the exposition area database 204 may also comprise other data adapted to be used to estimate the brightness level or profile, such as position data of the exposition area 160, e.g. GPS position data, which may be correlated to a local time. Moreover, the brightness data may also be obtained via a light sensor 120 installed in the exposition area 160.
  • the brightness data both of the exposition area 160 and the light fixture 110 may also comprise further information such as a spectral distribution and/or a color location, and/or a Color Rendering Index (CRI) and/or a beam direction and/or a beam spread angle and/or a light polarization of the light in the exposition area 160 or the light emitted by the light fixture 110, respectively.
  • the artwork database 206 contains the following data: descriptive data, such as the name of the artwork, the name of the artist, the period or creation year of the artwork, the type of the artwork, dimensional data of the artwork,
  • global or local (e.g. pixel) color data for the artwork, such as color analysis data, spectral data, and/or image pixel data (which may also be stored in a separate image database 216 linked to the artwork database 206), optional damage data, such as a local or global damage matrix, and optional reflectivity data, such as local or global reflectivity data, e.g. a reflectivity matrix.
  • the tool may also obtain data identifying the light fixtures already installed in the exposition area 160. As mentioned before, these data may be stored already in the light fixture database 202.
  • the exposition area database 204 and the light fixture database 202 may also be linked, wherein selecting an exposition area 160 permits also to obtain the respective data of the installed light fixtures 110 and vice versa.
  • the technical information of the available light fixtures 110 may also be provided as a manual input.
  • the tool may determine a recommendation of a set of light fixtures 110 to be used for illuminating the artwork 140.
  • the tool may use for this purpose a decision tree and/or a decision matrix, i.e. two or more parameters like the kind of artwork or height of room may be used to determine the recommended light fixture(s) to be used and optionally the recommended settings for the light fixture(s).
  • the tool accesses the (local or remote) database of light fixtures 202 and the respective data are matched to one or more features of the exposition area 160 and/or the artwork 140.
  • the tool may use artificial intelligence, i.e. a machine learning method based on a reference dataset comprising a plurality of reference installations for a respective artwork or artwork type in a given exposition area, and the respective used light fixtures and settings.
  • the machine learning method may provide as output the recommended light fixtures and settings.
  • the machine learning method may directly access the reference dataset in order to determine, based on the characteristics of the exposition area 160 and the artwork 140, one or more best-matching reference installations and then display the set of light fixture(s) used for these one or more best-matching reference installations.
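The best-matching lookup described in the preceding bullets can be sketched as a plain nearest-neighbour search over a toy reference dataset. The feature encoding, dataset entries and all names are illustrative assumptions, not the disclosed method.

```python
from math import sqrt

# Toy reference dataset: feature vector (artwork type id, room height in m,
# artwork size in m) -> recommended fixtures and settings. All entries are
# illustrative assumptions.
REFERENCE_INSTALLATIONS = [
    ((1, 3.0, 1.2), ("first luminaire", {"cct_k": 3000})),
    ((6, 6.0, 2.5), ("first luminaire with gobo", {"cct_k": 4000})),
    ((7, 4.0, 1.8), ("second spotlight", {"cct_k": 3500})),
]

def best_matching_installation(features):
    """Return fixtures/settings of the reference installation closest in feature space."""
    def dist(ref):
        return sqrt(sum((a - b) ** 2 for a, b in zip(features, ref[0])))
    return min(REFERENCE_INSTALLATIONS, key=dist)[1]
```

A trained regression or classification model would replace the explicit search, but the interface (characteristics in, fixtures and settings out) stays the same.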
  • the machine learning method may use the reference dataset during a training phase in order to generate a mathematical function configured to estimate a set of light fixtures and the respective settings as a function of the characteristics of an artwork and an exposition area, i.e. the method may comprise: acquiring a training database of a plurality of illumination conditions/ reference installations comprising data identifying characteristics of a respective artwork, data identifying characteristics of a respective exposition area, a respective selected set of light fixtures and a respective selected configuration condition (settings),
  • the tool may display at a step 306 the recommended set of light fixtures and optionally the respective settings.
  • the tool may generate/render and display an image providing a visual impression of what the illuminated artwork 140 will look like using the recommended illumination system 110.
  • for this purpose, the tool may use the graphical representation/image of the artwork 140, a generic artwork which fits the category of artworks that will be displayed in the exposition area, or a modified image calculated as seen under the recommended illumination setting.
  • a generic artwork might be necessary when the illumination system is installed for the first time, i.e. with no artworks yet available in the exposition area.
  • the rendering operation may thus use the following data:
  • the characteristics of the exposition area 160 e.g. the graphical representation/image of the exposition area 160 or other data, such as the wall color, etc., or a default exposition area,
  • the characteristics of the artwork 140, e.g. the graphical representation/image of the artwork 140 or other data, such as the artwork type;
  • the characteristics of the selected light fixture(s) 110 which may be used to estimate the illumination of the artwork 140.
  • the rendering operation and/or the display operation of the rendered image may take into account the characteristics of the display device and/or the ambient illumination conditions of the device 250. For the respective description, reference may thus be made to the description of Figure 8.
  • the tool may also show at a step 308 a screen which permits to select a different set of light fixtures and/or different settings for the light fixture(s).
  • the tool may also generate/render and display an updated image providing a visual impression of what the illuminated artwork 140 will look like using the different illumination system 110.
  • for example, while the tool recommends a light fixture with a framer for drawings and a certain color temperature, the user may want to see the impression of the artwork using a simple spotlight or a different color temperature.
  • the tool may then receive at a step 310 an input indicating which set of light fixtures and settings should be used (i.e. the recommended or a different setup).
  • the tool may then determine at a step 312, as a function of the selected setup (light fixtures and settings) and the light fixtures 110 already available in the exposition area, at least one of:
  • the technical specification for the light fixtures to be installed and optionally the configuration parameters for the light fixtures 110 to be installed.
  • the technical specification may include thus data which permit to provide and install the selected illumination system.
  • the tool, when performing the set-up of the illumination of an artwork/object 140, permits to automatically select a set of light fixtures 110 and the respective (target) illumination characteristics. This may involve the use of already installed light fixtures 110, and/or the selection and installation of new light fixtures 110. As will be described in greater detail in the following, the tool may also automatically select a set of sensors 120 and, if supported, the respective sensor configuration. This may involve the use of already installed sensors 120, and/or the selection and installation of new sensors 120.
  • the tool may also determine the installation position of the light fixtures 110, preferably within a 3D model of the exposition area 160.
  • a 3D model of the exposition area 160 may already be stored in the exposition area database 204.
  • the respective data may be inserted manually, e.g. via a user interface, or at least in part automatically.
  • a camera e.g. a camera of the device having installed the software tool, is used to acquire image data of the exposition area 160, which permits to calculate a 3D model of the exposition area 160.
  • a 3D scanner such as a Light Detection and Ranging (LIDAR) system, may be used.
  • Figure 12 shows a top view of the model of a simple exposition area 160 in the form of a room having a given width, length and height.
  • methods for 3D model reconstruction from a plurality of images, e.g. a video, are well known in the art, rendering a more detailed description herein superfluous; for example, in this context may be cited documents US 2003/0072483 A1, US 8,254,667 B2 or US 10,275,945 B2.
  • the 3D model includes also the apertures of the exposition area 160, e.g. doors 165 and windows 164, through which natural or artificial light may enter.
  • the 3D model may also include color and/or reflectivity data of the surfaces of the exposition area 160, e.g. of the floor 162, the walls 163 and/or the ceiling 161 (see also Figure 1) .
  • the tool may acquire data identifying the position of the artworks 140 within the exposition area 160.
  • for example, data already stored in a database may be used, e.g. in the exposition area database 204 or the artwork database 206, or the data may be received via a manual input or at least in part automatically, e.g. by acquiring image data of the exposition area 160 and calculating the position and dimension of each artwork in the exposition area.
  • the artwork data may also include color and/or reflectivity data of the artwork 140.
  • in Figure 12 are shown three pictures/paintings 140₁, 140₂, 140₃ and a 3D sculpture 140₄.
  • the selected light fixtures 110 are positioned in the 3D model.
  • in Figure 12 are shown four light fixtures 110₁, 110₂, 110₃ and 110₄.
  • the respective installation position of the light fixture 110 is usually fixed. Accordingly, also the respective position data may already be stored in a database, e.g. the light fixture database 202 or the exposition area database 204. Alternatively, a light designer may position the already installed light fixtures 110 manually within the model of the exposition area 160.
  • the tool may determine a (recommended) orientation of each already installed light fixture 110. For example, the tool may propose to orient the light fixtures 110₃ and 110₄ in order to point with the respective optical axis towards the center of the artworks 140₃ and 140₄, respectively.
  • the tool is configured to determine a (recommended) position and orientation of each new light fixture 110 as a function of the characteristics of the light fixture 110 and the position of the artwork 140 within the exposition area 160.
  • each artwork 140 has associated respective target and/or maximum illumination data.
  • these data may be stored in the artwork data base 206 and/or the artist illumination database 208, inserted manually or determined automatically by classifying the artwork 140, e.g. as a function of an image of the artwork 140.
  • the tool may use the 3D model of the exposition area 160 and the data of the artwork 140 and the selected light fixtures 110 in order to determine a (recommended) position and orientation for each new light fixture 110.
  • the tool may determine an initial position for each new light fixture 110.
  • the tool may select the initial position by: determining a normal plane with respect to the surface of the artwork 140, wherein the normal plane is parallel to the vertical axis and preferably located at the center of the artwork 140; determining the mounting height, which (as mentioned before) may correspond (approximately) to the height of a ceiling of a room representing the exposition area 160 or the mounting height of a support structure; and determining a given distance with respect to the artwork 140, e.g. selected as a function of the characteristics of the selected light fixture, e.g. as a function of the dimension of the artwork and the beam angle of the light fixture 110.
  • the initial mounting position of a given light fixture may be determined by: determining within the normal plane a circle segment (around the center of the artwork) having a radius corresponding to the given distance to the artwork 140; determining within the plane a line parallel to the floor at the mounting height; and determining the point of intersection of the circle segment and the line.
  • the distance from the artwork may also depend on the mounting height (or similarly the vertical displacement with respect to the center of the artwork 140), because the inclination of the light fixture 110 with respect to the artwork 140 may also influence the dimension of the light spot on the artwork 140.
  • the tool may determine the initial mounting position of a given light fixture 110 as a function of the position of (the center of) the artwork 140, the mounting height and the requested distance between the artwork 140 and the light fixture 110, which in turn may be determined as a function of at least one of: the characteristics of the light fixture 110, the dimension of the artwork 140, and the mounting height.
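Within the normal plane, the intersection of the circle segment around the artwork centre with the horizontal line at the mounting height reduces to a single square-root computation. The function and parameter names below are assumptions for illustration.

```python
from math import sqrt

def initial_horizontal_offset(artwork_center_height_m, mounting_height_m, distance_m):
    """Horizontal offset, within the normal plane, of the fixture from the
    artwork centre, such that the fixture lies at mounting_height_m on a circle
    of radius distance_m around the artwork centre. Returns None if the circle
    does not reach the mounting height."""
    dz = mounting_height_m - artwork_center_height_m
    if abs(dz) > distance_m:
        return None  # requested distance too small for this mounting height
    return sqrt(distance_m ** 2 - dz ** 2)
```

For an artwork centred at 1.5 m, a ceiling mounting at 4.5 m and a requested distance of 5 m, the fixture would be placed 4 m away horizontally.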
  • the tool may then perform a light simulation of the 3D model of the exposition area 160 and determine the illumination of each artwork 140.
  • this simulation may include also at least one of:
  • the background light e.g. by simulating windows 164 and doors 165 as additional (fixed) light sources with given (fixed or variable) light characteristics;
  • the tool may determine whether the illumination of each artwork 140 corresponds to the target illumination and/or is smaller than the maximum threshold.
  • for example, if the maximum threshold is exceeded, the tool may reduce the light provided by the respective light fixture 110. Additionally, the tool may determine whether a given artwork 140 is also illuminated by another light fixture 110, and in this case vary the position of the respective light source 110.
  • the tool may vary the characteristics of the set of light fixtures 110 and/or the respective mounting position via an iterative process of simulations until the illumination of each artwork 140 corresponds to the target illumination and/or is smaller than the maximum threshold.
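The iterative process described above can be sketched as a small feedback loop: each fixture's dimming level is corrected proportionally until the simulated illuminance of its artwork matches the target within a tolerance. The simulation callback, the proportional gain and all names are assumptions for illustration.

```python
def tune_dimming(simulate, fixtures, target_lux, max_lux, gain=0.5, iterations=50):
    """Iteratively vary per-fixture dimming levels (0..1) until each simulated
    artwork illuminance is within 1 lux of its target and below its maximum.
    `simulate(levels)` is assumed to return the illuminance per fixture/artwork."""
    levels = {f: 1.0 for f in fixtures}
    for _ in range(iterations):
        lux = simulate(levels)
        done = True
        for f in fixtures:
            err = target_lux[f] - lux[f]
            if abs(err) > 1.0 or lux[f] > max_lux[f]:
                done = False
                # proportional correction, clamped to the valid dimming range
                levels[f] = min(1.0, max(0.0, levels[f] + gain * err / target_lux[f]))
        if done:
            break
    return levels
```

With a linear toy simulation of 200 lux at full power and a 150 lux target, the loop settles at a dimming level of about 0.75.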
  • various embodiments of the present disclosure relate to a method of selecting at least one light fixture by: obtaining data identifying characteristics of an artwork, obtaining data identifying characteristics of an exposition area, determining a set of light fixtures and/or operating setting for a set of light fixtures as a function of the data identifying characteristics of the artwork and the data identifying characteristics of the exposition area.
  • Possible embodiments of this solution are detailed at the following point "Example 3".
  • a lighting system 100 comprising a control system 130, one or more light fixtures 110, and optionally one or more sensors 120.
  • for a general description of these blocks, reference can be made, e.g., to the description of Figures 2 to 7.
  • the light fixture(s) 110 should be configured to emit light with high quality, i.e. the spectral characteristics of the light should be within given boundaries, e.g. in order to obtain a high CRI.
  • the spectral characteristics of the light emitted by the light fixture are settable/programmable.
  • the spectral characteristics of the light emitted by the light fixture 110 may be varied by varying the brightness of the light emitted by the light sources 117.
  • the control system 130 or a data processing unit 113 of the light fixture 110 may vary the brightness of (i.e. perform a dimming of) the light emitted by the light sources 117 by varying/regulating the average power supply provided to the light sources 117.
  • each light source or set of light sources 117 may be powered via a separate electronic converter 116, a separate switching stage 116h or an additional output stage of the switching stage 116h (see also Figure 7).
  • Figure 25 shows an embodiment, wherein four light sources or sets of light sources 117₁, 117₂, 117₃ and 117₄, such as LEDs or laser diodes, receive a power supply from the same driver 116.
  • the driver 116 is configured as a current generator providing a (regulated) current iout.
  • the light sources or sets of light sources 117₁ ... 117₄ are connected in series between the output terminals of the current generator 116.
  • the average current provided to each light source or set of light sources 117₁...117₄ may be varied via at least one of: a respective electronic switch SW1..SW4 configured to switch on or off the power supply for the respective light source or set of light sources 117₁...117₄, and/or an electronic switch SW5 (see Figure 25).
  • the electronic switches SW1..SW4 and/or the electronic switch SW5 may be driven via respective pulsed drive signals DSW1..DSW5, such as pulse width modulation signals.
  • while Figure 25 refers to the case of a current generator 116, similar dimming operations may also be performed in case the driver 116 is configured as a voltage generator providing a (regulated) voltage Vout (see also Figure 6), e.g. by selectively connecting the light sources 117 to the voltage Vout and/or by controlling the operation of the current regulator 118c.
  • the spectral characteristics of the combined light emitted by the light fixture may be varied by performing an individual dimming operation of the separate light source or set of light sources 117₁...117₄.
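In an averaged view, the individual dimming described above scales the regulated generator current by each channel's PWM duty cycle. This first-order model (ignoring switching transients) and its names are illustrative assumptions.

```python
def channel_currents(i_out_ma, duty_cycles):
    """Average current (mA) through each series channel of Figure 25 when the
    respective switch SW1..SW4 enables it for the given PWM duty cycle (0..1)."""
    return {ch: i_out_ma * d for ch, d in duty_cycles.items()}
```

Choosing different duty-cycle ratios between channels with different spectra is what varies the combined spectral characteristics.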
  • the spectral characteristics of the light emitted by the light fixture(s) 110 may be varied as a function of, e.g. at least one of: requested spectral characteristics 208, e.g. as specified by an artist; data of the viewer, such as the above-mentioned eye-characteristics and/or the Preferred
  • sensor data provided by one or more sensors 120, such as a light sensor configured to monitor the illumination of the artwork 140 and/or the ambient light in the exposition area 160; and maximum illumination values for the artwork 140.
  • control system 130 may send (via the interfaces 131 and 111) a control command CMD to the data processing unit 113 of the light fixture 110.
  • the data processing unit 113 may vary the power supply of one or more light sources 117, e.g. by varying the reference signal of the driver 116 and/or controlling the operation of the switches SW1..SW5.
  • the light fixture(s) 110 should also be configured to provide other features, such as an illumination which only (or at least mainly) illuminates the artwork 140 and/or the possibility to highlight certain aspects of an artwork 140, such as certain colors or parts of the artwork 140 (e.g. heads of persons in a painting).
  • the light fixture 110 may comprise one or more framers 115 configured to limit the illuminated range of the light emitted by the light fixture 110 and/or other optical elements used to focus the light generated by the light sources 117.
  • a framer usually comprises one or more mechanical shades, which are moved into a position where they shade light that would illuminate areas outside of the artwork 140. While framers can easily produce rectangular borders, shading a circular object or an object with random borderlines (e.g. a sculpture) may be a complex task.
  • additional spotlights are used in order to highlight given zones of an artwork 140. Accordingly, one or more additional spotlights may highlight certain aspects of an artwork 140.
  • the light fixture 110 may also be configured to generate one or more drive signals D114 for an actuator 114 of the light fixture 110, such as one or more actuators associated with a framer and/or one or more actuators associated with optical elements 115 used to orientate and/or focus the light generated by the light fixture 110.
  • the processing system 113 may be configured to vary a significant number of parameters of the light fixture in response to the control command CMD received from the control system 130, such as:
  • the settings of the actuators 114 e.g. actuators associated with one or more optical elements 114 and/or one or more sensors 120 integrated in the light fixture 110 and/or the complete light fixture 110, e.g. in order to move (e.g. shift, rotate, pan or tilt) the light fixture 110.
  • the data processing unit 113 may be configured to determine for each light source or set of light sources 117₁ ... 117₄ a relative power supply/brightness level, e.g. as a function of a requested color, such as a requested color temperature.
  • the data processing unit 113 may be configured to determine for each light source or set of light sources 117₁ ... 117₄ an absolute power supply/brightness level as a function of the relative power supply/brightness level and a requested brightness of the light emitted by the light fixture 110.
  • one or more commands CMD may comprise data identifying the requested color and a requested brightness.
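The two-step determination described above (relative colour-mix levels first, then absolute levels from the requested brightness) amounts to a simple per-channel scaling; the channel names are assumptions for illustration.

```python
def absolute_levels(relative_levels: dict, requested_brightness: float) -> dict:
    """Scale per-channel relative levels (colour mix, each 0..1) by the requested
    overall brightness (0..1) to obtain absolute power supply/brightness levels."""
    return {ch: rel * requested_brightness for ch, rel in relative_levels.items()}
```

This keeps the colour point fixed while the overall brightness changes, because all channels are scaled by the same factor.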
  • Figure 26 shows a first embodiment of a light module 118 comprising a plurality of light sources 117.
  • the light sources 117 are arranged on one side of a support, preferably a flat substrate, such as a printed circuit board.
  • the light module 118 is rectangular and the light sources 117 are arranged in a matrix having a given number of rows and columns. For example, in Figure 26 are shown 8 columns and 6 rows.
  • Figure 28 shows a circular light module 118, wherein the light sources 117 are arranged along parallel lines.
  • Figure 29 shows that a first set of light sources may be arranged along a circle and a second set of light sources is arranged along parallel lines.
  • the light module 118 may have any form and the light sources 117 may be arranged in any suitable manner.
  • the light module 118 has preferably an axial symmetric form, such as a rectangular, circular, elliptical form and/or the light sources 117 are arranged (preferably equidistant) along rectilinear lines and/or circle segments.
  • the light sources 117 are selected from the group of: light emitting diodes, including phosphor-conversion LEDs, laser diodes, organic light sources such as OLEDs, or quantum dot based light sources.
  • all light sources 117 are LEDs, preferably mini-LEDs or micro-LEDs.
  • the light sources 117 may comprise a single LED (e.g. a white LED) or a plurality of LEDs (e.g. red, green and blue, or red, green, blue and white).
  • the LEDs may form the pixels of the matrix, wherein each pixel may be controlled individually and wherein each pixel may consist in a single LED or comprise a plurality of LEDs.
  • the light fixture 110 may also comprise one or more optical elements 115.
  • the light fixture 110 comprises a first lens structure 115a configured as a collimator lens and a second lens structure 115b configured to focus the light towards the artwork 140.
  • the lens structure 115a may be implemented with micro-lenses, wherein one or more micro-lenses are arranged in correspondence with each light source 117.
  • the second lens structure 115b may comprise a convex lens mounted at a given distance D from the first lens structure 115a.
  • the distance D may also be variable (e.g. via an actuator 114 configured to move the lens 115b), thereby selectively varying the focal point of the light generated by the light fixture 110.
  • the light module 118 is connected to a driver 116 configured to individually control the power supply of sub-sets of light sources and preferably the power supply of each light source 117, and more preferably (in case the light source 117 comprises a plurality of LEDs) the power supply of each LED.
  • the data processing unit 113 may be configured to control:
  • the data storage device 112 of the light fixture 110 may have stored a data structure 800, such as a look-up table, having stored a plurality of preset configurations, wherein a univocal code is associated with each configuration item.
  • the association may be explicit, e.g. each configuration data item may have stored a respective univocal code, or implicit, e.g. the univocal code may be determined as a function of (e.g. corresponds to) the index of the configuration data item.
  • the data processing unit 113 may comprise a digital processing unit 1130, such as a micro-processor programmable via software instructions, and a temporary memory 1132, e.g. implemented via registers or a Random-Access Memory (RAM).
  • the data processing unit 113 may proceed at a step 804, where the data processing unit 113 waits for a new command CMD. Once a new command CMD has been received, the data processing unit 113 extracts from the command CMD a field comprising a univocal code of a preset configuration, and proceeds to a step 808.
  • the data processing unit 113 retrieves from the data structure 800 the configuration stored for the received univocal code and stores the respective configuration to the memory 1132. Accordingly, the data processing unit 113 may use the data stored in the memory 1132 to drive the driver 116 and optionally the one or more actuators 114. Additionally or alternatively, the data processing unit 113 may also receive at the step 804 data from one or more sensors 120. Accordingly, in various embodiments, the data processing unit 113 may also select a different preset configuration as a function of the sensor data.
  • the data structure 800 may comprise a plurality of preset configuration data items, wherein the light emitted by the light fixture has the same color characteristics but different brightness levels. Accordingly, by selecting a different preset configuration data item, a different brightness level (with the same color characteristics) may be used.
  • the data processing unit 113 verifies at a step 806 whether the received command or the measured data indicate that a different preset configuration has to be loaded from the data structure 800 or whether the data stored to the memory 1132 should be adapted.
  • the data processing unit 113 proceeds to the step 810, where the data processing unit 113 adapts the data stored to the memory 1132.
  • the driver 116 and/or the actuators are driven as a function of the adapted data stored to the memory 1132.
  • the data processing unit 113 may thus return to the step 804 for receiving a new command from the control system 130 or sensor data from a sensor 120.
  • the data processing unit 113 may be configured to: receive a first command requesting the activation of a given preset configuration, such as data identifying for each light source 117 relative power supply data, e.g. in order to implement a color mixing operation or to generate a spotlight with a subset of the light sources 117; optionally receive a second command requesting an adaptation of the data stored to the memory 1132, e.g. in order to set a requested brightness level of the light fixture 110; and optionally receive data from a sensor 120 used to adapt the data stored to the memory 1132, e.g. in order to regulate the brightness level of the light fixture as a function of the actual illumination of an artwork 140 as monitored by a light sensor 120.
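The command handling of steps 804 to 810 can be sketched as a tiny state machine: activating a preset copies it from the data structure 800 into the working memory 1132, and a subsequent adaptation command rescales the working copy without touching the stored presets. The preset codes, channel names and brightness-scaling adaptation are illustrative assumptions.

```python
# Hypothetical data structure 800: univocal codes mapped to preset configurations.
PRESETS_800 = {
    "P1": {"levels": {"117_1": 1.0, "117_2": 0.4}, "cct_k": 3000},
    "P2": {"levels": {"117_1": 0.2, "117_2": 1.0}, "cct_k": 4000},
}

class ProcessingUnit113:
    def __init__(self):
        self.memory_1132 = None  # working copy of the active preset

    def handle_command(self, cmd):
        if cmd["type"] == "activate":       # step 808: load preset by univocal code
            self.memory_1132 = dict(PRESETS_800[cmd["code"]])
        elif cmd["type"] == "adapt":        # step 810: adapt the working copy only
            scale = cmd["brightness"]
            self.memory_1132["levels"] = {
                ch: lvl * scale for ch, lvl in self.memory_1132["levels"].items()
            }
        return self.memory_1132
```

Sensor data could feed the same `adapt` path, e.g. to regulate the brightness towards a measured illuminance.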
  • the data structure 800 is stored in a non-volatile memory.
  • the data structure 800 may be stored in a Read-Only Memory (ROM).
  • however, a programmable non-volatile memory is preferable.
  • the control system 130 and the data processing unit 113 may be configured to perform an update operation of the preset configurations 800.
  • the control system 130 may: access a database, such as the light fixture database 202, having stored a plurality of preset configurations and the control system 130 may store via the data processing unit 113 only a selected subset of the preset configurations to the data structure; and/or determine one or more parameters of the preset configuration as a function of at least one of: o requested illumination characteristics, or a sequence thereof, e.g. as stored in the artist illumination database 208; o the characteristics of the artwork 140 to be illuminated, e.g. as stored in the artwork database 206 and/or as measured via one or more sensors, o the characteristics of the exposition area, e.g. as stored in the exposition area database 204 and/or as measured via one or more sensors, o data identifying characteristics of the viewer, e.g. as stored in the viewer’s eye database 210.
  • the requested illumination characteristics are used to determine a plurality of preset configurations necessary to obtain the requested illumination characteristics.
  • the sensor data and/or the viewer’s eye characteristics are used by the data processing unit 113 to adapt the preset configuration data in order to obtain the requested illumination (as a function of the sensor data) and/or to obtain the requested perceived illumination (as a function of the viewer’s eye characteristics).
  • a single preset configuration data item may be stored to the data structure 800 (identifying a requested illumination) and the variation of the illumination may be performed by varying the data transferred to the memory 1132.
  • the preset configuration data may also take into account the shape and/or dimension of the object 140 to be illuminated. For example, knowing the shape and dimension of the object 140 to be illuminated, the expected illumination of the object 140 may be determined (e.g. as a function of the optical characteristics of the light fixture 110 and the distance from the object 140), and the preset configuration data may indicate that given light sources 117 should be switched off (power supply disabled), thereby illuminating only the object, such as a painting or a statue. For example, for this purpose a sensor, such as a camera, may monitor directly the illumination of the artwork 140.
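A "virtual framer" of the kind described above can be sketched as a mask over the pixel matrix: pixels whose beam would fall outside the object's projected outline are disabled. The predicate and names are assumptions; a real implementation would derive the outline from the fixture optics and the object's shape data.

```python
def framer_mask(rows, cols, inside):
    """Build an on/off mask for the light-source matrix; inside(r, c) -> bool
    tells whether pixel (r, c) illuminates the object 140."""
    return [[1 if inside(r, c) else 0 for c in range(cols)] for r in range(rows)]
```

For example, a circular object on a 5x5 matrix keeps the central pixels on while the corner pixels are switched off.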
• the control system 130 may also obtain data identifying the shape and/or dimension of the object 140 to be illuminated. For example, these data may be obtained via a camera or by reading respective data from the artwork database 206.
  • each artwork 140 may have associated a univocal (artwork) code, such as a QR code applied to the artwork 140, and a reader device may be used to obtain the univocal code.
• the control system 130 may obtain the data identifying the shape and/or dimension of the object associated with the univocal (artwork) code.
  • any other method described in the foregoing may be used to obtain data associated with an artwork 140, such as image recognition or via a user interface.
  • the control system 130 may transmit the selected set of preset configurations to the light fixture 110 for storage into the data structure 800.
  • the preset configuration data may also be obtained/determined with the previously described tool used to determine the configuration of light fixtures.
  • this tool may be used to determine requested illumination settings for the light fixtures 110 installed in and/or to be installed in the exposition area 160.
• the tool may determine the preset configuration data to be stored to the data structure 800 of the light fixture(s), and data identifying the control commands CMD to be sent by the control system 130 to the light fixture(s), e.g. the univocal code (or sequence thereof) to be sent to the light fixture(s) 110.
• the preset configuration data stored to the data structure 800 may include a plurality of preset configuration data items, wherein each preset configuration data item identifies one or more of: requested illumination data for the light sources 117 associated with one or more specific artworks and/or respective artists and/or epochs; a global color temperature or local color temperatures for each light source or subset of light sources 117; data identifying whether given light sources 117 are switched on or off, e.g. in order to implement a virtual framer or gobo, wherein the transition between the activated and the deactivated light sources 117 may be abrupt (e.g. one pixel is switched on, and an adjacent pixel is switched off) or smooth (e.g. a given first pixel is switched on and a given second pixel is switched off, and wherein one or more intermediate pixels have a reduced intensity); and data identifying a set of light sources 117 to be supplied with a higher power supply, thereby generating highlighted areas of the artwork 140.
  • each preset configuration data item may comprise a respective parameter indicating a requested power supply or brightness level for each light source or set of light sources 117.
  • each preset configuration data item may comprise 48 power supply parameters.
  • the light module 118 should comprise a significant number of light sources, such as at least 1000 light sources.
  • each preset configuration data item may be stored in several ways in the data storage device 112.
• the data associated with each light source 117 may be stored as pixel data, wherein each pixel is associated with a given horizontal and vertical position of the module.
  • the pixel data may comprise data identifying the intensity/power supply for the respective LED, for example in a range between 0 and 255.
  • the pixel data may comprise: data identifying the intensity/power supply for each of the LEDs; or data identifying the intensity/power supply for the complete set of LEDs, and the color of the light to be emitted by the set of LEDs, which thus may be used to calculate the intensity/power supply for each of the LEDs.
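As an illustration, the pixel-oriented preset data described above could be modeled as follows; the class and field names are illustrative assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class PixelData:
    """One pixel of a preset configuration: a position in the module's
    matrix and an intensity/power supply level in the 0..255 range
    (field names are illustrative)."""
    row: int
    col: int
    intensity: int  # 0..255, as in the range described above

@dataclass
class PresetConfiguration:
    """A preset configuration data item holding a collection of pixel data
    and the univocal code used to select it (illustrative sketch)."""
    code: str
    pixels: list = field(default_factory=list)
```

A preset could then be built as `PresetConfiguration("artwork-42", [PixelData(0, 0, 255)])`, where the code string is a hypothetical identifier.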
  • each preset configuration data item may also comprise data specifying a requested illumination pattern, such as pixel data specifying the color and brightness of a matrix of pixels, wherein this illumination matrix may also have a different resolution than the matrix of the light sources 117.
  • the illumination matrix may be represented by an image, such as an RGB image, wherein each pixel specifies the requested intensity and color of the illumination of a given area of the artwork 140.
  • the data processing unit 113 may be configured to map the illumination matrix on the light module 118 by scaling the image according to the distance of the artwork from the light fixture 110 and the focal distance (or more generally the spatial radiation characteristics) of the light fixture 110, which could also be variable, e.g. in order to generate a light beam having approximately the dimension of the artwork 140 to be illuminated.
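The mapping just described can be sketched as follows, assuming a simple conical beam model and nearest-neighbour resampling; the function names and the geometry model are illustrative assumptions, not part of the disclosure.

```python
import math

def beam_diameter(distance_m: float, beam_angle_deg: float) -> float:
    """Approximate diameter of the light spot on the artwork for a given
    distance and beam angle (simple conical beam model)."""
    return 2.0 * distance_m * math.tan(math.radians(beam_angle_deg) / 2.0)

def resample(matrix, out_rows: int, out_cols: int):
    """Nearest-neighbour resampling of an illumination matrix (list of
    lists of intensity values) to the resolution of the LED matrix of
    the light module, which may differ from the image resolution."""
    in_rows, in_cols = len(matrix), len(matrix[0])
    return [[matrix[r * in_rows // out_rows][c * in_cols // out_cols]
             for c in range(out_cols)]
            for r in range(out_rows)]
```

For example, a fixture 2 m from the artwork with a 60° beam angle produces a spot of roughly 2.3 m diameter, and a 2×2 illumination image can be expanded to a hypothetical 4×4 LED matrix with `resample`.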
• the distance between the object 140 and the light fixture 110 may be entered manually or detected automatically, e.g. via a distance sensor.
  • the data processing unit 113 may also take into account the mounting height and inclination/orientation of the light fixture with respect to the artwork 140.
  • the mapping of the light sources 117 on the artwork 140 may also be determined automatically via the light fixture 110.
• the data processing unit 113 may be configured to switch on given light sources 117 or sets of light sources 117, and monitor, e.g. via a camera, the illumination of the artwork 140, thereby permitting each light source 117 to be associated with a given area of the artwork 140.
• the data processing unit 113 may also perform a plurality of iterations, wherein the data processing unit 113 also drives one or more actuators 114 of the light fixture 110 in order to regulate the focal distance of the light fixture 110, e.g. in order to obtain the setting of the optics wherein the dimension of the light beam generated by the light module 118 corresponds (approximately) to the dimension of the artwork 140.
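The camera-based mapping loop could look like the following sketch, where the switching and capture callbacks stand in for the (unspecified) driver-circuit and camera interfaces; all names are illustrative assumptions.

```python
def map_sources_to_areas(sources, switch_on, switch_off, capture_lit_area):
    """Associate each light source with the artwork area it illuminates by
    switching the sources on one at a time and observing the camera image.
    `switch_on`, `switch_off` and `capture_lit_area` are assumed callbacks
    to the driver circuit and the monitoring camera."""
    mapping = {}
    for src in sources:
        switch_on(src)
        # e.g. the bounding box of the bright region seen by the camera
        mapping[src] = capture_lit_area()
        switch_off(src)
    return mapping
```

In practice the capture step would threshold the camera image to find the lit region; here that detail is left to the callback.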
  • the data processing unit 113 may calculate the requested power supply parameters for the light sources 117. Accordingly, in this case, the memory 1112 may comprise the original illumination image data or already the calculated power supply parameters for the light sources 117.
  • the use of such an illumination matrix/image of requested illumination value has the advantage that a light artist does not need to know the characteristics of the light fixture 110, but has only to specify the requested illumination profile for the artwork 140, independently from the resolution of the light matrix.
• a sequence of images, such as a film, could be used.
  • the requested illumination data may be stored with conventional picture or video formats, such as GIF, PNG, MPEG, preferably with lossless data compression.
  • these preset configuration data relate only to a base illumination, which may be adapted via control commands CMD and/or as a function of sensor data.
• the dimension and/or the position of a highlighted area may be changed by adapting dynamically at the step 810 the power supply parameters.
  • the control system 130 may send a sequence of commands for adapting the parameters stored to the memory 1112 in order to change the dimension of and/or move the highlighted area according to a predetermined sequence.
  • the control system 130 and/or the data processing unit 113 may receive a user input, such as gestures detected via a camera or commands received via a visitor’s smartphone.
  • a human or automatic guide could explain certain features of an object 140, and to show a respective feature, the guide could move the highlighting spot across the object 140.
  • the light spot could move gradually, or it could jump from one feature to the next, simply by reducing the intensity of the respective pixels at the first spot and increasing the intensity of the respective pixels at the second spot.
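A gradual move of the highlight spot, by reducing the intensity of the pixels at the first spot while increasing it at the second, might be sketched as follows; the frame representation (a dict of pixel index to highlight intensity) is an illustrative assumption.

```python
def crossfade(frame, spot_a, spot_b, steps=10):
    """Generate a sequence of intensity settings that fade the highlight
    out at the pixels in `spot_a` while fading it in at the pixels in
    `spot_b`. `frame` maps pixel indices to their full highlight
    intensity; intermediate frames blend linearly."""
    frames = []
    for k in range(steps + 1):
        t = k / steps
        f = dict(frame)
        for px in spot_a:
            f[px] = round(frame[px] * (1.0 - t))  # fade out first spot
        for px in spot_b:
            f[px] = round(frame[px] * t)          # fade in second spot
        frames.append(f)
    return frames
```

Each returned frame could be written in turn to the memory 1112 (or an equivalent buffer) to animate the transition.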
• a viewer could move the highlighting spot across the object 140 via a remote control, such as the visitor’s smartphone having installed a suitable application, via gesture recognition using a sensor (e.g. a camera) connected to the light fixture 110 or the control system 130, or by interacting with some user interface, such as buttons.
• the parameters stored to the memory 1112 may be varied in order to change the color of given areas of the artwork. For example, in some older artworks, certain colors may have faded strongly. Increasing their intensity or luminosity may help to counteract the fading effect, i.e. the color could be perceived as strong as originally intended by the artist. For instance, the color yellow has faded strongly on paintings by van Gogh, the color red on other old paintings and textiles. Increasing the intensity of yellow or red on these paintings by illuminating the painting with a yellowish or reddish color thus permits the yellow or red areas to be perceived at their original intensity.
• the light module 118 described herein permits concentrating the increased yellow or red illumination only on given areas, while the remaining areas may be illuminated with white light (e.g. light having a color temperature along the Planck curve).
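A hypothetical sketch of such a selective color correction, assuming a simple per-pixel RGB illumination model and a fade mask marking the faded areas; the boost factor and all names are illustrative assumptions.

```python
def compensate_fading(pixel_rgb, fade_mask, boost=1.3):
    """Increase the yellow content (red + green channels) of the
    illumination in the faded areas only; the other areas keep their
    white illumination. Values are clamped to the 0..255 range."""
    out = []
    for (r, g, b), faded in zip(pixel_rgb, fade_mask):
        if faded:
            r = min(255, round(r * boost))
            g = min(255, round(g * boost))
        out.append((r, g, b))
    return out
```

The same structure would work for a red-only boost by scaling just the first channel.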
  • the previously described solution may also be applied when a plurality of light fixtures 110 illuminate the same artwork 140.
  • control system 130 or a data processing unit 113 of a light fixture acting as a master device may determine the illumination data for each light fixture 110 as described in the foregoing (in particular with respect to the association of the light sources 117 to given areas of the artwork, and the power supply parameters for the light sources 117) and then regulate the intensity of light emitted by each light fixture 110 in order to obtain the requested light intensity.
  • each of the light fixtures 110 could be used to illuminate a given sub-area of the artwork 140.
  • the illumination either needs to be well aligned so that no dark border between the illuminated areas on the object exists, or the light fixtures can produce illumination areas which overlap.
  • the control system 130 or the data processing unit 113 may monitor the intensity of the light in the overlapping area and may reduce the power supply of the light sources 117 (of the light fixtures 110) associated with this overlapping area such that the overall intensity of the illumination is homogenous and equal.
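The intensity reduction in the overlapping area could be sketched as follows, assuming per-pixel intensity lists for the two fixtures and a target level for the summed illumination; this is a sketch of the homogenization idea, not the disclosed implementation.

```python
def blend_overlap(intensity_a, intensity_b, target):
    """For each pixel illuminated by two fixtures, scale both
    contributions so that their sum in the overlapping area equals the
    target level, keeping the overall illumination homogeneous."""
    blended = []
    for a, b in zip(intensity_a, intensity_b):
        total = a + b
        if total > 0 and total != target:
            scale = target / total
            a, b = a * scale, b * scale
        blended.append((a, b))
    return blended
```

For example, two fixtures each contributing 100 units in the overlap would both be scaled down to 50 for a target of 100.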
  • the control system 130 or the data processing unit 113 may receive a user input or monitor the illumination of the artwork 140, e.g. via a camera.
  • the light fixtures 110 could be used to implement different tasks, e.g. one or more first light fixtures 110 could be used to provide the basic illumination of the artwork 140 and one or more second light fixtures could be used to implement dynamic effects (step 806), correct the color of the illumination or highlight given areas of the artwork 140.
  • various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a light fixture comprising a plurality of light sources, a driver circuit configured to provide an individually controllable power supply to each of the light sources as a function of one or more control signals, a data storage device having stored at least one preset configuration data item, and a data processing unit comprising a memory.
  • the method comprises: reading a preset configuration data item from the data storage device and storing the preset configuration data item into the memory; and generating the one or more control signals as a function of the configuration data stored to the memory.
• the light fixture 110 comprises a driver circuit 116 configured to provide a regulated power supply, such as a regulated current i out, to a light module 118 comprising one or more light sources 117, such as LEDs or laser diodes.
• the light fixture 110 comprises a power supply circuit 900 configured to provide a DC voltage V bus.
• the power supply circuit 900 may comprise the previously described circuits 116e, 116f and 116g configured to generate the (preferably regulated) voltage V bus based on an AC input voltage V in,AC received via two input terminals 116a and 116b.
  • the driver circuit 116 comprises a regulated voltage or preferably current source 902 configured to provide via output terminals 116c and 116d a regulated voltage V out or preferably a regulated current i out to a light module 118 comprising one or more light sources 117, such as LEDs or laser diodes.
• the regulated voltage or current source 902 may comprise: a switching stage 116h and an optional output filter 116i; a feedback circuit 116k configured to provide a feedback signal FB116; and a control circuit 116m configured to generate one or more drive signals DRV116 for the switching stage 116h as a function of the feedback signal FB116.
• the feedback circuit 116k is configured to provide a feedback signal FB116 indicative of (and preferably proportional to) the output quantity to be regulated, such as the instantaneous or average value of the output voltage V out or output current i out.
• for example, the feedback circuit 116k may comprise a voltage measurement circuit configured to monitor the output voltage V out.
  • Figure 32 shows a current sensor configured to monitor the output current i out.
• various solutions are known for monitoring the output voltage V out or output current i out in an electronic converter. Usually, these solutions use a voltage or current sensor connected to the output of the switching stage 116h or the output terminals 116c, 116d.
• the value of the output voltage V out or output current i out is estimated based on one or more voltage or current sensors configured to measure signals at the input of the switching stage 116h or intermediate signals within the switching stage 116h, such as the current received at the input terminals of the switching stage 116h or the current flowing through a component of the switching stage, such as a current flowing through an inductive element L116 of the switching stage, e.g. the primary or secondary winding of a transformer of the switching stage 116h.
• the control circuit 116m may be configured to generate the one or more drive signals DRV116 in order to regulate, e.g. via a regulator circuit, the switching activity of the switching stage 116h until the feedback signal FB116 corresponds to a requested value, e.g. indicative of a requested current i ref to be provided to the lighting module 118.
• the regulator circuit comprises at least one of: a proportional (P), low-pass filtering (PT1), integral (I), or derivative (D) component.
• such a regulator circuit may be implemented with an active controller, e.g. implemented with an operational amplifier and at least one dynamics-compensating feedback branch having proportional, low-pass filtering, integral, and/or derivative characteristics.
  • At least part of the regulator circuit may also be implemented via software instructions executed by the data processing unit 113.
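A minimal sketch of such a software-implemented regulator, here with proportional and integral components only; the class name, gains and sampling period are illustrative assumptions, not taken from the disclosure.

```python
class PIRegulator:
    """Minimal discrete PI regulator, sketching how part of the regulator
    circuit could run as software on the data processing unit 113."""

    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, setpoint: float, feedback: float) -> float:
        """One control step: returns the actuation value for the
        switching stage given the requested and measured quantity."""
        error = setpoint - feedback
        self.integral += error * self.dt  # integral (I) component
        return self.kp * error + self.ki * self.integral
```

A derivative or PT1 term could be added in the same way; the structure mirrors the P/PT1/I/D components listed above.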
  • the circuit 902 is a closed-loop regulated voltage or current source.
  • the circuit 902 implements a current generator.
• the data processing unit 113 may thus set the value of the reference signal i ref indicative of (and preferably proportional to) the requested output current in order to control the output current i out.
• the reference signal i ref may also be a voltage signal, e.g. in case the current sensor 116k, such as a shunt resistor, provides a voltage signal proportional to the output current i out.
• the data processing unit 113 may set the reference signal i ref as a function of a control command CMD received from a control system 130 and/or one or more signals received from one or more sensors 120, such as light sensors, temperature sensors, etc.
  • Figure 33 shows a possible embodiment of a block diagram of the data processing unit 113.
• the driver circuit 116 comprises again a regulated current generator 902 configured to regulate a signal FB1 indicative of (and preferably proportional to) the output current i out to a requested value i ref.
• the signal FB1 is generated by a current sensor 116k1, such as a shunt resistor Rs connected in series with the output terminals 116c and 116d.
• the data processing unit 113 is configured to determine a signal indicative of a requested light flux Φ ref.
• a module 906, such as an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130 (see also Figure 30), may receive via the communication interface 110 a command CMD from the control system 130, such as a central control system of a complex lighting system 100, a smartphone or even only a dimmer.
  • the command CMD may be used to load a preset configuration data item from the dataset 800.
• the signal indicative of a requested light flux Φ ref is provided to a second module 908, such as an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130.
• the module 908 is configured to generate the signal i ref as a function of the signal indicative of a requested light flux Φ ref.
• the requested light flux Φ ref may refer to a requested brightness value of the light emitted by the light fixture 110 or a plurality of requested brightness values for a plurality of light sources 117 or sets of light sources 117.
  • such requested brightness values for a plurality of light sources 117 or sets of light sources 117 may be determined as a function of a global requested brightness value and requested color information, such as a requested color temperature.
• the data identifying the requested light flux Φ ref may be used to set the reference value i ref for supplying a given set of light sources 117 with a given current i out, wherein a relative dimming of the light sources 117 within the set of light sources 117 may still be performed by controlling the current flow through the light sources (see e.g. the description of Figure 25).
• the digital processing unit 1130 may store the requested power supply data, which may be used to determine or may also directly include the reference signal i ref, to the memory 1132.
• the light flux is a function (at least) of the current i out flowing through the LEDs, the voltage V out at the LEDs and the temperature of the LEDs.
• the light system 100, e.g. directly the light fixture 110, comprises a light sensor 120a configured to generate a signal indicative of a light flux Φ generated by the light sources 117.
  • a plurality of light sensors 120a may be used and/or the light sensor 120a may also provide color information.
• the module 908 may be configured to vary/regulate the reference signal i ref, e.g. via a PID regulator circuit, such that the signal indicative of a light flux Φ corresponds to the signal indicative of a requested light flux Φ ref, e.g. by adapting the data stored to the memory 1132.
  • Figure 34 shows an embodiment, wherein the light fixture 110 comprises a temperature sensor 120b configured to generate a signal indicative of the temperature T of the light sources 117.
• the driver circuit 116 comprises a second feedback circuit 116k2 configured to provide a signal FB2 indicative of (and preferably proportional to) the output voltage V out.
• the module 908 may have stored a look-up table or a mathematical function which determines the value of the signal i ref as a function of the signal indicative of a requested light flux Φ ref, the feedback signal FB2 and the signal indicative of the temperature T of the light sources 117.
  • Figure 37 shows in this respect an embodiment of the operation of the data processing unit 113.
  • the light fixture 110 and the respective light sources 117 are calibrated at a step 922.
• the light flux Φ is measured for a plurality of operating settings, such as for different values of the reference signal i ref and (if supported) by setting different relative dimming levels for the light sources 117.
• the measured light flux Φ may refer not only to the brightness but also to the color.
• the calibration phase 922 associates a given reference signal i ref with a respective light flux Φ.
• the light flux Φ often depends also on the voltage V out at the light sources 117 (signal FB2) and the temperature T of the light sources 117. Accordingly, also these data may be monitored at the step 922 in order to generate a model of the light sources 117. Generally, respective data may also be derived from the datasheet of the light source(s) 117. Thus, at the end of the calibration phase 922, a model, e.g. a look-up table, may be stored in the module 908, which associates with each combination of requested light flux Φ ref, feedback signal FB2 and signal indicative of the temperature T of the light sources 117 a respective reference signal i ref.
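The look-up-table model of the module 908 could be sketched as follows, with a sparse calibration table and nearest-key matching; the table layout and the distance metric are illustrative assumptions.

```python
def i_ref_from_model(phi_ref, v_out, temp_c, lut):
    """Look up the current setpoint i_ref for a requested light flux,
    output voltage and LED temperature. `lut` maps calibration points
    (phi_ref, v_out, temp_c) to measured i_ref values; the nearest
    calibration point (by a simple L1 distance) is used."""
    key = min(lut, key=lambda k: (abs(k[0] - phi_ref)
                                  + abs(k[1] - v_out)
                                  + abs(k[2] - temp_c)))
    return lut[key]
```

A real implementation would likely interpolate between calibration points rather than snap to the nearest one; nearest-key matching keeps the sketch short.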
• the module 908 sets at a step 924 the reference signal i ref as a function of data identifying a requested light flux Φ ref.
• the reference signal i ref is then used by the driver 116 as the current setpoint for the output current i out.
• the module 908 may also adapt at a step 926 the reference signal i ref for a given requested light flux Φ ref.
• LEDs or laser diodes usually have a higher efficiency, i.e. a higher lumen output at a certain current, when the junction is cold, while in normal operation, when the junction has become hot, the lumen output decreases for the same current i out.
• the module 908 may use the measured light flux Φ in Figure 33, and the output voltage V out and the temperature T in Figure 34.
  • the desired illumination intensity might not be reached as a light source 117, e.g. an LED, is damaged or the intensity of the light source 117 is degraded due to ageing.
• the light intensity could however exceed a certain threshold value, e.g. due to a malfunction in the control system 130, the module 908 or a short circuit in the wiring. This can be the case e.g. for laser-based light sources 117, where a too high laser intensity could damage the eyes of a user.
  • artworks 140 often have a given maximum irradiation threshold value.
• the embodiment shown in Figure 33 performs a control of the light flux Φ. Accordingly, a degradation of the light sources 117 may be compensated via the feedback of the light sensor 120a. Similarly, the light sensor 120a may provide indications of an incorrect operation of the light sources 117, e.g. an excessive or insufficient intensity of the light emitted by the light sources 117 for a given reference value i ref.
  • the module 908 may also consider degradation of the light sources 117 in the model of the light sources 117, but a malfunction may not be detected easily.
  • the driver circuit 116 may thus implement an overcurrent and/or overvoltage protection function.
  • the driver circuit 116 may comprise an electronic switch 904 configured to selectively disable the power supply of the voltage or current source 902, such as an electronic switch connected between the power supply circuit 900 and the voltage or current source 902, or an electronic switch of the switching stage 116h.
• the voltage or current source 902, e.g. the control circuit 116m, may be configured to disable the power supply when the voltage V out and/or the current i out exceeds a respective threshold value.
  • the threshold value may be determined as a function of the maximum light flux value.
• when using laser diodes, the threshold should be set to a current value that limits the light output to a given maximum value when the junction is cold, thereby avoiding damage to the eyes of users.
• the start-up phase with high intensity is usually short (e.g. several seconds) and the threshold value should be set to a current value that limits the light output when the junction is hot.
  • artworks may have different maximum illumination thresholds.
  • a fixed maximum overcurrent threshold may be suitable from an electrical safety point of view, but usually is not sufficient to ensure a maximum light flux.
  • an overcurrent protection may not always ensure that malfunctions in the driver circuit 116, in particular the control circuit 116m, are handled correctly.
• the module 908 generates again at a step 926 a reference signal i ref, e.g. as a function of a requested light flux Φ ref, and optionally a measured light flux Φ, or the output voltage V out and/or the temperature T. Accordingly, in the embodiment considered, the module 908 provides the setpoint i ref for the constant current regulator 902.
• the regulated current generator 902 measures the current i out and regulates it to the desired value i ref. This should assure a correct illumination during a normal operation of the light fixture 110.
• the signal i ref is also provided to a module 910, which also receives a signal indicative of (and preferably proportional to) the output current i out.
• the module 910 may be connected to the current sensor 116k1 or to an additional current sensor.
• the signal fed to the module 910 is indicative of (and preferably proportional to) the average value of the output current i out.
• the module 910 may receive a low-pass filtered version of the signal FB1.
  • the module 910 may be an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130.
• the module 910 is configured to compare at a step 930 the measured value, e.g. the signal FB1, and the requested value i ref independently from the constant current regulator 902.
• the module 910 is configured to determine at a step 928 a lower threshold and an upper threshold as a function of the value of the requested value i ref.
• the module 910 is configured to calculate the upper threshold and the lower threshold by adding a given percentage of the value i ref to the value i ref and subtracting a given percentage of the value i ref from the value i ref, respectively.
  • the given percentage may be between 5 % and 20 %, e.g. 5, 10 or 20 %.
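The threshold computation of step 928 and the comparison of step 930 can be sketched directly; the function names are illustrative assumptions.

```python
def current_window(i_ref: float, pct: float = 10.0):
    """Lower and upper current thresholds as i_ref minus/plus a given
    percentage of i_ref (step 928); 10 % is within the 5-20 % range
    mentioned above."""
    delta = i_ref * pct / 100.0
    return i_ref - delta, i_ref + delta

def check_current(i_meas: float, i_ref: float, pct: float = 10.0) -> bool:
    """True while the measured current stays inside the window
    (verification of step 930); False indicates a malfunction."""
    lo, hi = current_window(i_ref, pct)
    return lo <= i_meas <= hi
```

A `False` result would correspond to output "Y" of step 930, i.e. raising the error signal of step 932.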
• if the module 910 detects that the measured value, e.g. the signal FB1, is smaller than the lower threshold or greater than the upper threshold (output “Y” of the verification step 930), the module 910 generates at a step 932 an error signal indicating a malfunction of the light fixture 110 and the procedure terminates at a stop step 934.
  • this error signal may be used to disable the power supply of the current regulator 902, e.g. via the electronic switch 904.
• the module 910 may send a warning signal to the control system 130, optionally including also data identifying the value of the measured value, e.g. the signal FB1, and/or the reference signal i ref.
• the module 910 may be configured to provide the measured value and/or the reference signal i ref to the control system 130, e.g. periodically or in response to a given command CMD received from the control system 130.
• otherwise, the module 910 does not signal an abnormal behavior and returns to the step 924.
• Figure 36 shows an embodiment, wherein the reference signal i ref is not fed directly to the current generator 902.
• the module 908 generates again a reference signal i ref.
• the reference signal i ref is provided to the module 910, which generates a reference signal i ref′ for the current generator 902.
• the module 910 is configured to set the signal i ref′ to the value of the signal i ref.
  • the module 910 determines at the step 928 the upper and lower thresholds as described in the foregoing.
  • the module 910 may again generate at the step 932 an error signal indicating an abnormal behavior of the light fixture 110.
• the module 910 may adjust at a step 936 the value of the signal i ref′ in order to regulate the measured value to the requested value i ref.
  • the module 910 implements a further closed control loop in addition to the control loop of the current regulator 902.
  • this additional control loop may be exposed to malfunctions.
  • the data processing unit 113 may comprise a “watchdog” circuit 912.
  • a watchdog circuit 912 may be reset periodically via the module 910, e.g. at the step 936. If the watchdog detects a fault from the DPU, e.g. the watchdog circuit 912 is not reset within a given period of time, the watchdog circuit 912 may set the error signal indicative of an abnormal operation, which, e.g. , disables (e.g. via the electronic switch 904) the power supply of the current regulator 902.
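A software model of this watchdog behavior, with illustrative timing values, might look like the following sketch; a real watchdog circuit 912 would be a hardware timer, and the fault action (e.g. opening the electronic switch 904) is left to the caller.

```python
import time

class Watchdog:
    """Software model of the watchdog circuit 912: if it is not reset
    within `timeout` seconds, it reports a fault, which could then
    disable the power supply of the current regulator 902."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_reset = time.monotonic()

    def reset(self):
        """Called periodically by the module 910, e.g. at the step 936."""
        self.last_reset = time.monotonic()

    def faulted(self) -> bool:
        """True if the watchdog was not reset within the timeout."""
        return time.monotonic() - self.last_reset > self.timeout
```

Using a monotonic clock avoids spurious faults if the system clock is adjusted while the fixture is running.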
• the data processing unit 113 implements three functions: the light output control function 906, the light flux regulation function 908 and the additional (e.g. software) current loop function 910.
• the light output control function 906 provides data identifying a requested light flux Φ ref of the light sources 117.
• the requested light flux Φ ref may be determined as a function of a requested and/or maximum illumination of an artwork.
• the light flux regulation function 908 translates the requested light flux Φ ref into a current setting i ref, possibly taking into account further data provided by sensors 116k2, 120a and/or 120b.
• the module 908 may adapt the reference signal i ref in order to regulate the light flux Φ.
• the module may adapt the reference signal i ref as a function of the junction temperature T (see Figure 34) or according to a given time profile implicitly indicating the heating curve of the junction temperature, thereby compensating the different light flux during the heating phase of the light sources 117.
• the current loop function 910 provides the current setting i ref′ to the driver 116 as a setpoint, compares redundantly the target i ref or i ref′ with the real value, e.g. FB1, of the current i out and decides which measures to take in case of discrepancy.
• the verification of the output current i out may be performed via a digital processing unit 1130, e.g. via respective software instructions executed by a microprocessor. Accordingly, a digital sample of the output current i out, e.g. the feedback signal FB1, has to be obtained (and if required also of the other signals provided by the sensors 116k2, 120a and 120b). Accordingly, the data processing unit 113 may also comprise an analog-to-digital converter 914 configured to provide a digital sample indicative of (and preferably proportional to) the output current i out. Similarly, in case the regulated current generator 902 is configured to receive an analog reference signal i ref, e.g. because the control circuit 116m is an analog control circuit, the data processing unit 113 may comprise a digital-to-analog converter 916 configured to receive a digital reference signal and provide a corresponding analog reference signal i ref.
• such an analog-to-digital converter 914 may also be low-speed, e.g. operated with a sampling frequency smaller than 1 kHz, preferably between 1 and 100 Hz, e.g. between 1 and 20 Hz.
• various embodiments of the present disclosure relate to a method of operating a light fixture comprising a light module comprising one or more light sources, a power supply circuit configured to provide a DC voltage, a regulated current generator configured to provide an output current to the one or more light sources as a function of a reference signal, a current sensor configured to provide a first measurement signal indicative of the output current, and a data processing unit operatively connected to the regulated current generator and the current sensor.
• the method comprises executing the following steps via the data processing unit: setting the reference signal as a function of data identifying a requested illumination to be generated by the one or more light sources; determining an upper and a lower current threshold as a function of the reference signal; obtaining the first measurement signal; and generating an error signal when the first measurement signal is smaller than the lower current threshold or greater than the upper current threshold.
  • the light fixture 110 may include optics 115 comprising one or more optical elements.
  • the optics 115 may include one or more of the following optical elements: a lens and/or reflector, means for reducing glare, a diffuser or diffusive layer, optical filters for color changing, and/or a framer or shutter.
  • Figures 40A, 40B and 40C show possible embodiments of the optics 115.
  • the light module 118 (comprising one or more light sources 117, such as LEDs) emits light.
  • the brightness and/or the spectral characteristics of the light emitted by the light module 118 may be controllable.
  • the brightness and/or the spectral characteristics of the light emitted by the light module 118 may be varied globally or locally. Accordingly, in general, the light emitted by the light module 118 has a given beam pattern.
  • the light emitted by the light fixture 110 should have a given beam angle or even a requested beam pattern.
  • the requested beam angle/beam pattern may depend on the application needs. For example, a spotlight should have a small beam angle.
  • a light fixture 110 comprises one or more optical elements 115 for varying the beam pattern of the light emitted by the light module, or even by each light source 117, in order to obtain a requested beam pattern, e.g. by focusing or expanding the light emitted by the light module 118 in order to obtain a requested beam angle.
  • Figure 40A shows a first embodiment.
  • the light fixture 110 comprises a first optical element 115a configured to focus the light generated by the light module 118.
  • the first optical element 115a may be configured to generate substantially parallel light rays.
  • the first optical element 115a may comprise one or more of: a reflector for the light emitted by the light module 118; a micro-reflector structure, wherein a micro-reflector is arranged in correspondence with each light source 117 or a set of light sources 117; a collimator lens or lens structure arranged in front of the light module 118; and a micro-lens structure, wherein one or more micro-lenses are arranged in correspondence with each light source 117.
  • the light fixture 110 comprises also a second optical element 115b mounted at a given distance D from the first lens structure 115a.
  • the second optical element 115b may comprise a second reflector and/or a second lens structure configured to focus or expand the light provided by the first optical element 115a in order to obtain the requested beam angle.
  • the second optical element 115b may comprise a convex lens or lens structure.
  • the distance D may also be variable (e.g. via an actuator 114 configured to move the second optical element 115b), thereby selectively varying the focal point (and thus the beam angle) of the light generated by the light fixture 110.
  • the distance between the light module 118 and the first optical element 115a could be variable and e.g. controlled by a further actuator 114.
• Figure 40B shows an embodiment, wherein the optics 115 comprise an optical element 115c, wherein the optical element 115c comprises a framer, shutter and/or gobo.
  • the optical element 115c may be arranged along the light path between the light module 118 and the artwork 140, e.g.
  • one or more elements of the optical element 115c may be variable, and e.g. controlled by a further actuator 114, such as the vertical and/or horizontal aperture of a framer.
  • Figure 40C shows an embodiment, wherein the optics 115 of the light fixture 110 comprises also an optical element 115d, wherein the optical element 115d comprises a neutral density filter, diffuser, diffusive layer, and/or optical filters for color changing.
• the optical element 115d may be arranged along the light path between the light module 118 and the artwork 140, e.g.:
• Figure 41A shows a typical illumination scenario, wherein a light fixture 110, e.g. fixed to the ceiling 161 of the exposition area 160, illuminates an artwork 140, e.g. fixed to a wall 163 of the exposition area 160. Accordingly, in such an illumination scenario, the artwork 140 is illuminated under a given angle. Moreover, often, a light fixture 110 does not emit uniform light. Accordingly, as shown in Figures 41B and 41C, when measuring the local illumination values F of the artwork 140 via a light sensor 120, such as a camera, it may be observed that the illumination F of the artwork 140 often is not uniform, e.g. because: the illumination emitted by the light fixture 110 is not uniform, and/or
  • the upper portion of the artwork 140 is closer to the light fixture than the lower portion of the artwork 140.
  • line 1100 in Figure 41C shows an example of the measured light intensity F in the vertical direction y of the artwork 140.
  • a diffuser 115d may be arranged in the light path between the light module 118 and the artwork 140, in order to generate a more uniform illumination emitted by the light fixture 110.
  • line 1102 in Figure 41C shows an example of a more uniform light intensity in the vertical direction of the artwork 140.
• this does not necessarily compensate for the different distances of areas of the artwork 140 from the light fixture 110.
• the optics 115 may be configured to transform the beam pattern of the light emitted by the light module 118 (possibly already transformed via one or more optical elements of the light fixture 110) into a requested beam pattern, which provides the desired illumination of an artwork 140.
  • a beam pattern may be defined by a bi-dimensional matrix of intensity values in a plane perpendicular to the optical axis of the light fixture 110.
• a translucent optical element, such as a diffuser or neutral density filter, is used to perform such a conversion between the beam pattern of the light emitted by the light module 118 and the requested beam pattern.
  • Figure 42 shows an embodiment of the optics 115 of a light fixture 110 in line with the previous description.
  • the optics 115 comprise: a translucent optical element 115 2 ; a first set of optical elements 115i arranged between the light module 118 and the diffuser 115 2 , and optionally a second set of optical elements 115 3 arranged between the diffuser 115 2 and the artwork 140.
  • the first set of optical elements comprises at least one lens and/or reflector 115a/l 15b
  • the optional second set of optical elements comprises a framer, shutter or gobo 115c.
  • This embodiment has the advantage that the optical transfer function of the second set of optical elements 115 3 does not further deform the beam pattern, but only sets given values of the beam pattern to zero.
  • the translucent optical element 115 2 may also be mounted in the aperture of the framer, shutter or gobo 115c, wherein the distance d2 is substantially zero.
  • the translucent optical element 115 2 may be arranged at any position within the optics 115, preferably perpendicular to the optical axis of the light provided by the first set of optical elements, i.e. the second set of optical elements 115 3 may also comprise other optical elements, such as a lens, color filter, etc..
  • Figure 43 shows an embodiment of a tool and a respective method for implementing the translucent optical element 115 2.
  • a tool may be implemented via software instructions, such as an application executed on a processing device, such as a smartphone or tablet, or a web-application.
  • the translucent optical element 115 2 should be mounted in a given plane 1106 at a distance dl from the first set of optical elements 115i.
  • a second set of optical elements 115 3 may be arranged at a distance d2 from the translucent optical element 115 2.
  • the plane 1106 is perpendicular to the optical axis 502 of the light provided by the light module 118 once having passed the first set of optical elements 115i, i.e. the optical axis of the light provided by the first set of optical elements 115 1 .
  • the tool determines at a step 1114 the “original” beam pattern of the light emitted by the light module 118 and having passed the first set of optical elements 115i in the plane 1106 (without the translucent optical element 115 2 ).
  • the tool may receive the beam pattern of the light provided by the light fixture 110 and optionally the optical transfer function of the second set of optical elements 115 3.
  • methods for determining the beam pattern of light are per se well known in the art.
  • the tool may determine at the step 1114 the original beam pattern in the plane 1106 by receiving at the step 1114 directly the beam pattern in the plane 1106, which e.g. may be measured by removing the optional second set of optical elements 115 3.
  • the tool may determine the original beam pattern in the plane 1106 by receiving at a step 1112 a beam pattern measured in a plane perpendicular to the optical axis 502 at a distance being greater than the distance dl, and by calculating at the step 1114 the beam pattern in the plane 1106 via geometrical projection of the measured beam pattern.
  • the tool may determine the original beam pattern in the plane 1106 by receiving at the step 1112 a beam pattern measured in a plane perpendicular to the optical axis 502 at a distance being greater than the distance dl + d2, and by calculating at the step 1114 the beam pattern in the plane 1106 via geometrical projection of the measured beam pattern and as a function of the optical transfer function of the second set of optical elements 115 3 (which may be rather simple in case the second set of optical elements 115 3 comprises only a framer, shutter or gobo).
  • the tool may determine the original beam pattern in the plane 1106 by receiving at the step 1112 a beam pattern measured in a plane being arranged at a given angle with respect to the optical axis 502, e.g. the beam pattern of the illumination of the portion of the wall where the artwork should be positioned (see also Figure 41 A). Accordingly, in this case the tool may first calculate at the step 1114 the beam pattern in a plane perpendicular to the optical axis 502 as a function of the measured beam pattern and the position of the (measurement) plane with respect to the light fixture 110, and then proceed as mentioned for the second or third embodiment, i.e.:
  • such beam patterns may be measured by illuminating a reference surface 1104, preferably a Lambertian surface, with the light fixture 110 and measuring the illumination/luminance of the reference surface, e.g. via a camera 120.
  • the reference surface may be a wall of the exposition area 160 where the artwork 140 should be fixed.
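The geometrical projection used in the embodiments above could be sketched as follows. This is a rough illustration under a point-source model (straight rays from the light fixture along the optical axis); the function and parameter names are assumptions, not from the original disclosure.

```python
import numpy as np

def project_beam_pattern(pattern, d_meas, d_target):
    """Project an intensity matrix measured in a plane perpendicular to
    the optical axis at distance d_meas onto a parallel plane at
    distance d_target (e.g. the plane 1106 at distance d1).

    Each matrix cell is assumed to track the same bundle of rays: its
    footprint shrinks with (d_target / d_meas)**2 while its intensity
    grows with (d_meas / d_target)**2, so the flux per cell is conserved.
    """
    scale = (d_meas / d_target) ** 2  # inverse-square intensity scaling
    return np.asarray(pattern, dtype=float) * scale

# Example: a uniform pattern measured at 2 m, projected back to 0.5 m
measured = np.ones((4, 4))
projected = project_beam_pattern(measured, d_meas=2.0, d_target=0.5)
```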
  • the tool determines at a step 1118 the “requested” beam pattern of the light emitted by the light module 118 and having passed the first set of optical elements 115i in the plane 1106 (with the translucent optical element 115 2 ).
  • a given artwork 140 should usually be illuminated with a given requested illumination pattern, which may be either uniform or custom, e.g. in order to highlight given areas of the artwork and/or to compensate overlapping illuminations of a plurality of light fixtures 110.
  • the tool may determine the modified/requested beam pattern in the plane 1106 by receiving at a step 1116 requested illumination values, e.g.
  • the tool may first calculate at the step 1118 the requested beam pattern in a plane perpendicular to the optical axis 502 as a function of the requested illumination values and the position of the artwork 140 with respect to the light fixture 110, and then proceed as mentioned before, i.e. :
• the steps 1114 and 1118 provide the original beam pattern, which should be converted into a requested/modified beam pattern via the translucent optical element 115 2.
• the properties of the translucent optical element 115 2 may be determined, which permit to transform the original beam pattern into the modified beam pattern.
  • the tool may provide at a step 1122 the technical specification of the translucent optical element 115 2 , which may be used to produce the translucent optical element 115 2.
  • the translucent optical element 115 2 may be mounted in the above described position in the optics 115, i.e. at a distance dl from the first set of optical elements 115i and the procedure terminates at a stop step 1126.
• the translucent optical element 115 2 should be configured to convert an original beam pattern, identified via a first matrix of light intensity values, into a modified/requested beam pattern, identified via a second matrix of light intensity values.
  • the translucent optical element 115 2 is implemented with a translucent material.
• a translucent material may be implemented: e.g. in case of a neutral density filter, with a light absorbing base material 1150, and/or e.g. in case of a diffuser, with opaque and/or scattering particles 1152 dispersed in the base material 1150.
  • the transmittance of a given area of the translucent optical element 115 2 may be modified, e.g, by:
  • the translucent optical element 115 2 is implemented with a translucent material 1150, 1152 comprising a first surface 1154 for receiving a light intensity F 1 and an opposite second surface 1156 for providing an attenuated second light intensity F' .
• the light beam passes through the thickness L of the translucent optical element 115 2 and is attenuated.
• dF(z) = −m(z) · F(z) · dz
  • m(z) is the attenuation coefficient of the translucent optical element
  • F(z) is the light flux entering a given slice.
  • the attenuation m(z) may result from absorption within the base material 1150 and/or absorption/scattering at the particles 1152.
• the following light transmission ratio T of the translucent optical element may be defined: T = F' / F 1 = e^(−m·L), i.e. the ratio between the light intensity F' exiting the second surface 1156 and the light intensity F 1 entering the first surface 1154 (assuming a constant attenuation coefficient m over the thickness L).
• the tool may calculate the requested thickness L of the translucent optical element as: L = −ln(T) / m
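The light transmission ratio T and the requested thickness L mentioned above follow from integrating the attenuation relation dF(z) = −m · F(z) · dz over the thickness of the element, assuming a constant attenuation coefficient m:

```latex
\frac{dF(z)}{F(z)} = -m\,dz
\quad\Rightarrow\quad F(L) = F(0)\,e^{-mL}
\quad\Rightarrow\quad T = \frac{F(L)}{F(0)} = e^{-mL}
\quad\Rightarrow\quad L = -\frac{\ln T}{m}
```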
• the tool may be configured to calculate at the step 1120 a matrix of transmission ratios T(x,y) as a function of the original beam pattern, essentially comprising a matrix of light flux values F 1 (x,y), and the modified/requested beam pattern, essentially comprising a matrix of light flux values F'(x,y).
  • the tool may obtain a first matrix of first light intensity values F 1 , wherein each first light intensity value F 1 is associated with a respective area of the surface 1154 and identifies the intensity of light expected to enter the respective area of said first surface 1154.
• the tool may obtain a second matrix of second light intensity values F' having the same dimension as the first matrix, wherein each second light intensity value F' is associated with a respective area of the surface 1156 and identifies the intensity of light requested to exit the respective area of the surface 1156 when the expected intensity of light enters said first surface.
  • the tool may thus calculate a matrix T(x,y) of light transmission ratios having the same dimension as the first matrix and the second matrix.
• each light transmission ratio T may be calculated as a function of a respective first light intensity value F 1 and a respective second light intensity value F', e.g. in case of absolute intensity values, by calculating the ratio between the respective second light intensity value F' and the respective first light intensity value F 1 .
  • the light transmission ratio T may be between 0 and 100%, i.e. the translucent optical element may only reduce the light flux entering the translucent optical element. Accordingly, in various embodiments, in case one or more of the values of the requested beam pattern are greater than the respective values of the original beam pattern, the intensity of the light emitted by the light module 118 may be increased and a new original beam pattern may be obtained (either by estimating the new beam pattern or by performing new measurements). Similarly, in case all values of the requested beam pattern are significantly smaller than the respective values of the original beam pattern, the intensity of the light emitted by the light module 118 may be reduced and a new original beam pattern may be obtained (either by estimating the new beam pattern or by performing new measurements). For example, in typical applications, the intensity of the light emitted by the light module 118 should be varied such that the values of the matrix of transmission ratios T(x,y) are between 10% and 90%, preferably between 20% and 80%.
  • the respective thickness L(x,y) of the translucent optical element may be calculated as a function of the respective transmission ratio T(x,y) and the attenuation coefficient m of the material of the translucent optical element.
  • the tool may calculate a matrix L(x,y) of thickness values having the same dimension as the matrix T(x,y) of light transmission ratios, wherein each thickness value L is calculated as a function of a respective light transmission ratio T and the attenuation factor m of said translucent material.
  • the matrix L(x,y) of thickness values identifies the requested thickness of the translucent material between the surface 1154 and the surface 1156 in order to obtain the intensity of light requested to exit the surface 1156 when the expected intensity of light enters the surface 1154.
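The calculation at the step 1120 could be sketched as follows, assuming absolute intensity values, a constant attenuation coefficient m, and the Beer-Lambert relation T = e^(−m·L); the function name is an illustrative assumption.

```python
import numpy as np

def transmission_and_thickness(F_in, F_out, m):
    """Compute the matrix T(x,y) of light transmission ratios and the
    matrix L(x,y) of thickness values for the translucent optical
    element, from the original beam pattern F_in, the requested beam
    pattern F_out, and the attenuation coefficient m of the material.

    Uses T = exp(-m * L), i.e. L = -ln(T) / m per matrix cell.
    """
    F_in = np.asarray(F_in, dtype=float)
    F_out = np.asarray(F_out, dtype=float)
    T = F_out / F_in  # must lie between 0 and 1
    if np.any(T > 1.0):
        raise ValueError("requested intensity exceeds original beam; "
                         "increase the light module intensity first")
    L = -np.log(T) / m  # thickness per area
    return T, L

# Example: halve the flux everywhere with m = 0.5 per mm
T, L = transmission_and_thickness([[2.0, 2.0]], [[1.0, 1.0]], m=0.5)
# T == 0.5 everywhere, L == ln(2)/0.5 ≈ 1.386 mm
```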
• the translucent optical element thus has a conversion portion having a dimension corresponding to the dimension of the beam pattern in the plane 1106, or at least to the area of the requested beam pattern in the plane 1106 having non-zero values, which thus (in use) receives light from the first set of optical elements 115i and modifies the beam pattern.
  • the translucent optical element may also comprise a peripheral portion, which e.g. may be used to mount the diffuser in the light fixture 110.
• the second set of optical elements 115 3 may only comprise a framer, shutter or gobo. Such elements have a diameter between 10 mm and 100 mm. Accordingly, the translucent optical element should also have a similar dimension for the conversion portion, plus a peripheral portion for mounting the translucent optical element 115 2 in the light fixture 110. Moreover, as mentioned before, the translucent optical element 115 2 may also be mounted in the aperture of the framer, shutter or gobo 115c. Accordingly, in this case, the conversion area (with variable thickness) should have a profile being complementary to the aperture of the framer, shutter or gobo.
• Figure 45A shows a first embodiment of a translucent optical element 115 2.
  • the translucent optical element 115 2 should comprise a conversion area having a height and width of 100 mm.
• the transmission ratio T(y) is shown in Figure 46A and essentially has a linear profile:
• the thickness L of the translucent optical element may be calculated as: L(y) = −ln(T(y)) / m
• the tool may thus discretize the surface of the translucent optical element in the vertical direction (y) in a given number N of stripes (each one having a width of 100 mm and a height of 100 mm/N) as shown in Figure 45A.
• the tool may also calculate a higher resolution matrix L'(x,y), e.g. via interpolation of the low-resolution matrix L(x,y).
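The interpolation mentioned above could, for instance, be sketched as a plain bilinear upsampling with numpy; this is illustrative only, and a production tool might instead use a dedicated resampling library.

```python
import numpy as np

def upsample_thickness(L_low, factor):
    """Bilinearly interpolate a low-resolution thickness matrix L(x,y)
    to a higher resolution for producing the translucent element."""
    L_low = np.asarray(L_low, dtype=float)
    ny, nx = L_low.shape
    y_new = np.linspace(0, ny - 1, ny * factor)
    x_new = np.linspace(0, nx - 1, nx * factor)
    # interpolate along y for each column, then along x for each row
    tmp = np.empty((len(y_new), nx))
    for j in range(nx):
        tmp[:, j] = np.interp(y_new, np.arange(ny), L_low[:, j])
    out = np.empty((len(y_new), len(x_new)))
    for i in range(len(y_new)):
        out[i, :] = np.interp(x_new, np.arange(nx), tmp[i, :])
    return out

# Example: upsample a 2x2 thickness matrix to 8x8
hi_res = upsample_thickness([[1.0, 2.0], [3.0, 4.0]], factor=4)
```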
  • the resolution of the thickness matrix used for producing the translucent optical element is greater than 25 dpi (dots-per-inch), preferably greater than 100 dpi, e.g. between 200 and 1200 dpi.
• any suitable production method may be used for producing the translucent optical element at the step 1122 as a function of the dimensional data, in particular the thickness matrix L(x,y).
• the translucent optical element 115 2 may be produced via an injection molding process, e.g. for mass production. Conversely, for lower production numbers, a material removal process may be used, in which the thickness of a block of material may be reduced in order to correspond to the requested thickness values L(x,y).
  • the translucent optical element 115 2 is produced via additive manufacturing, i.e. a 3D printing process.
  • the thickness matrix L(x, y ) may refer to the thickness of a coating to be applied to a flat and uniform substrate, such as a glass substrate, e.g. having a thickness between 1 mm and 2 mm.
  • a coating may be a metallic coating, which may be applied, e.g. , via sputter deposition, vacuum evaporation, etc.
  • the base material 1150 of the translucent optical element may be made from a plastic material, such as thermoplastic materials, e.g. polycarbonate (PC) or acrylic/polymethyl methacrylate (PMMA), silicone or glass material.
  • the absorption/scattering within the material 1150 may be obtained via absorbing and/or scattering particles 1152 distributed in the base material 1150.
• the particles 1152 may be Al2O3, SiO2, TiO2, etc.
• various different particles 1152 may be mixed into the base material 1150; for example, a high refractive index silicone (HRI-Silicone), e.g. with a quantity of 0.5 - 5.0 wt%, may be mixed into a low refractive index silicone (LRI-Silicone), which permits to obtain a turbid mixture, which has, after curing, diffusive properties.
  • various embodiments of the present disclosure relate to a method of producing a translucent optical element for a light fixture, wherein the translucent optical element is implemented with a translucent material comprising a first surface for receiving a light radiation and an opposite second surface for providing an attenuated second light radiation, wherein the second surface is arranged at a given variable thickness from the first surface.
• the method comprises the steps of: obtaining a first matrix of first light intensity values, wherein each first light intensity value is associated with a respective area of the first surface and identifies the intensity of light expected to enter the respective area of the first surface; obtaining a second matrix of second light intensity values having the same dimension as the first matrix, wherein each second light intensity value is associated with a respective area of the second surface and identifies the intensity of light requested to exit the respective area of the second surface when the expected intensity of light enters the first surface; calculating a matrix of light transmission ratios having the same dimension as the first matrix and the second matrix, wherein each light transmission ratio is calculated as a function of a respective first light intensity value and a respective second light intensity value; obtaining an attenuation factor of the translucent material; calculating a matrix of thickness values having the same dimension as the matrix of light transmission ratios, wherein each thickness value is calculated as a function of a respective light transmission ratio and the attenuation factor of the translucent material, and wherein the matrix of thickness values identifies the requested variable thickness of the translucent material between the first surface and the second surface in order to obtain the intensity of light requested to exit the second surface when the expected intensity of light enters the first surface.
  • the previous solutions are useful in order to determine a suitable set of light fixtures 110 and/or the respective configuration for illuminating one or more artworks 140 in an exposition area 160.
  • the artwork(s) 140 may also be exposed in exposition areas 160 where background light levels are not always negligible.
• Background light might be due to natural (sun) light, i.e. daylight as a natural light source, or to artificial light sources.
• background illumination (i.e. illumination in addition to the light generated by one or more light fixtures 110 in order to illuminate a given artwork 140) may be provided, e.g. by: natural or artificial light sources located outside the exposition area 160, e.g. outdoors or within another room/exposition area 160, wherein light enters into the exposition area 160 through an opening of the exposition area 160, e.g. a window 164 or a door 165; other light fixtures 110 installed in the same exposition area 160 and intended to illuminate other artworks or the exposition area 160 itself, e.g. the floor 162.
  • background light may vary during the exposition of the object/artwork 140, both in spectrum and intensity distribution.
• natural background light does not have an exact cyclic behavior, e.g. due to a variable daylight intensity during different days of a year or due to variable weather conditions.
  • artificial background light may have variable characteristics, e.g. because lamps degrade and brightness varies over a long term, e.g. months or years.
  • the resulting light (having a given spectrum and intensity), which illuminates a specific object 140 may vary during an exhibition, thereby resulting in a different illumination of an object 140 with respect to the target/requested illumination, e.g. as defined during a light project by a light designer.
  • a different/variable illumination of an artwork 140 may create aesthetic issues, because the appearance of an artwork 140 changes, e.g. over a single day, a season, or a year.
• because background light essentially increases the illumination of an artwork 140, using only the characteristics of the light fixtures used to intentionally illuminate an artwork 140 may result in an unrealistic estimation of light dosimetry.
  • light dosimetry is often used to define the exposure limit in terms of illuminance level and duration of the exposure of an artwork.
  • the characteristics of the (natural and/or artificial) background light should thus be measured, and the characteristics of the light emitted by the light fixtures 110 should possibly be adapted as a function of the measured background light characteristics.
• One of the simplest solutions consists in executing manual measurements of the illuminance level and the light spectra at each object 140 whose illumination should be controlled. This, of course, is limited in its accuracy over time, as the measurements are only performed for a short moment, and in the reproducibility of the measurement, which depends on the operator.
  • automated luminance measurement solutions are preferable.
  • the brightness and/or light spectrum of the light emitted by one or more light fixtures 110 is automatically controlled and optionally varied as a function of the measured luminance characteristics, e.g. in order to regulate the light characteristics to requested brightness and/or color values.
  • the sensors 120 may thus include luminance/light sensors installed in the exposition area 160.
  • a light sensor 120 may be a 2D color sensor configured to acquire pixel data, wherein (at least) a subset of the pixels comprises the object 140 to be illuminated, e.g. a painting, such as a camera configured to acquire a picture/image of the object 140.
  • the sensor 120 and/or the control system 130 may be configured to process the pixel data, e.g. in order to: extract the color data of the subset of pixels associated with the object 140; calculate a mean color value for the subset of pixels; and/or calculate color values for sub-segments of the subset of pixels.
  • the pixel data and/or the calculated color values are indicative of the brightness and the color reflected by the object 140.
  • the brightness data may be used, e.g. by using mere luminosity sensors, such as a monochrome (e.g. grayscale) camera, or by converting the color data into monochrome (e.g. grayscale) pixel data.
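The pixel processing described above could be sketched as follows, assuming the pixel data are available as an RGB array and the subset of pixels associated with the object 140 is given as a boolean mask; the Rec. 601 luma weights are one common choice for the monochrome conversion, not mandated by the disclosure, and all names are illustrative.

```python
import numpy as np

def analyze_artwork_pixels(image_rgb, artwork_mask):
    """Extract the color data of the pixels associated with the object
    140 and calculate a mean color value plus a monochrome brightness
    value for that subset."""
    image_rgb = np.asarray(image_rgb, dtype=float)
    mask = np.asarray(artwork_mask, dtype=bool)
    artwork_pixels = image_rgb[mask]          # subset of pixels
    mean_color = artwork_pixels.mean(axis=0)  # mean R, G, B
    # Convert the mean color into a grayscale brightness value
    brightness = mean_color @ np.array([0.299, 0.587, 0.114])
    return mean_color, brightness

# Example: a 2x2 image where only the left column shows the artwork
img = [[[255, 0, 0], [0, 0, 0]],
       [[255, 0, 0], [0, 0, 0]]]
mask = [[True, False], [True, False]]
mean_color, brightness = analyze_artwork_pixels(img, mask)
```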
  • control system 130 may be configured to regulate the intensity of light emitted by the light fixtures 110 as a function of the brightness data.
  • the target/requested values for the illumination of the object 140 are obtained.
  • these target data may be stored in an artist illumination database 208.
  • these data may also include maximum illumination values, which may be stored, e.g. , in the artwork database 206.
• the requested and/or maximum illumination values may also be determined by (manually or automatically) classifying the artwork 140, wherein requested and/or maximum illumination values are associated with each class of artwork.
  • the characteristics of the illumination of an object 140 may be measured e.g. via: a light sensor positioned in proximity of the object 140, thereby measuring the light received at the object 140; a light sensor positioned in proximity of the light fixture 110, thereby measuring the light emitted by the light fixture 110, which permits to calculate the light received at the object 140 as a function of geometrical data specifying the position of the object with respect to the light fixture(s) 110; a light sensor, such as a camera, configured to measure the characteristics of the light reflected by the object 140; and/or a light sensor, such as a camera, configured to measure the characteristics of the light reflected by a reference surface positioned in proximity of the object 140.
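As an illustration of the second option above (a light sensor in proximity of the light fixture 110, combined with geometrical data), the light received at the object could be estimated as follows, assuming a simplified point-source model with inverse-square falloff and a cosine factor for the angle of incidence; the model and all names are illustrative assumptions, not the method of the disclosure.

```python
import math

def illuminance_at_object(luminous_intensity_cd, distance_m,
                          incidence_angle_deg):
    """Estimate the illuminance (lux) received at the object from the
    luminous intensity measured at the light fixture, the distance to
    the object, and the angle of incidence on the object surface."""
    theta = math.radians(incidence_angle_deg)
    # Point-source model: E = I * cos(theta) / d^2
    return luminous_intensity_cd * math.cos(theta) / distance_m ** 2

# Example: a 2000 cd fixture, 2 m away, light arriving at 30 degrees
E = illuminance_at_object(2000.0, 2.0, 30.0)
```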
  • the selection of the sensor(s) to be used depends usually on the application needs, e.g. with respect to the costs involved and the characteristics of the exposition area 160 and/or the artwork 140.
• the selection of light sensors 120 is performed by a light designer who creates a light plan for a given exposition area 160. For example, the installation characteristics of the light sensors 120 also change based on the selected light sensors.
  • the simplest solution for measuring the luminance of an object 140 is based on sensors installed in proximity of the object 140.
• this approach is often not desirable, because it requires installing an electronic device in the proximity of the object.
  • this may have an aesthetical impact because a sensor 120 has to be placed near or even on the object 140.
  • possible malfunction of the sensor 120 may create a risk for the artwork 140, e.g. due to fire generated by the sensor 120.
• the installation of such sensors 120 is often complex, because a power supply has to be provided or a battery of the sensor 120 needs to be replaced regularly. Generally, such installation/maintenance operations may also damage the object 140.
• the solutions which measure the luminance of the object 140 (i.e. the characteristics of the light reflected by the object 140) have other drawbacks:
  • the reflected light changes with the angle of incidence of the light generated by the light fixture(s) 110 and the angle of observation, both in brightness and spectral response.
• the reflectivity of an object may also vary over time, e.g. due to a variable temperature.
  • sensors 120 are selected manually by a light designer.
• a method is proposed which (at least in part) automatically assesses and evaluates the exposition situation and provides a recommendation as to which kind of light sensor 120 (sensor type, sensor combination, operating parameters, placement of the sensors relative to the art object) should be used.
• such a method may be implemented with a software program to be executed by one or more computers.
  • a computer comprises a data processing unit, a data storage and a display with a graphical user interface (GUI).
  • the method may be implemented with a web application executed by a Webserver and/or an APP to be executed by a mobile device, such as a smartphone or tablet.
• a tool device with implemented software programs and user interface
  • this software tool for selecting the sensors 120 may also be combined with the software tool used to determine the configuration of light fixtures 110 (light fixture selection tool).
  • the light fixture selection tool permits to select the light fixtures and optionally the respective configuration to be used to illuminate a given artwork 140.
  • the light fixture selection tool already uses a 3D model of the exposition area 160, which permits to generate a simulation of the illumination of the exposition area 160.
  • each light fixture 110 has associated data identifying the position of the light fixtures, their intensity and their radiation pattern. Alternatively, similar data may also be entered manually by a light designer.
  • the sensor selection tool may similarly perform simulations of the illumination of the exposition area 160.
  • these simulations may also correspond to the simulations performed for selecting the position of the light fixtures 110.
  • a simplified simulation may be performed by using only a two-dimensional model of the exposition area 160 (more or less as shown in Figure 12).
  • illumination of an artwork 140 may include both the light generated by the light fixture(s) 110 dedicated to a specific object 140 and (natural and/or artificial) background light. Accordingly, light sensors 120 have to be placed in positions which permit an efficient monitoring of the illumination of the artworks.
  • the selection of light sensors 120 may be performed manually or automatically. For example, based on the characteristics of the artwork, the tool may propose for each artwork 140 a suitable sensor. The tool may also propose a plurality of suitable sensors and one of the sensors may be selected manually, e.g. by a light designer. Generally, the tool may not propose sensors 120 for all artworks 140, but only for those artworks 140 which are indeed exposed to a variable background light.
  • Figure 13 shows a possible embodiment of the operation of the sensor selection tool.
  • the sensor selection tool determines the artworks exposed to variable background light by: simulating the illumination of the exposition area 160 when the light fixtures 110 are switched off; and/or simulating the illumination of the exposition area 160 by varying (e.g. between a minimum value and a maximum value) the light intensity of light sources associated with background light, e.g. the (virtual) light sources used to simulate light entering through doors 165 and/or windows 164 of the exposition area 160 (e.g. between the minimum and maximum value of sunlight).
  • an artwork 140 may also be illuminated indirectly by light fixtures 110 used to illuminate other artworks 140. Accordingly, in case these light fixtures may generate a variable illumination, also the combined illumination may be variable.
  • the sensor selection tool may select at a step 402 a given configuration for the natural and artificial light sources in the 3D model and simulate the configuration at a step 404. These steps are then repeated, e.g. for a given number of possible configurations. For example, this is schematically shown via a verification step 406 which verifies whether all (background) light scenarios have been simulated. In case other light scenarios have to be simulated (output “N” of the verification step 406), the tool returns to the step 402 for selecting another light scenario. Conversely, in case all light scenarios have been simulated (output “Y” of the verification step 406), the tool proceeds to a step 408.
  • the tool may determine at the steps 408 and 410 the artworks to be monitored via light sensors. For example, in various embodiments, the tool determines at the step 408 for each artwork 140 a minimum and a maximum value of the background light (generated by other natural and/or artificial light sources) as a function of the illumination simulations. In various embodiments, the tool may use these values in order to determine at the step 410 a set of light sensors to be used. For example, in case the difference between these values (representing the variability of background light) exceeds a given threshold, the tool may determine at the step 410 that a light sensor 120 should be used to monitor the respective artwork 140.
  • the tool may determine at the step 408 for each artwork 140 other illumination characteristics, e.g. the maximum brightness and/or the spectral characteristics. As mentioned before, these illumination characteristics should correspond to given target values and/or be below a maximum threshold value. Accordingly, in this case, the tool may determine at the step 410 whether the illumination characteristics (determined at the step 408) correspond to given target values and/or are below a maximum threshold value. For example, in case the illumination characteristics (determined at the step 408) do not correspond to given target values and/or are not below the maximum threshold value, the tool may determine at the step 410 that a light sensor 120 should be used to monitor the respective artwork 140.
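For illustration, the scenario loop of the steps 402 to 410 can be sketched as follows; the data structures, the coupling factors and the variability threshold below are assumptions of this sketch, not part of the disclosure:

```python
# Sketch of steps 402-410: simulate background-light scenarios and flag
# artworks whose illumination varies more than a given threshold.
# All names and numeric values are illustrative assumptions.

def simulate_scenario(scenario, artworks):
    """Placeholder for the 3D illumination simulation (step 404):
    returns the illuminance (lux) reaching each artwork."""
    return {a: scenario["window"] * f["window"] + scenario["fixtures"] * f["fixtures"]
            for a, f in artworks.items()}

# Assumed coupling factors: how strongly each light source reaches each artwork.
ARTWORKS = {
    "140_1": {"window": 0.8, "fixtures": 0.1},   # near window 164
    "140_3": {"window": 0.0, "fixtures": 0.05},  # shielded from daylight
}

# Step 402: background-light scenarios, e.g. min/max sunlight through window 164.
SCENARIOS = [
    {"window": 0.0,    "fixtures": 100.0},   # night, fixtures only
    {"window": 2000.0, "fixtures": 100.0},   # full daylight
]

VARIABILITY_THRESHOLD = 50.0  # lux; assumed value

def select_monitored_artworks(artworks, scenarios, threshold):
    # Steps 402-406: run every scenario, collect per-artwork illuminance.
    per_artwork = {a: [] for a in artworks}
    for scenario in scenarios:
        for a, lux in simulate_scenario(scenario, artworks).items():
            per_artwork[a].append(lux)
    # Steps 408-410: min/max per artwork; monitor if the spread is too large.
    return {a: (max(v) - min(v)) > threshold for a, v in per_artwork.items()}

monitored = select_monitored_artworks(ARTWORKS, SCENARIOS, VARIABILITY_THRESHOLD)
# artwork 140_1 sees a large daylight swing, artwork 140_3 does not
```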
  • the sensor selection tool may recommend the use of light sensors 120₁, 120₂ and 120₄ for monitoring the artworks 140₁, 140₂ and 140₄, e.g. because:
  • the artworks 140₁ and 140₂ are illuminated by variable natural light entering through the window 164;
  • the artwork 140₄ is illuminated by natural light entering through the door 165 and in part by light generated by the light fixture 110₃.
  • the sensor selection tool determines that the illumination of the artwork 140₃ does not vary significantly during the simulation and thus determines that no light sensor 120 is required to monitor the artwork 140₃, i.e. a light sensor 120 may be omitted for the artwork 140₃.
  • the sensor selection tool may not only determine the artworks 140 to be monitored via a respective light sensor 120, but may also determine at the step 410 which light sensor 120 should be used to monitor a given artwork, or a user may also select preferred light sensors.
  • the tool may also acquire data identifying the position of already installed light sensors 120 within the exposition area 160.
  • the data identifying the position of already installed light sensors 120 within the exposition area 160 may be obtained from data already stored in a database, e.g. in the exposition area database 204, by receiving the data via a manual input, or at least in part automatically, e.g. by acquiring image data of the exposition area 160.
  • the (new) light fixtures 110₁ and 110₂ may already include light sensors configured to measure the light emitted by the respective light fixture 110.
  • the tool may also provide the data, in particular the geometric data (distance and relative position of the artwork with respect to the light fixture), in order to calculate the light received at the object 140 as a function of these geometric data.
  • the tool proposes the installation of a new light sensor.
  • the sensor selection tool may propose the installation of a light sensor 120₄, such as a camera, configured to measure the characteristics of the light reflected by the object 140₄.
  • the tool may be configured to select a light sensor type as a function of the type and/or characteristics of the object 140, and/or the characteristics of the exposition area 160.
  • the sensor selection tool may permit that the user confirms or changes these recommendations, e.g. with respect to the number, type and/or position of the light sensors 120.
  • the light fixture selection tool may be used to determine (at least in part) automatically a light plan specifying the illumination of an exposition area 160 based on the available light fixtures 110 and/or new light fixtures 110 to be installed, possibly also taking into account other natural or artificial light sources, and/or the reflectivity of the objects 140 and/or the exposition area 160.
  • the light fixture selection tool may vary the light plan in order to optimize the overall illumination taking into account the local illumination required at the objects 140.
  • the sensor selection tool may be used to include in such an (automatically generated or manually inserted) light plan also a sensor system used to monitor and optionally adapt the illumination of the artworks 140 based on variable illumination, in particular with respect to possible background light.
  • the sensor selection tool may provide proposals as to which artworks 140 should be monitored via light sensors 120, and/or a light designer can define or change the artworks 140 to be monitored.
  • the sensor selection tool may determine which light fixtures 110 illuminate each object 140 and as a consequence have to be controlled to regulate and/or verify the illumination, e.g. in order to assure that the illumination of an object 140 is below the chosen threshold.
  • the sensor selection tool is also configured to determine which light sensor should be used to monitor a given artwork, e.g. as a function of the characteristics of the artwork 140 and/or the exposition area 160 and/or the respective light fixture used to illuminate the artwork 140 (e.g. because the light fixture already includes a light sensor 120). In various embodiments, the sensor selection tool may also determine the geometrical positioning of the light sensors 120.
  • the light sensors 120 may be used to regulate the illumination of an artwork 140 to requested target values (e.g. in terms of color and/or brightness) and/or to verify whether the illumination is below a given maximum value. Accordingly, the sensors 120 may be connected to the control system 130 in order to control and optionally regulate the luminosity of the respective artwork 140 by sending control commands to the light fixtures.
  • each of the light sensors 120 provides only data being indicative of the illumination of the respective artwork 140.
  • a light sensor 120 positioned in proximity of an artwork 140 is usually not illuminated with the same (maximum) illumination as the artwork 140, because the sensor 120 is usually positioned outside the center of the beam angle.
  • each light sensor has a given sensitivity function, which permits to calculate/estimate the illumination at the artwork as a function of measured (1D or 2D) data, such as brightness and/or color pixel data.
  • the sensitivity of a given light sensor 120 may be a function of the color, the angle with respect to the light fixture, the distance with respect to the artwork 140, etc.
  • the sensor selection tool may thus also perform at a step 412 one or more illumination simulations of the 3D model with the positioned light sensors 120.
  • the tool may acquire the characteristic data of the light sensors 120, which may be stored in a sensor database 218, which e.g. may form part of the local and/or remote database 200, and/or may be determined as a function of the position of the light sensor 120 with respect to the respective light fixture 110 and optionally the respective artwork 140.
  • the sensor selection tool may determine (for one or more different illumination situations) the expected measured value(s) for one or more of the light sensors 120 and the respective expected illumination of the associated artwork 140.
  • the sensor selection tool may determine a respective threshold value for the measured value as a function of the expected illumination at the artwork and the maximum value for the illumination of the artwork.
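If one assumes (as a simplification) that the sensor reading and the artwork illuminance both scale linearly with the light output, the threshold for the measured value follows by proportionality. The function below is an illustrative sketch of this step, not the disclosed method; all names are assumptions:

```python
def sensor_threshold(expected_measured, expected_artwork_lux, max_artwork_lux):
    """Threshold for the sensor reading: the reading that corresponds to the
    maximum permitted illuminance at the artwork, obtained by simple
    proportionality (linearity is an assumption of this sketch)."""
    return expected_measured * (max_artwork_lux / expected_artwork_lux)

# Simulation (step 412) says: the sensor reads 40 when the artwork
# receives 200 lux; the conservation limit for the artwork is 250 lux.
t = sensor_threshold(40.0, 200.0, 250.0)
```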
  • the procedure terminates at a stop step 414.
  • the lighting system 100 may be installed and the respective control information (determined by the light fixture selection tool and/or the sensor selection tool) may be provided to the control system 130.
  • the control system 130 is able to receive the data from the light sensors 120 and verify these data, e.g. verify whether the measured values are above the previously mentioned threshold, and/or control the operation of the light fixtures 110, e.g. reduce the light intensity of a light fixture 110 when the respective measured value exceeds the threshold and/or in order to regulate the measured values to given reference values (e.g. to requested brightness and/or color values).
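A minimal sketch of such a verification/regulation step is shown below; the fixed dimming step and the function name are assumptions, and a real control system 130 would typically regulate toward reference brightness and/or color values rather than only dim:

```python
def regulate(measured, threshold, intensity, step=0.05):
    """One control iteration for a light fixture 110: reduce the fixture
    intensity (clamped at zero) when the sensor reading exceeds its
    threshold; otherwise leave the intensity unchanged.
    (A deliberately simple sketch; values are arbitrary units.)"""
    if measured > threshold:
        return max(0.0, intensity - step)
    return intensity

# Reading above threshold -> dim; reading below threshold -> keep.
dimmed = regulate(60.0, 50.0, 1.0)
kept = regulate(40.0, 50.0, 1.0)
```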
  • various embodiments of the present disclosure relate to a method of selecting at least one light sensor for a lighting system used to illuminate at least one artwork in an exposition area via one or more light fixtures configured to emit light with variable characteristics as a function of a control command.
  • the method comprises the steps of: obtaining a digital model of the exposition area, the digital model including:
    o exposition area data comprising data identifying the dimension of the exposition area;
    o artwork data comprising data identifying the position of the at least one artwork within the exposition area;
    o light fixture data comprising data identifying the position, orientation and illumination characteristics of the one or more light fixtures; and
    o background illumination data comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within the exposition area 160;
  executing a plurality of illumination simulations of the digital model of the exposition area by varying the illumination characteristics of the one or more light fixtures and/or the illumination characteristics of the other natural and/or artificial light sources, and determining for each illumination simulation data identifying
  • the light sensors 120 and the control system 130 may be used to monitor the illumination of an object 140.
  • a system can be used for example in lighting systems for artworks 140, such as paintings, graphics, photographs, lithographs, textile compositions, etc., to protect them against damage caused by excessive illumination or, in general, by irradiation with electromagnetic waves.
  • objects 140 are typically illuminated by light fixtures 110, such as spotlights.
  • light fixtures 110 illuminate the object 140 with light having a given frequency spectrum and given light intensity, which may be selected based on the characteristics of the object 140, such as a respective light sensitivity of the surface of the object.
  • the above-mentioned damages can have various physical or chemical causes.
  • a process of photochemical decomposition can be initiated by absorption of high-energy light quanta of the incident radiation in the molecules of the corresponding object surface. Since specific activation energies have to be exceeded for a corresponding direct cleavage of the molecules, usually short-wave light, especially UV light, can lead to greater damage.
  • light in the infrared wavelength range may also cause damage, for example through thermal expansion of the surface (mechanical tension), through dry damage resulting in cracking, or through phase transitions in plastics or glass materials that are not initially visible, with obvious consequences for the surface quality.
  • the material and fabric properties of the irradiated object surface play an important role for light resistance of the object 140. This is particularly relevant due to the numerous pigments used in artists’ paints. For this reason, these - like binders and carrier materials such as paper or textiles - are usually divided into light sensitivity categories or light fastness classes, which then allow exhibitors to take individual restrictive measures with regard to the exposure of objects to light.
  • measurements can also be made using so-called light data loggers, wherein the light irradiation is recorded periodically via a light sensor 120.
  • the measured values can be temporarily stored in a data memory of the sensor 120 and/or the control system 130, and used to calculate the light exposure through temporal integration.
  • the light sensor 120 and/or the control system 130 may determine/estimate a resulting color deviation and/or a remaining useful life.
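For illustration, the temporal integration performed by such a light data logger, together with a rough remaining-useful-life estimate, might be sketched as follows; the lux-hour dose model and all names are assumptions of this sketch:

```python
def integrate_exposure(samples_lux, interval_s):
    """Temporal integration of periodically logged illuminance samples
    into a cumulative light dose in lux-hours."""
    return sum(samples_lux) * interval_s / 3600.0

def remaining_useful_hours(dose_limit_lux_h, accumulated_lux_h, avg_lux):
    """Rough estimate of the remaining exposure time at the current
    average illuminance level (the dose-limit model is an assumption)."""
    if avg_lux <= 0:
        return float("inf")
    return max(0.0, (dose_limit_lux_h - accumulated_lux_h) / avg_lux)

# One hour of 1 Hz samples at a constant 200 lux -> a dose of 200 lux-hours.
dose = integrate_exposure([200.0] * 3600, 1.0)
```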
  • a special example of such a system is described, for example, in the publication WO 2013/017287 Al.
  • the objects 140 are often not uniformly illuminated.
  • the maximum intensity of the illumination is usually in the center of the object 140, but not at the edge, where the light sensor 120 may be installed. Therefore, the light sensor 120 will often not measure the actual light irradiation of the object 140, but only a representative value.
  • the light sensor 120 to be used to monitor a given artwork 140 should be selected taking into account the characteristics of the artwork 140, the light fixture 110 and/or the exposition area 160.
  • for some artworks 140, low-cost light sensors 120 may be sufficient, while for other artworks 140 more precise solutions may be required to monitor the illumination also of sub-areas of the object 140 and/or in order to take into account the various materials of the artwork 140.
  • the measured data provided by the light sensor 120 and/or the calculated illumination as determined as a function of the measured data may be stored in the data storage unit to provide a track record for the illumination of the object 140.
  • Figure 15 shows a first embodiment of a lighting system 100 configured to monitor the illumination/irradiation of an object 140 with light 500 generated by a light fixture 110.
  • the object 140 is an artwork, such as an oil painting with a layer of paint 144 containing pigments on a carrier material 141 such as canvas.
  • the object 140 may also comprise a frame 143 and a surface 142 formed by the color layer 144 and illuminated by the light fixture 110 with the light 500.
  • the light fixture 110 is a spotlight, but it can also be a floodlight, downlight or another type of light fixture.
  • the light fixture comprises one or more light sources 117, such as LEDs, of which only one light source 117 is schematically shown in Figure 15.
  • the light emitted by the light sources 117 is focused or expanded by optics 115.
  • the optics 115 may comprise a lens, a lens system comprising a plurality of lenses, and/or a reflector.
  • the distance between the light sources/LEDs 117 and the optics 115 or parts thereof may be adjustable, for example, to achieve a desired beam widening or focusing, so that, for example, an optimum and as homogeneous as possible illumination of the surface 142 of the object 140 is achieved.
  • a light sensor 120₁ is integrated in (the housing of) the light fixture 110. Specifically, in the embodiment considered, the light sensor 120₁ is placed in the area of the light 500 emitted by the light sources/LEDs 117, e.g. between the light sources/LEDs 117 and the optics 115, within the optics 115 (e.g. between two lenses or within a reflector), or after the optics 115.
  • the light sensor 120₁ is configured to generate a measurement signal indicative of the brightness of the light 500 passing through its active surface.
  • the light sensor 120₁ may also provide a plurality of brightness values for respective colors/wavelengths.
  • the light sensor 120₁ is connected to a control device.
  • This control device may be a data processing unit 123 of the light sensor 120₁, a data processing unit 113 of the light fixture 110 and/or a data processing unit 133 of the control system 130 (see also the description of Figures 2 to 4).
  • a data processing unit 113 of the light fixture 110 is configured to receive the measurement signal provided by the light sensor 120₁ and to transmit the measured data indicative of the luminance to a data processing unit 133 of the control system 130.
  • the control system 130 may comprise or consist of a smartphone or tablet, and the data processing unit 113 may transmit the measured data via wireless communication interfaces 111 (within the light fixture 110) and 131 (within the control system 130) to the control system 130.
  • the communication may be via a WLAN and/or ZigBee connection.
  • Intermediate stations (not shown) like bridges or even routers (Internet) can be provided, i.e. the communication between the light fixture 110 and the control system 130 may be at least in part wireless.
  • the measurement data are transmitted to the data processing unit 133 of the control system 130, i.e. the data processing unit 133 receives (directly or indirectly) the measurement data from the light sensor 120₁.
  • the data processing unit 133 obtains the measurement data periodically, preferably at short time intervals, such as 1 second.
  • the data processing unit 113 may periodically obtain the measurement data from the light sensor 120₁ and transmit these data to the data processing unit 133, or the data processing unit 133 may send a control command to the data processing unit 113 requesting a new measurement.
  • the duration of the time interval may thus be controllable, e.g. via the data processing unit 113 or the data processing unit 133.
  • the measured luminance depends on the placement of the light sensor 120₁, e.g.: the distance from the light sources/LEDs 117; the angular deviation from an optical axis 502 (see also Figure 17) of the light fixture 110, if, for example, the luminous intensity has a maximum value along this optical axis 502 and decreases with increasing angular deviation from it (i.e., the light fixture does not represent a Lambertian emitter); and the orientation of the light sensor 120₁ with respect to the light radiation 500.
  • the light sensor 120₁ is positioned at a fixed position, where the measured luminance is indicative of (and preferably proportional to) the total light 500 emitted onto the surface 142 of the object 140.
  • the light fixture 110 may comprise a current and/or voltage measuring device.
  • the light sources 117 may be supplied with a power supply generated by a driver/power supply 116, such as an electronic converter.
  • the current fed to the light sources 117 (LEDs) may be measured, e.g. via the feedback circuit 116k.
  • the current intensity generally correlates with the illumination intensity. Accordingly, also in this case, the measured current value may be transmitted to the data processing unit 133 of the control system 130.
  • the data processing unit 133 may thus process the transmitted data in order to determine the luminance of the light sources 117.
  • the data processing unit 113 may already process (at least in part) the measured data and transmit processed data.
  • the control system 130 may comprise a data storage device 132 (or similarly the light fixture 110 may comprise a data storage device 112). As mentioned before, the control system 130 may also be implemented in a distributed mode (see also the description of Figure 8). Accordingly, such a data storage device 132 may be local and/or remote. Specifically, in various embodiments, the data storage device 132 (or 112) stores a mathematical function or a (look-up) table providing a relationship between a measured luminance value and/or a measured current value and a respective actual luminance of the light sources 117 (such as a luminous flux, luminous intensity, etc.). For example, these characteristics of the light sensor 120₁ and/or the current sensor 116k may be stored in a sensor database 218. For example, the values for such a table may be determined in advance by experiments and measurements.
  • this mathematical function or table may also take into account the variation of the relationship between the current measurement and the actual luminance of the light sources 117 due to ageing effects or degradation of the light sources 117, according to which the light output generated for a given current usually decreases over time, i.e. the function or table may also take into account the total operating time of the light sources 117.
  • the processing operation is based on the total operating time of the light sources 117.
  • this information may be monitored within the light fixture 110 and transmitted to the control system 130.
  • the control system 130 may store these data to the light fixture database 202.
  • a timer module 504 of the data processing unit 113 may be configured to monitor the operating time of the light sources 117.
  • the data processing unit 133 of the control system 130, e.g. the smartphone, may thus be configured to determine the luminance or an analogous quantity (luminous flux, luminous intensity, etc.) of the light sources 117 as a function of the measured current value and the total operating time.
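A possible sketch of such a current-to-luminance conversion with an ageing correction is shown below. The calibration table, the linear interpolation and the linear lumen-depreciation model toward 70 % output at an assumed L70 lifetime are all illustrative assumptions, not values from the disclosure:

```python
from bisect import bisect_left

# Assumed calibration table, determined in advance by measurements:
# LED current (mA) -> luminous flux (lm) when the light source is new.
CURRENT_TO_FLUX = [(100, 120.0), (350, 400.0), (700, 750.0)]

def flux_from_current(current_ma, operating_hours, l70_hours=50000.0):
    """Estimate the luminous flux of the light sources 117 from the
    measured current, derated for ageing via the total operating time.
    Linear table interpolation and a linear depreciation toward 70 %
    at l70_hours are simplifying assumptions of this sketch."""
    pts = CURRENT_TO_FLUX
    if current_ma <= pts[0][0]:
        flux = pts[0][1]
    elif current_ma >= pts[-1][0]:
        flux = pts[-1][1]
    else:
        i = bisect_left([p[0] for p in pts], current_ma)
        (x0, y0), (x1, y1) = pts[i - 1], pts[i]
        flux = y0 + (y1 - y0) * (current_ma - x0) / (x1 - x0)
    ageing = 1.0 - 0.3 * min(operating_hours / l70_hours, 1.0)
    return flux * ageing
```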
  • the light sensor 120₁ may also provide a plurality of luminance values as a function of wavelength or for different wavelength ranges, such as in the IR range, the visible range and/or the UV range.
  • several light sensors may be included in the sensor 120₁, each covering a different wavelength range.
  • the data storage device 112 or 132 may also comprise data identifying the distribution of illuminance over the wavelength ranges for types of light sources 117 used in the light fixture 110. For example, these data may be stored in the light fixture database 202. Also this information may have been determined and entered in advance by experiments and measurements. In various embodiments, a shift in the spectrum of the emitted light caused by ageing of light sources 117 may also be taken into account in the function or table. This means that the distribution of luminance over the wavelength ranges is taken into account depending on the current total operating time of the light sources 117.
  • monitoring of the irradiation load of the surface 142 of the object 140 is further improved, since the shorter the relevant wavelengths are, the more harmful the illumination usually is for the object 140. This is expressed, for example, in a fading of red colors or color pigments that absorb the more energetic blue radiation, while blue colors or color pigments reflect the blue radiation more strongly.
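For illustration, per-band sensor readings could be combined into a single damage-weighted value as sketched below; the band weights are invented for the example and do not come from the disclosure or from any standard:

```python
# Assumed relative damage weights per wavelength band: shorter wavelengths
# are weighted more heavily (UV > visible > IR). Illustrative values only.
DAMAGE_WEIGHTS = {"uv": 10.0, "visible": 1.0, "ir": 0.2}

def effective_irradiance(band_values):
    """Combine per-band readings (e.g. from several sensors within 120_1,
    each covering a different wavelength range) into one damage-weighted
    value, reflecting that short-wave light is disproportionately harmful."""
    return sum(DAMAGE_WEIGHTS[band] * v for band, v in band_values.items())

# One unit of UV counts like ten units of visible light in this model:
e = effective_irradiance({"uv": 1.0, "visible": 100.0, "ir": 50.0})
```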
  • Figure 16 shows an example of the iso-intensity lines of the light fixture, wherein two maxima I1, I2 are highlighted, one (I1) in the middle of the surface and a smaller one (I2 < I1) displaced with respect to the first one.
  • the second one may be an artifact resulting from the radiation characteristic of the light fixture. Specifically, in the example, this second maximum occurs in the direction of an upper side 142a of the surface 142, which is positioned closer to the light fixture 110 than the opposite lower side 142b of the object 140.
  • although the intensity maximum I2 is smaller than I1, this intensity can be decisive for the adjustment of the target and/or maximum intensity of the light fixture as a whole, since it may illuminate e.g. an area of the object 140 that is particularly light-sensitive, while the higher intensity maximum I1 may be located on a more light-insensitive part of the object 140.
  • Figure 16 also shows that the intensity profile may be enlarged slightly towards the lower side 142b. The geometrical arrangement causing this type of illumination is shown in Figure 17.
  • the object 140 may be fixed or suspended on or near a wall 163 of a room/exposition area 160, and due to the fact that the light fixture may be mounted at a given height, such as at the top of a room ceiling 161, the object 140 is obliquely illuminated by the light fixture 110.
  • the data processing unit (e.g. 113 or 133) is configured to determine the illumination of the object 140 also as a function of the angle a, i.e. the data processing unit takes into account the oblique incidence.
  • the angle a may be stored, e.g. in the light fixture database 202 or the exposition area database 204.
  • an inclination sensor 120₂ is fixed to or forms part of the light fixture 110, i.e. the inclination sensor 120₂ is configured to measure the inclination of the light fixture and transmit the result, e.g., to the data processing unit 113 or directly to the control system 130.
  • the data processing unit (e.g. 113 or 133) is configured to determine the illumination of the object 140 also as a function of the distance from the wall 163 or the artwork 140. In various embodiments, this distance may be stored, e.g. in the light fixture database 202 or the exposition area database 204. Additionally or alternatively, a distance measurement sensor 120₃ may be fixed to or form part of the light fixture 110, i.e. the distance measurement sensor 120₃ is configured to measure the distance of the light fixture 110 to the wall 163 or the artwork 140 and transmit the result, e.g., to the data processing unit 113 or directly to the control system 130. For example, in various embodiments, the distance measurement sensor 120₃ may be an ultrasonic sensor configured to measure a distance d between the light fixture 110 and the wall 163 to which the object 140 is attached.
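Treating the light fixture as a point source, the illuminance at the artwork can be estimated from the distance d and the angle of incidence a via the standard photometric inverse-square and cosine laws. A minimal sketch (the point-source treatment is a simplifying assumption; the function name is illustrative):

```python
import math

def illuminance_at_artwork(luminous_intensity_cd, distance_m, incidence_deg):
    """Point-source estimate of the illuminance (lux) on the surface 142:
    inverse-square law for the distance combined with the cosine of the
    angle of incidence a (oblique illumination, see Figure 17)."""
    return (luminous_intensity_cd
            * math.cos(math.radians(incidence_deg))
            / distance_m ** 2)

# A 2000 cd fixture, 2 m away, hitting the painting at 30 degrees:
e = illuminance_at_artwork(2000.0, 2.0, 30.0)  # roughly 433 lux
```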
  • control system 130 is configured to receive the data from the sensors 120₂ and 120₃ and store these data to the exposition area database 204.
  • control system 130 may also store other data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of the object 140, e.g. the distance d and/or the angle of inclination a.
  • a lateral inclination angle and/or a lateral offset (perpendicular to the distance d) of the light fixture 110 with respect to the center of the surface 142 may be measured or entered manually, and stored, e.g. in the light fixture database 202 and/or the exposition area database 204 (see e.g. the light fixtures 110₃ and 110₄ in Figure 12).
  • the information on the spatial positioning of the light fixture 110 with respect to the surface 142 of the object 140 thus identifies a position and orientation of the object surface 142 with respect to the light fixture 110, thereby permitting the calculation of the beam path via a mathematical projection or interpolation.
  • the radiation characteristic of the light fixture 110 may be stored, e.g. in the light fixture database 202.
  • This radiation pattern may be measured once, for example after production of the light fixture 110 or once for all luminaires of this type.
  • the radiation characteristic describes a direction-dependent output of light or luminous intensity of a luminaire for a given light intensity emitted by the light fixture.
  • such a radiation pattern may be identified via a two-dimensional intensity distribution in a reference plane perpendicular to the optical axis 502 at a certain distance from the light fixture 110.
  • a plurality of radiation patterns may be stored, e.g. when the light fixture 110 supports a plurality of operating conditions, e.g. when various sub-sets of light sources 117 in the light fixture 110 may be controlled independently and/or the optics 115 are controllable.
  • the radiation pattern of the light fixture 110 may be projected onto the surface 142 as a function of the spatial positioning data. For example, from the distance of the reference plane from the light fixture and the distance of a given point in the reference plane from the point of intersection of the reference plane with the optical axis 502, the data processing unit 133 may determine an angle. This angle can then be used to project the radiation characteristic at the given point onto a point of the surface 142 of the object 140.
  • the data processing unit 133 may calculate a respective distance of the point of the surface 142 from the light fixture 110 (as a function of the spatial positioning data) and determine an expected intensity by taking into account that the intensity decreases with increasing distance due to the beam widening.
  • the radiation characteristic may also include several intensity distributions in multiple planes at different distances from the light fixture 110, so that the data processing unit 133 may calculate the expected intensity distribution on the surface 142 of object 140 via interpolation or extrapolation between the planes.
  • the surface 142 can be divided into a grid or a matrix, whereby the local luminance is calculated for each grid point depending on the position in the relevant plane of the radiation characteristic and the distance from the light fixture 110; a second plane of the radiation characteristic (one plane at a distance "in front" of the considered position on the object 140, another plane at a distance "behind" it) can also be used, and the values then interpolated.
  • the grid points may be arranged at a distance of a few millimeters up to a few centimeters.
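The projection described above can be sketched in a few lines. The following is a simplified illustration only (all function and parameter names are hypothetical, and the radiation characteristic is reduced to a rotationally symmetric function of the off-axis angle rather than a stored 2D plane):

```python
import math

def local_irradiance(points, fixture_pos, axis, pattern, i0):
    """Project a radiation characteristic onto grid points of the surface 142.

    points      -- (x, y, z) grid points on the artwork surface
    fixture_pos -- (x, y, z) position of the light fixture 110
    axis        -- unit vector along the optical axis 502
    pattern     -- off-axis angle (rad) -> relative intensity (1.0 on axis)
    i0          -- on-axis intensity at 1 m, e.g. derived from sensor 120i
    Returns one irradiance value per grid point, applying the inverse-square
    law; the cosine of the local incidence angle could additionally be
    applied for strongly tilted surfaces.
    """
    result = []
    for p in points:
        v = tuple(p[i] - fixture_pos[i] for i in range(3))
        r = math.sqrt(sum(c * c for c in v))
        cos_a = sum(v[i] * axis[i] for i in range(3)) / r
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        result.append(i0 * pattern(angle) / (r * r))
    return result
```

Evaluated over grid points a few millimeters to centimeters apart, this yields a space-resolved distribution like the one shown in Figure 15; interpolation between two stored pattern planes would replace the single `pattern` function.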
  • the data processing unit 133 may determine the actual illumination pattern as a function of the light intensity value actually measured/calculated as a function of the data provided by the sensor 120i or 116k.
  • the distance of the sensor 120i from the light fixture 110 and the optical axis 502 may be used to determine a reference point in the radiation pattern, and the value of the reference point and the measured/calculated light intensity value may be used to determine a multiplication factor for the intensity values determined for the surface 142.
  • the multiplication factor may also be stored, e.g. in the light fixture database 202 and/or the sensor database 218, and the value may be read by the data processing unit.
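The multiplication factor mentioned above could be derived from a single reference reading, for instance as follows (a sketch with hypothetical names; `pattern` stands for the relative radiation characteristic stored in the light fixture database 202):

```python
def calibration_factor(measured, pattern, sensor_angle, sensor_dist):
    """Tie the dimensionless radiation pattern to the actual drive level:
    compare the relative value the pattern predicts at the sensor's known
    position (relative intensity over squared distance) with the reading
    actually provided by the sensor 120i, and return the scale factor to
    apply to all intensity values computed for the surface 142."""
    expected = pattern(sensor_angle) / (sensor_dist ** 2)
    return measured / expected
```

All grid-point intensities computed from the pattern are then multiplied by this factor to obtain absolute values.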
  • the system shown in Figure 15 takes into account not only the spatial positioning between the light fixture 110 and the surface 142 of the object 140, but also how the optical axis 502, i.e. the main beam direction, of the light fixture 110 is aligned with respect to the surface 142, since the radiation characteristic itself is related to that optical axis or main beam direction.
  • the main beam direction of the light fixtures is centered with respect to the center of the surface 142 of the object 140.
  • the data processing unit 133 (possibly in collaboration with the data processing unit 113) is able to calculate a local intensity for any position on the surface 142 of the object 140, e.g. in order to obtain a distribution as shown in Figure 15.
  • a plurality of distributions for different wavelengths or ranges of wavelengths may be determined.
  • these distributions of intensity values may be used by the control system 130 in order to verify whether the global and/or local intensity/illumination values exceed one or more maximum values and/or in order to regulate the intensity values to requested values.
  • the local intensity at the surface 142 should thus not exceed a certain maximum limit in order to prevent damage, e.g. to the paint application.
  • a given surface may not have a single global maximum irradiation value (or plural maximum values for different wavelengths), but each of a plurality of individual surface areas (e.g. the locally different pigments or materials of the object) may have a respective maximum value (or plural maximum values for different wavelengths).
  • inorganic pigments such as zinc white or ultramarine are generally more light-resistant than organic dyes.
  • the artwork database 206 may thus have stored for a given artwork 140 data identifying these maximum values, such as sensitivity values associated with individual positions on the surface 142 of the object 140 which may be determined e.g. as a function of paint application, pigments, binders and carriers etc.
  • the data processing unit 133 thus calculates the local intensity (or local intensities for respective wavelengths) for at least one of the numerous positions as described above and then compares it with a respective maximum value (or maximum values for respective wavelengths), e.g. specified via the sensitivity information for this position.
  • the data processing unit 133 may generate a signal, such as a warning signal, as a function of the comparison.
  • the sensitivity information includes an assignment of the positions on the surface 142 of the object 140 to maximum values which can be defined, for example, via the blue scale (ISO 1-8) or a light sensitivity category classification according to Colby, Karen M.: "A Suggested Exhibition/Exposure Policy for Works of Art on Paper", in: The Lighting Resource - Montreal Museum of Fine Arts, (accessed on 22.1.2019) at http://www.lightresource.com/research-papers/A-Suggested-Exhibition-Exposure-Policy-for-Works-of-Art-on-Paper.pdf.
  • for example, the following sensitivity data may be used:
  • the maximum values may refer both to instantaneous limit values (short term) and to cumulative/average limit values (long term):
  • Category 1 may include: most organic dyes, magenta, verdigris (copper acetate), chrome yellow, chrome red, smalt, pastel, clay papers, older color photographs, Polaroids, felt-tip pen, most natural textile colors, feathers, colored printing inks, turmeric yellow, etc.
  • Category 2 may include: manganese blue, Prussian blue, zinc yellow, cadmium yellow, cinnabar, carmine red, groundwood paper and board, new photo prints, Kodachrome slides, Indian yellow, etc.
  • Category 3 may include: ivory black, titanium white (rutile), zinc white, cobalt violet, ultramarine, cobalt blue, chrome green, malachite, earth tones, Naples yellow, lead-tin yellow, orpiment, good rag paper, carbon printing inks, black-and-white photos on gelatin, indigo on wool, plastics (PE), etc.
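A category-based limit check could be structured as below. Note that the numeric limits here are purely illustrative placeholders loosely in line with common museum-lighting guidance; the patent refers to the blue scale (ISO 1-8) and Colby's classification but does not reproduce a concrete table here:

```python
# Illustrative limits only -- NOT values taken from the source text.
MAX_LUX = {1: 50, 2: 150, 3: 300}                         # short-term limit, lx
MAX_LUX_HOURS_YEAR = {1: 15_000, 2: 150_000, 3: 600_000}  # long-term, lx*h/year

def check_limits(category, lux_now, lux_hours_ytd):
    """Return the list of violated limit names ('short-term'/'long-term')
    for a surface area of a given sensitivity category (hypothetical
    helper mirroring the instantaneous/cumulative distinction above)."""
    violated = []
    if lux_now > MAX_LUX[category]:
        violated.append("short-term")
    if lux_hours_ytd > MAX_LUX_HOURS_YEAR[category]:
        violated.append("long-term")
    return violated
```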
  • the data processing unit 133 may thus integrate the local intensity values recorded, e.g., since the beginning of the year, in order to have the same standard of comparison.
  • the data processing unit 133 may determine the opening hours of the exposition area, which e.g. may be stored in the exposition area database 204, and multiply the calculated intensity (or an average value thereof) by this time.
  • a long-term limit may also refer to the limit of a "first bleaching effect".
  • the data processing unit 133 may estimate the future (global or local) intensity values as a function of the previous (global or local) intensity values. In response, the data processing unit 133 may send one or more control commands to the light fixture(s) 110 in order to reduce the power provided to the light sources 117. In extreme cases, the data processing unit 133 may send one or more control commands to the light fixture(s) 110 in order to switch off one or more light sources 117/light fixtures 110. Specifically, in various embodiments, the control system 130 may perform these operations already when a single limit value associated with a very sensitive partial area of the surface 142 of the object 140 is exceeded.
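The predictive reduction just described could be sketched as follows (a hypothetical helper; it extrapolates the year-to-date dose linearly over the remaining opening hours and returns a relative power level, 1.0 meaning unchanged and 0.0 meaning switch off):

```python
def dimming_command(lux_hours_so_far, hours_elapsed, hours_per_year, budget):
    """Estimate the future cumulative exposure from past exposure and,
    if the yearly budget would be exceeded, return a dimming factor
    such that the remaining opening hours just consume what is left."""
    if hours_elapsed == 0:
        return 1.0
    rate = lux_hours_so_far / hours_elapsed            # average lx while open
    projected = lux_hours_so_far + rate * (hours_per_year - hours_elapsed)
    if projected <= budget:
        return 1.0                                     # no action needed
    remaining_budget = budget - lux_hours_so_far
    if remaining_budget <= 0:
        return 0.0                                     # extreme case: switch off
    return remaining_budget / (rate * (hours_per_year - hours_elapsed))
```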
  • the maximum/sensitivity values may also be determined automatically.
  • such maximum/sensitivity values may be determined via a further sensor 120₄, e.g. in the form of a camera, which may also be part of the light fixture 110.
  • the camera may have an associated data processing unit 123, which processes the image data of the surface 142 of the object 140 and assigns individual positions on the surface 142 to a respective category, or directly to a respective limit value, via image data processing.
  • the respective data may then be provided to the control system 130, e.g. by storing respective data to the artwork database 206.
  • the artwork database 206 may have stored characteristics data, e.g. the maximum/sensitivity values, for a plurality of artworks 140.
  • such an artwork database 206 may be stored in a cloud 506, but the database 206 may also be stored, e.g., in the control system 130.
  • the use of a remote artwork database 206 is particularly useful when an artwork 140 may be moved from one museum to another.
  • the control system 130 may determine the previously mentioned maximum values.
  • the control system 130 may obtain the data for a given artwork 140, e.g. via an image recognition operation or by using a unique code.
  • the control system 130 may comprise its own camera 508, which can be used to perform the image recognition operation or to read an identifier 510 attached to the object 140, such as a QR code.
  • the control system 130 may comprise other reader devices, such as an NFC reader with which it reads an NFC tag attached to the object 140, which contains a corresponding unique identification.
  • the control system 130 may also take into account the intensity of background illumination. Specifically, when using light sensors 120 configured to determine the light emitted by the light fixture 110 (in particular near the light fixture 110), such a light sensor 120 is unable to measure the background illumination, which also illuminates the object 140. Accordingly, when taken alone, such a light sensor 120 may be used to monitor artworks 140 not exposed to significant and/or variable background illumination. Alternatively, the control system 130 may also be connected to a light sensor configured to monitor the background illumination generated by other artificial or natural light sources. For example, such an additional light sensor may be implemented with a light sensor positioned near the artwork 140, or other light sensors 120 configured to measure the background illumination in the exposition area 160, preferably near the artwork 140. For example, such a light sensor 120 may be a camera, e.g. the camera 508, or a light sensor/camera integrated in or positioned near the light fixture (see also Figure 14).
  • the data processing unit 133 may be configured to calculate actual (global and/or local) illumination values by summing the calculated intensity at the artwork 140 due to light generated by the light fixture (as e.g. measured by the sensor 120i) and the intensity of background light. For example, if the ambient light alone should cause the limit values to be exceeded, the data processing unit 133 may generate a warning signal. This may apply in particular to UV radiation, as this causes particularly severe damage to inks and materials, i.e. the background light sensor may also provide plural intensity values for different wavelengths or wavelength ranges.
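A minimal sketch of that summation, assuming per-band readings in simple dictionaries (the band names such as 'uv' and 'vis' are placeholders, not identifiers from the source):

```python
def total_illumination(fixture_lux, background_lux, limits):
    """Sum the fixture's contribution and the background contribution per
    wavelength band and report which bands exceed their limit. A band
    can be flagged even when the fixture contributes nothing, covering
    the 'ambient light alone' warning case described above."""
    warnings = []
    for band, limit in limits.items():
        total = fixture_lux.get(band, 0.0) + background_lux.get(band, 0.0)
        if total > limit:
            warnings.append(band)
    return warnings
```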
  • Figure 18 summarizes the operation of various embodiments of the data processing unit 133 shown in Figure 15.
  • in a step 520, the light sensor 120i or the current sensor 116k provides data indicative of the intensity of the light emitted by the light sources 117 (or respective intensity values for a plurality of wavelengths/wavelength ranges) to the data processing unit 133, i.e. the data processing unit 133 receives the data indicative of the intensity of the light emitted by the light sources 117.
  • the data processing unit 133 obtains, e.g. via the light fixture database 202 and/or the exposition area database 204, the data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of object 140.
  • the data processing unit 133 obtains, e.g. via the light fixture database 202, data identifying the spatial radiation characteristics of the light fixture 110.
  • the data processing unit 133 determines a global and/or a plurality of local intensity values (or respective intensity values for a plurality of wavelengths/wavelength ranges).
  • the local intensity values may be calculated for a plurality of positions on the surface 142 of the object 140 as a function of:
  • the measured intensity value (or measured intensity values for a plurality of wavelengths/wavelength ranges).
  • the step 526 may also determine the global and/or local intensity value as a function of data identifying an intensity of background illumination.
  • the data processing unit 133 obtains, e.g. via the artwork database 206, data identifying the global and/or local sensitivity of the surface 142 of the object 140 to be irradiated (or global and/or local sensitivity values for a plurality of wavelengths/wavelength ranges). These data may include or may be used to calculate global and/or local maximum values (or global and/or local maximum values for a plurality of wavelengths/wavelength ranges), which may relate to short-term/instantaneous values and/or long-term values.
  • the data processing unit 133 compares the calculated global and/or local intensity values with the respective global and/or local maximum values.
  • the data processing unit 133 may generate one or more control commands for the light fixture(s) 110 in response to this comparison, for example, in order to adjust its power supply.
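The steps above (measure, project, compare, adjust) can be condensed into one highly simplified pass. This sketch assumes the projection has already been reduced to a stored relative intensity per grid point; all names are hypothetical:

```python
def monitoring_step(measured, rel_pattern, local_limits, background=0.0):
    """One pass of the Figure-18 flow in miniature: scale the stored
    relative radiation pattern (one value per grid point) by the measured
    emission, add the background share, compare point-wise with the local
    maximum values, and return the dimming factor that brings every
    point within its limit (1.0 = no change, 0.0 = switch off)."""
    factor = 1.0
    for rel, limit in zip(rel_pattern, local_limits):
        contrib = measured * rel               # fixture's share at this point
        if contrib + background > limit:
            if contrib <= 0.0:
                return 0.0                     # background alone exceeds the limit
            factor = min(factor, max(0.0, (limit - background) / contrib))
    return factor
```

As noted in the text, a single very sensitive partial area (one low entry in `local_limits`) is enough to drive the control action for the whole fixture.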
  • the operation of the data processing unit 133 may also be implemented in a data processing unit 123 of a sensor 120, e.g. the sensor 120i, or a data processing unit 113 of the light fixture 110, or in a distributed manner wherein the above operations are executed by at least two of such data processing units 113, 123 and 133 (see also the description of Figures 2, 3 and
  • the lighting system 100 described with respect to Figure 15 is able to monitor the irradiation of an object 140 with light from a light fixture 110.
  • the system comprises a light fixture 110 (or a plurality of light fixtures) with one or more light sources 117, which together emit light with a given spatial radiation characteristic.
  • the light fixture 110 may be a spotlight comprising one or more LEDs (light-emitting diodes) or similar.
  • the light fixture 110 is preferably a physical unit having a housing in which the light sources 117 and other functional units are accommodated, such as in particular an optical system 115 for expanding or focusing, e.g. lenses and/or reflectors arranged in a corresponding manner.
  • the system comprises also a data processing unit operatively connected via a suitable communication interface to the light fixture(s) 110 or possibly integrated in the light fixture.
  • the light fixture 110 comprises a data processing unit 113 configured to exchange data with a data processing unit 133 of a control system 130.
  • the connection can be a physical data line (including cable) or a wireless connection, or a combination of both.
  • the light fixture 110 and the data processing unit may be connected via the Internet, for example, via an on-site router, switch and access point.
  • a bridge, such as DALI, KNX or ZigBee, can also be used to connect a building management system to the light fixture 110.
  • the data processing unit may also form part of a smartphone, which may correspond to or form part of the control system 130.
  • instead of a smartphone, another mobile (hand-held) control and display unit may be used, which enables a user of the system to monitor and control the system by showing respective data on a display.
  • the data processing unit has an associated first memory, e.g. an exposition area database 204, connected to the data processing unit, in which information about the spatial positioning of the light fixture 110 with respect to the object 140 is stored.
  • this information on the spatial positioning may include data concerning a (horizontal) distance between the light source(s) and the surface 142 of the object 140 or another reference point, and an angle of inclination at which the light fixture 110 is positioned with respect to a surface normal or from a plane of the surface 142.
  • the information may include mere coordinates of the light fixture 110 and the surface 142 of the object 140 in a reference coordinate system.
  • the information contains data that allow the data processing unit to make a geometrical calculation of the radiation from the light fixture 110 to positions on the surface 142 of the object 140.
  • the system, e.g. the light fixture 110, may comprise at least one of: a distance sensor, preferably an ultrasonic sensor, configured to measure a distance between the light fixture 110 and the surface 142 or a reference point on it and to transmit the measurement result to the data processing unit; and an inclination angle sensor, preferably provided in the light fixture 110 or on the surface 142 of the object 140, and configured to measure an inclination angle at which the light fixture 110 is positioned with respect to a surface normal or a plane of the surface 142, and to transmit the measurement result to the computing unit.
  • the data processing unit has an associated second memory, e.g. the light fixture database 202, in which the above-mentioned spatial radiation characteristics of the light source(s) 117 or the light fixture 110 as a whole (including optics) are stored.
  • the radiation characteristic refers to a directional output of the light of a light fixture 110 with respect to a value determined for a main direction along an optical axis, whereby the radiation characteristic can be influenced by apertures, lenses, louvres or reflectors of the light fixture 110.
  • the beam pattern can be symmetrical (typically for spotlights or downlights) or asymmetrical (typically for floodlights).
  • the radiation characteristics refer to an optical axis of the light fixture 110.
  • an intensity distribution within a plane perpendicular to this optical axis may be given, with the intersection of the axis with the plane representing the reference point.
  • the orientation of the optical (or geometric) axis of the light fixture 110 with respect to the surface 142 of the object 140 may also be part of the data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of the object 140 and may be stored in the first memory.
  • intensities can be stored in an angle-dependent manner (relative to the optical axis).
  • the spatial radiation characteristic of the light source(s) includes data with a two-dimensional distribution of intensities on one surface, or on several surfaces at different distances from the light source(s), perpendicular to an optical axis of the light emitted by the light sources of the luminaire.
  • the data processing unit may calculate the local intensity at a given position on the surface 142 of the object 140 by means of mathematical projection or interpolation or extrapolation from one surface or between the several surfaces.
  • the data processing unit is configured to calculate and output a local intensity of the light incident at the respective position for a plurality of positions on the surface 142 of the object 140 from the information on the luminance, from the spatial radiation characteristic of the light source(s) and from the information on the spatial positioning of the luminaire relative to the surface of the object.
  • such a calculation is essentially geometric, whereby the intensity determined at each position on the surface 142 takes into account the absolute distance from the light fixture 110, the inclination of the surface 142 with respect to the direction towards the light fixture 110, and the direction-dependent attenuation of the radiation relative to the aligned optical axis of the light fixture 110.
  • a further factor is the light intensity transmitted by the light fixture.
  • the data processing unit is able to monitor the irradiation of the surface 142 of the object 140 with local resolution.
  • the local irradiation determined in this way may be compared, for example, with local light sensitivities. For example, smaller, but particularly light-sensitive surface areas are thus taken into account much more appropriately in the monitoring process.
  • the data processing unit may modify the strength, orientation and, if necessary, also the wavelength range of the light source(s) in response to this comparison.
  • the system comprises a (photographic) camera with which the surface 142 of the object 140 can be scanned in order to obtain color and/or brightness values for positions on the surface 142, wherein the data processing unit is adapted to receive the position-dependent color and/or brightness values from the camera and to calculate a limit value for each of the positions on the basis of a fixed predetermined relationship between the color and/or brightness values and a sensitivity.
  • this camera may be a camera of the control system 130, e.g. in the form of a mobile unit wirelessly connected to the light fixture 110.
  • the control system such as a mobile unit wirelessly connected to the light fixture 110, may comprise a camera or a device for near-field communication, which can be used to read an identifier attached to the object 140, allowing access to the sensitivity information for the object 140 to be irradiated stored in a fourth memory, e.g. an artwork database 206.
  • the identifier can be, for example, a QR code or a correspondingly programmed NFC tag, which can be read with a smartphone, for example, and is individually assigned to the object 140.
  • This fourth memory is preferably set up in a generally accessible cloud. If, for example, the object 140 is moved to another exhibition or museum, the new user can access the sensitivity information for the surface that has already been "mapped" earlier, without having to recreate it.
  • the system may also comprise a timing device 504, e.g. a clock or timer, which is configured to output an operating time for the light source(s) 117 in which the light source(s) 117 are operated since their commissioning for irradiating the object 140, and a current and/or voltage measuring device 116k, which is designed to measure a current and/or voltage with which the light source(s) 117 is/are operated.
  • a function or table may be stored in a third memory, such as a sensor database 218, with which values of an illuminance are assigned to a combination of values from a current and/or a voltage and/or an operating time of the light sources. For example, in this way, the age-related decrease of the radiation power may be mapped.
  • indirect determination of the intensity of light emitted by the light fixture 110 is usually more cost efficient compared to a light sensor 120i.
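The sensor-less estimate based on the stored function or table could look as follows. This is an illustrative model only: the table format and the 0.5 %-per-1000-hours depreciation are assumptions, since the text only states that a function or table mapping current/voltage/operating time to illuminance is stored:

```python
def estimate_emitted_intensity(current_a, op_hours, cal, decay_per_khr=0.005):
    """Indirect 'sensor 116k' estimate: linearly interpolate a stored
    current -> intensity calibration table (dict {amps: intensity}) and
    derate the result for age-related lumen depreciation over the
    operating time reported by the timing device 504."""
    pts = sorted(cal.items())
    if current_a <= pts[0][0]:          # clamp below the table
        base = pts[0][1]
    elif current_a >= pts[-1][0]:       # clamp above the table
        base = pts[-1][1]
    else:                               # linear interpolation inside
        for (c0, i0), (c1, i1) in zip(pts, pts[1:]):
            if c0 <= current_a <= c1:
                base = i0 + (i1 - i0) * (current_a - c0) / (c1 - c0)
                break
    return base * (1.0 - decay_per_khr) ** (op_hours / 1000.0)
```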
  • the first, second and/or third memory may be installed in the smartphone or in a cloud accessible by the smartphone. Due to the fact that the data processing unit may be remote, such as a remote server, it is possible to make the underlying calculation for monitoring the irradiation of an object 140 available to third parties via suitable interfaces (Software as a Service (SaaS) or Platform as a Service (PaaS)).
  • the information used to determine the intensity of light emitted by the light fixture 110 as a function of the measured light intensity (sensor 120i), or the power supply (sensor 116k) may be used when simulating the operation of the sensors 120.
  • control operations for controlling the operation of the light fixture 110 as a function of the illumination of the artwork 140 may be used with other light sensors 120 and/or control systems 130 described herein.
  • the solution for obtaining the sensitivity data of an artwork 140 which e.g. are used to determine local and/or global target and/or maximum illumination values, may be used to provide these data, e.g. by storing these data into the artwork database 206, to any control system 130 using such data.
  • the light system comprises the light fixture comprising one or more light sources, which together are configured to emit light with a spatial radiation characteristic, a data processing unit connected to the light fixture and configured to obtain information on an intensity of the light emitted by the light sources, a first memory connected to the data processing unit, in which information about the spatial positioning of the light fixture with respect to a surface of the object is stored, and a second memory connected to the data processing unit, in which information about the spatial radiation characteristic of the one or more light sources or the light fixture is stored.
  • the data processing unit is configured to calculate and output a local intensity of the light incident at the respective position for a plurality of positions on the surface of the object as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture. Possible embodiments of this solution are detailed at the following point "Example 5".
  • art lighting should assure that an artwork or object 140 is illuminated with a given target and/or maximum illumination, e.g. in order to avoid or reduce the risk that the artwork is damaged.
  • the light 500 emitted by a light fixture 110 often comprises wavelengths ranging from UV to IR.
  • damage to an artwork 140 may occur if the global and/or local light intensity, or the light intensity at a certain wavelength or wavelength range, is too high, thereby inducing, e.g., damage to the pigments or other materials.
  • one of the simplest solutions consists in installing a light sensor 120 next to the object 140.
  • such a light sensor 120 would thus not directly measure the illumination of the object 140, but would still permit calculating, or at least estimating, the illumination of the artwork 140 as a function of the characteristics of the light fixture 110 and the geometrical positions of the light fixture 110, the artwork 140 and the light sensor 120.
  • a light sensor 120 installed near the artwork 140 is often not a feasible solution, e.g. because a power supply has to be provided to the light sensor 120 and some kind of data connection is required in order to transmit the measured data to the control system 130.
  • the global and/or local illumination of an artwork is calculated as a function of: - the intensity (possibly for plural wavelengths/wavelength ranges) of the light 500 emitted by the light fixture 110 as measured directly within the light fixture 110 (or at least in proximity thereof),
  • the illumination of an artwork 140 is calculated/estimated as a function of the data provided by a light sensor 120 configured to measure the light 600 reflected by the artwork 140 itself.
  • the light sensor 120 may provide global and/or local (i.e. space-resolved across the object) measurement values for a single or plural wavelengths/wavelength ranges.
  • the light sensor 120 may be a photometric device, such as a diode (e.g. for an overall/global measurement) or a (2D) camera, such as a CCD or CMOS sensor, which measures local intensity values.
  • a camera 120 may be:
  • a monochromatic, e.g. grayscale, camera providing only intensity values for a single wavelength range;
  • a full RGB camera (or a camera with other color patterns, such as CMYK), wherein each pixel has a plurality of sensor elements/light sensors associated with respective wavelengths/wavelength ranges (e.g. due to given color filters installed before each sensor element), thereby providing measured color values indicative of the intensity of a plurality of wavelengths/wavelength ranges for each pixel;
  • a "reduced" RGB camera (or a camera with other color patterns, such as CMYK), wherein each pixel has a single sensor element/light sensor associated with a respective color/color range; in this case, the sensor elements/light sensors are usually arranged according to a color pattern, such as a Bayer pattern, which still permits calculating/estimating, e.g. via interpolation, color values indicative of the intensity of a plurality of wavelengths/wavelength ranges for each pixel.
  • the light sensor 120 may be installed within, fixed to or installed in the proximity of the light fixture 110 used to illuminate the artwork 140 to be monitored, because in this way the light sensor 120 may be supplied by the power supply of the light fixture 110, e.g. via the electronic converter 116. Moreover, in this way, the light sensor 120 may also use the communication interface of the light fixture 110 in order to exchange data with the control system 130.
  • when using a camera 120, i.e. a matrix of sensor elements/light sensors, usually these sensor elements/light sensors have associated respective color filters, e.g. arranged according to a Bayer filter.
  • preferably, the spectral characteristics of the filters and the spectral characteristics of the light 500 emitted by the light fixture 110 are matched.
  • for example, when using an RGB camera 120, i.e. a camera having a plurality of filters for a red wavelength range, a green wavelength range and a blue wavelength range, respectively, the light fixture 110 should emit light with peak values in these red, green and blue wavelength ranges or, vice versa, the red, green and blue wavelength ranges of the color filters of the camera 120 should be selected to correspond to the peak emission values of the light fixture 110.
  • the light fixture 110 may comprise light sources 117, such as LEDs, emitting light with peak values corresponding to the wavelength ranges of the camera 120, such as red, green and blue LEDs, preferably having peak emission values corresponding to the wavelength ranges of the red, green and blue filters.
  • the measurement data provided by the camera 120 may thus be correlated directly to the light emitted by a given subset of light sources 117 within the light fixture 110.
  • for example, when detecting that the measured red light exceeds a maximum threshold value, only the power supply of the light sources 117 emitting red light, such as red LEDs, may be adapted, thereby simplifying the control operation within the control system 130.
  • conversely, when one light sensor of the camera 120 measures the intensity of two or more light sources 117 (e.g. LEDs), i.e. when its transmission window overlaps with the intensity peaks of two or more light sources 117, the respective relative intensities of these light sources have to be taken into consideration when setting the intensity of the light fixture.
  • a similar issue occurs also when using a monochromatic camera, or when using only a global light sensor (such as a diode providing a single brightness value for a given wavelength range, or a plurality of diodes with respective filters, each providing a global brightness value for a respective different wavelength range), because, also in this case, the spectral characteristics of the light sources 117 may or may not be matched to the spectral characteristics of the sensor elements.
  • when the spectral characteristics of the light sensor 120 are not matched to the spectral characteristics of the light fixture 110, these spectral characteristics may be taken into account in order to calculate/estimate the actual illumination of the light sensor 120, e.g. by scaling the measured values as a function of the spectral characteristics of the light fixture 110 and the spectral characteristics of the light sensor 120.
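A minimal sketch of such a scaling, assuming the fixture spectrum and the sensor responsivity are available as sampled arrays on a common wavelength grid (all spectral values here are invented for illustration):

```python
# The sensor reading equals the true illumination times the overlap of the
# (normalized) fixture spectrum with the sensor responsivity; dividing by
# that overlap recovers an estimate of the true value.

def estimate_illumination(measured_value, fixture_spectrum, sensor_response):
    total = sum(fixture_spectrum)
    overlap = sum(f * s for f, s in zip(fixture_spectrum, sensor_response)) / total
    return measured_value / overlap

# Fixture emits equally in two bands; the sensor sees only half of band 2,
# so a reading of 0.75 corresponds to a true illumination of 1.0.
fixture = [1.0, 1.0]
sensor = [1.0, 0.5]
estimate = estimate_illumination(0.75, fixture, sensor)
```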
  • the spectral characteristics of the light fixture 110 and/or subsets of light sources 117 of the light fixture 110 may be stored, e.g. in the light fixture database 202, and the spectral characteristics of the light sensor 120 may be stored, e.g. in the sensor database 218.
  • the control system 130 (or in general any data processing unit of the lighting system 100) may use these data in order to: estimate the actual illumination of the sensor 120; and/or generate the control commands for the light fixture 110 in order to regulate the spectral characteristics of the light emitted by the light fixture 110 as a function of the measured data provided by the light sensor 120.
  • the data identifying the spectral characteristics of subsets of light sources 117 of the light fixture 110 and the data identifying the spectral characteristics of the light sensor 120 may also be used in other embodiments described herein wherein the control system 130 uses data provided by a light sensor 120, e.g. in order to verify and/or control the operation of the light fixtures 110.
  • these spectral characteristics data are helpful when the spectral characteristics of the light sensor 120 and the light fixture 110 are not matched.
  • these data are purely optional, because instead of storing these spectral characteristics data explicitly, they may be identified implicitly via sensitivity data of the light sensor 120 for a given light fixture 110, e.g. in the form of a mathematical function or table, which associates given measured values (for a single or plural wavelengths/wavelength ranges) with respective intensity values for a plurality of wavelengths/wavelength ranges.
  • the spectral characteristics of the light fixture 110 may be used to estimate/calculate also the intensities in the UV and/or IR range.
  • intensity values for a plurality of wavelengths or wavelength ranges may refer to the actually measured values or to intensity values (for the originally measured or different wavelengths or wavelength ranges) calculated/determined as a function of the above-mentioned spectral characteristics.
  • the respective calculation may be implemented already in the sensor 120, e.g. via the data processing unit 123, or the control system 130, e.g. via the data processing unit 133.
  • the light sensor 120 is configured to measure characteristics of the light 600 reflected by the artwork 140.
  • the intensity of the reflected light 600 does not provide direct information about the intensity of the light at the object 140.
  • the intensity of the reflected light 600 will, of course, depend on the intensity of the light 500 which illuminates the object 140, but it will also (and quite often significantly) depend on the reflectivity of the object 140, which in turn depends, e.g., on the surface structure of the object 140 itself, e.g. whether it is smooth or rough, has a high or low reflectivity, and which colors cover the object 140.
  • the intensity of the reflected light 600 is just indicative of the actual intensity with which an object 140 is illuminated.
  • the reflectivity of an object 140 will result in a variation of the spectral characteristics of the light 600 measured by the light sensor 120.
  • a calibration step is used to reference the (global and/or local) intensity of reflected light 600 as measured by the light sensor 120 (possibly for a plurality of wavelengths or wavelength ranges) to the actual (global and/or local) intensity of light (possibly for a plurality of wavelengths or wavelength ranges) at the object 140.
  • Figure 20 shows an embodiment of the calibration phase/step.
  • the artwork 140 is illuminated at a step 610 via the light 500 generated by at least one light fixture 110.
  • the actual illumination at the object 140 is determined.
  • the global and/or a plurality of local light intensities for at least one wavelength or wavelength range are determined at a step 612 by directly measuring the illumination at the artwork 140 via a sensor positioned at the position of the artwork 140.
  • the global and/or a plurality of local light intensities for at least one wavelength or wavelength range are determined at a step 614 by calculating/estimating these data as a function of other data, e.g. by: calculating the illumination at the artwork 140 as a function of the measured data provided by a light sensor positioned in proximity of the artwork; calculating the illumination at the artwork 140 as a function of the measured data provided by a light sensor configured to measure the light 500 emitted by the light fixture 110 (as described with respect to the first embodiment of a light sensor).
  • the respective data may be inserted manually or automatically by connecting the respective light sensor to the control system 130.
  • the global and/or a plurality of local light intensity values of the reflected light 600 are measured at a step 616 by the sensor 120 for at least one wavelength or wavelength range.
  • the measured intensity of the light fixture 110, the radiation characteristics of the light fixture 110 and the geometrical positioning (e.g. distance and/or angle) between the light fixture 110 and the object 140 may be used to calculate the global and/or local illumination of the object 140.
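Under a simple point-source assumption, the calculation from intensity, distance and incidence angle might look like the sketch below; the cosine/inverse-square model is an illustrative assumption, not a formula stated in the disclosure:

```python
import math

def illuminance_at_object(luminous_intensity, distance_m, incidence_deg):
    """Point-source approximation: E = I * cos(theta) / d^2, where theta is
    the angle of incidence with respect to the surface normal."""
    return luminous_intensity * math.cos(math.radians(incidence_deg)) / distance_m ** 2

# 1000 cd fixture, 2 m away, light hitting the surface at 60 degrees -> 125 lux.
e = illuminance_at_object(1000.0, 2.0, 60.0)
```

A real implementation would additionally fold in the fixture's radiation pattern (the angular intensity distribution) rather than treating the intensity as constant in all directions.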
  • the measured intensity values of the measured reflected illumination may be associated with local intensity values of the calculated illumination of the object 140, possibly also for a plurality of wavelengths or wavelength ranges.
  • a plurality of measurements are performed for different settings of the lighting fixture 110.
  • the setting of the light fixture(s) 110 may be modified at a step 618 and the steps 610-616 may be repeated.
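The calibration loop of steps 610-618 can be sketched as follows; the fixture and sensor stand-ins are hypothetical callables, since the actual hardware interfaces are not specified in the disclosure:

```python
def run_calibration(settings, illuminate, measure_at_artwork, measure_reflected):
    """Collect one dataset item per fixture setting."""
    dataset = []
    for setting in settings:                # step 618: vary the fixture setting
        illuminate(setting)                 # step 610: illuminate the artwork
        at_artwork = measure_at_artwork()   # step 612/614: actual illumination
        reflected = measure_reflected()     # step 616: reflected light 600
        dataset.append({"setting": setting,
                        "illumination": at_artwork,
                        "reflected": reflected})
    return dataset

# Toy stand-ins: the reflected light is 40 % of the illumination at the artwork.
state = {"level": 0.0}
ds = run_calibration(
    settings=[0.25, 0.5, 1.0],
    illuminate=lambda s: state.update(level=s),
    measure_at_artwork=lambda: state["level"] * 100.0,
    measure_reflected=lambda: state["level"] * 40.0,
)
```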
  • the spectral characteristics of the light fixture 110 and the light sensor 120 should be known in order to simplify the control of the light fixture in order to regulate the illumination of the artwork 140, in particular for individual wavelengths/wavelength ranges.
  • the reflectivity of the artwork 140 itself varies the measurement result. Accordingly, in various embodiments a plurality of measurements with different settings for the light fixture 110 are performed (e.g. with respect to the emitted brightness and color).
  • data identifying the respective setting of the light fixture(s) 110; data identifying the measured or calculated illumination of the artwork 140 (e.g. global and/or local intensity values, possibly for a plurality of wavelengths/wavelength ranges); and data identifying the measured reflected light 600 (e.g. global and/or local intensity values, possibly for a plurality of wavelengths/wavelength ranges).
  • the settings used during the calibration phase include one or more of:
  • the maximum allowable/possible intensity (overall, i.e. as a sum of intensities at certain wavelengths, or for a certain wavelength);
  • the setting of the light fixture(s) 110 may also not be stored, i.e. these data are purely optional.
  • the data obtained during the calibration phase may be stored in a dataset, such as a table of a database, thereby creating a mapping between:
  • the above data are used to calculate a specular and/or diffusive reflectance function of the object 140 as a function of the reflected illumination 600 measured via the sensor 120 and the (measured or calculated) actual illumination of the artwork 140.
  • another mathematical function configured to determine the illumination of the artwork 140 as a function of the reflected illumination 600 as measured by the sensor 120 may be determined during the training phase.
  • a machine learning algorithm, such as a neural network, is used to generate such a mathematical function as a function of the dataset.
  • a machine learning algorithm may be useful when local intensity values of illumination (optionally for a plurality of wavelengths) have to be determined from local intensity values of reflected light (optionally for a plurality of wavelengths).
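As a stand-in for such a learned mapping, a per-location least-squares fit illustrates the idea; a real system might instead train a neural network on the calibration dataset, and all sample values below are invented:

```python
def fit_gain(reflected_samples, illumination_samples):
    """Least-squares gain g minimizing sum((illumination - g * reflected)^2)
    for one local area of the artwork."""
    num = sum(r * i for r, i in zip(reflected_samples, illumination_samples))
    den = sum(r * r for r in reflected_samples)
    return num / den

def predict(gain, reflected):
    """Estimate the illumination at the artwork from a new reflected value."""
    return gain * reflected

# Calibration samples for one local area: illumination is 2.5x the reflected value.
g = fit_gain([10.0, 20.0, 40.0], [25.0, 50.0, 100.0])
est = predict(g, 30.0)
```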
  • the dataset, e.g. in the form of a table, may be stored directly in a memory accessible to the light sensor 120 and/or control system 130, such as the sensor database 218.
  • a data processing unit of the light sensor 120 or the control system 130 may receive at a step 630 the measured global and/or local intensity values for at least one wavelength from the light sensor 120 and use the mathematical function or dataset to determine/estimate the illumination of the artwork 140 (e.g. global and/or local intensity values for a single or a plurality of wavelengths or wavelength ranges) as a function of the measured reflected illumination of the artwork 140.
  • the illumination may be estimated via interpolation or extrapolation of the stored data.
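The interpolation/extrapolation over a stored dataset of (measured reflected value, illumination at the artwork) pairs can be sketched as follows; the calibration pairs themselves are invented for illustration:

```python
def estimate_from_dataset(dataset, reflected):
    """Linear interpolation between the two bracketing calibration points;
    outside the calibrated range, the nearest segment is extended, which
    yields linear extrapolation."""
    pts = sorted(dataset)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if reflected <= x1 or (x1, y1) == pts[-1]:
            return y0 + (y1 - y0) * (reflected - x0) / (x1 - x0)

dataset = [(10.0, 100.0), (20.0, 220.0), (30.0, 360.0)]
mid = estimate_from_dataset(dataset, 15.0)  # halfway along the first segment
```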
  • the light sensor 120 may measure the reflected light 600 (an absolute value and/or a relative value thereof), and this measurement may be executed continuously.
  • the estimated illumination of the artwork 140 may be used to verify and/or regulate the illumination of the artwork 140, e.g. in order to regulate the (global or local) illumination to a target illumination (intensity and/or color) and/or to verify whether the (global and/or local) illumination is below a respective maximum value (or maximum values for a plurality of wavelengths/wavelength ranges).
  • the measured changes in the reflected light intensity are used to control the intensity of the light fixture 110.
  • the control system 130 may obtain at a step 632 a reference value and compare at a step 634 the current illumination of the artwork 140 (overall or at certain wavelengths) with the reference value.
  • the reference value may be a previous measurement, or determined as a function of a plurality of previous measurements, such as an average of a given number of last measurements, e.g. the last five measurements.
  • any other target/reference value may be used.
  • the use of the last measurements/target value may be suitable in case the background light of the artwork 140 may vary.
  • the control system 130 may thus vary at a step 636 the settings of the light fixture(s) 110, e.g. vary the brightness thereof. For example, the control system 130 may increase the light intensity of the light fixture 110 when the reflectance decreases compared to the reference value, or may decrease the light intensity of the light fixture 110 when the reflectance increases. Conversely, if no change occurs or if the change is smaller than a certain threshold (output “N” of the verification step 634), the control system 130 may maintain at a step 638 the previous setting of the light fixture(s) 110.
  • the new measurement value may then be stored at a step 640 as reference value (or may be used to calculate the new reference value).
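The regulation loop of steps 630-640 might be sketched as below, using the average of the last five measurements as the reference value; the 5 % threshold and the proportional correction are illustrative assumptions:

```python
from collections import deque

THRESHOLD = 0.05  # assumed: ignore relative changes below 5 %

def regulate(history, measurement, brightness):
    """One iteration of the control loop: steps 632 (reference), 634
    (compare), 636/638 (adapt or keep) and 640 (store measurement)."""
    reference = sum(history) / len(history) if history else measurement
    change = (measurement - reference) / reference
    if abs(change) > THRESHOLD:
        # counteract the change: raise brightness when reflectance drops,
        # lower it when reflectance rises
        brightness = brightness * reference / measurement
    history.append(measurement)  # step 640: update the reference data
    return brightness

history = deque([100.0, 100.0, 100.0, 100.0, 100.0], maxlen=5)
# Reflected intensity drops by 20 %: the fixture brightness is increased.
b = regulate(history, 80.0, 1.0)
```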
  • control system 130 may send the control command to change the light intensity at the light fixture 110 immediately in response to the detection of a change of reflectance, or with a certain delay, or only if the change in reflectance is higher than a certain threshold, which e.g. may be determined by the user depending on the illuminated object.
  • control system 130 may compare at the step 634 the reflected light intensity with a maximum value. If the light intensity exceeds this threshold, the control system may generate a warning signal and/or reduce the power supply of the light source 117.
  • the threshold values (or other sensitivity data) may be stored in a data storage unit, e.g. the artwork database 206, and/or may be determined as a function of an image of the artwork 140.
  • the light sensor 120 may have a variable position. Specifically, as mentioned before, the reflectivity of the artwork 140 may also be a function of the angle of observation. Accordingly, in various embodiments, both during the calibration and the normal operation phase, the position of the light sensor 120 may be varied according to a given profile in order to acquire a sequence of a plurality of (global and/or local) intensity values of reflected light for at least one wavelength or wavelength range for respective positions of the light sensor 120, and the sequence of measurements may be used to determine the mathematical function or may be stored in the dataset.
  • the light sensor 120 may have associated for this purpose an actuator (schematically shown via a line 602 in Figure 19), e.g. controlled by the light sensor 120 or the control system 130, configured to vary the position of the light sensor 120 with respect to the artwork 140, e.g. the distance and/or angle of the light sensor 120 with respect to said artwork 140.
  • various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the artwork for at least one wavelength or wavelength range.
  • the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork for at least one wavelength or wavelength range and measuring via the light sensor the global and/or local light intensity values of the light reflected by the artwork; during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork; and during a normal operation phase, measuring via the light sensor the global and/or the plurality of local light intensity values of the light reflected by the artwork, and estimating via the mathematical function or the dataset the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork.
  • reflected light 600 has specular and diffusive components, which often are not easy to characterize.
  • paintings 140 often have a very uneven surface. This implies that specular as well as diffusive reflectivity of an object 140 may vary significantly depending on the incident direction of the light.
  • the reflectance of an object 140 may also vary over the time.
  • the light 702 reflected by a reference surface 700 is used instead of estimating the illumination of an artwork/object 140 as a function of the light 600 reflected by the artwork 140 itself.
  • a reference luminance target is an object of known spectral reflectance, preferably in both specular and diffusive reflectance.
  • a reference luminance target (RLT) 700 is installed in the proximity of the object 140 to be monitored.
  • the reference target 700 may be installed with respect to the border of the object 140, preferably laterally, at a distance being smaller than 1 m, preferably smaller than 50 cm, preferably between 5 and 30 cm.
  • the RLT 700 may be a tag having printed thereon the name of the respective object 140 or additional information about it.
  • the RLT 700 has a given (preferably known) reflectivity pattern depending on the angle of incidence θ_i and angle of reflection θ_r with respect to the normal of the surface 142 of the object 140.
  • the specular and diffusive reflectivity form the overall reflectivity:
  • R(θ_i, θ_r) = R_s(θ_r) + R_d(θ_i, θ_r)
  • the RLT 700 has preferably a known diffusive and preferably also a known specular reflectivity.
  • the RLT 700 has a surface with minimized specular reflection and maximized diffusive reflection.
  • a surface 700 may be white paint, such as barium sulfate (BaSO4), or polymers with diffusive particles, such as polycarbonate, PMMA or silicone with Al2O3 or TiO2.
  • the RLT 700 is (approximately) a Lambertian emitter on the complete spectral range of interest.
  • the apparent brightness of a Lambertian surface to an observer is (approximately) the same regardless of the observer's angle of view.
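This Lambertian property can be checked numerically: the cosine fall-off of the radiant intensity and the cosine foreshortening of the apparent surface area cancel, so the computed apparent brightness (radiance) is the same at every view angle; the numeric values below are arbitrary:

```python
import math

def apparent_brightness(peak_intensity, area, view_deg):
    theta = math.radians(view_deg)
    intensity = peak_intensity * math.cos(theta)  # Lambert's cosine law
    projected_area = area * math.cos(theta)       # foreshortened apparent area
    return intensity / projected_area             # radiance, angle-independent

b0 = apparent_brightness(10.0, 2.0, 0.0)
b60 = apparent_brightness(10.0, 2.0, 60.0)
```

This is why a (near-)Lambertian RLT allows the light sensor 120 to be mounted in a fixed position, as noted below.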
  • the RLT 700 and the object 140 are illuminated with the same light fixture(s) 110, or by light fixtures 110 or light sources 117 with identical settings, so the values measured for the RLT are indicative of the illumination of the object 140. Furthermore, in various embodiments, shadowing of the RLT 700 and the sensor 120 is avoided, whether caused by the object (or other items) or by visitors.
  • the light sensor 120 described with respect to the second embodiment of light sensor is now used to measure a global and/or a plurality of local light intensity values of the light 702 reflected by the reference luminance target 700 for at least one wavelength or wavelength range.
  • the light sensor 120 may be a 2D camera or a photodiode.
  • a calibration, a training and a normal operation phase is thus used to determine the global and/or plurality of local light intensities at the artwork 140 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the reference luminance target 700.
  • the light sensor 120 may be installed in a fixed position, because the measured global and/or plurality of local light intensity values are the same regardless of the incident angles of the light 500 from the light fixture(s) 110 and the observation angle (indicated as θ_r-c in Figure 23).
  • the light sensor 120 may also have a variable position, i.e. the position of the light sensor 120 may be varied according to a given profile in order to acquire a sequence of a plurality of (global and/or local) intensity values of reflected light for at least one wavelength or wavelength range for respective positions of the light sensor 120, and the sequence of measurements may be used to determine the mathematical function or may be stored in the dataset.
  • the light sensor 120 may have associated for this purpose an actuator (schematically shown via a line 602 in Figure 23), e.g. controlled by the light sensor 120 or the control system 130, configured to vary the position of the light sensor 120 with respect to the artwork 140, e.g. the distance and/or angle θ_r-c of the light sensor 120 with respect to said artwork 140.
  • the artwork 140 and the RLT 700 are illuminated at a step 710 (essentially corresponding to the step 610) by at least one light fixture 110.
  • the global and/or plurality of local light intensities at the artwork 140 and/or the RLT 700 are obtained at a step 712 (essentially corresponding to the step 612 or 614), e.g. by directly measuring the illumination of the object 140/RLT 700 or calculating/estimating the illumination of the object 140/RLT 700.
  • the same methods as described with respect to the steps 612 and 614 may be used.
  • the reflected light intensity is measured at a step 714 (essentially corresponding to the step 616) via the light sensor 120.
  • the light sensor 120 may provide a global and/or a plurality of local light intensity values for a single or a plurality of wavelengths or wavelength ranges.
  • the settings of the light fixture(s) 110 may be varied at a step 716 (essentially corresponding to the step 618) and the steps 710-714 may be repeated. Accordingly, the calibration phase is used to acquire a dataset, wherein each item of the dataset comprises: a global and/or a plurality of local light intensities at the artwork 140 and/or the RLT 700 (preferably for a plurality of wavelengths or wavelength ranges); and a measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700 (preferably for a plurality of wavelengths or wavelength ranges).
  • a mathematical function or dataset is determined, which permits to: directly calculate/estimate (via the mathematical function or the dataset, e.g. via interpolation or extrapolation) the global and/or the plurality of local light intensities at the artwork 140 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700; or calculate/estimate (via the mathematical function or the dataset, e.g. via interpolation or extrapolation) the global and/or the plurality of local light intensities at the RLT 700 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700, and then calculate (more or less in line with the first embodiment of a light sensor) the global and/or the plurality of local light intensities at the artwork 140 as a function of: o the calculated/estimated global and/or plurality of local light intensities at the RLT 700, o data identifying the geometrical position of the RLT 700, the artwork 140 and the light fixture 110, and o optionally the radiation pattern of the light fixture(s) 110.
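The second, two-step variant might be sketched as follows; the known diffusive reflectance of the RLT and the inverse-square geometry scaling are both illustrative assumptions standing in for the unspecified geometric model:

```python
def illumination_at_rlt(measured_reflected, rlt_reflectance):
    """Step 1: recover the illumination at the RLT 700 from the measured
    reflected value, using its known diffusive reflectance."""
    return measured_reflected / rlt_reflectance

def illumination_at_artwork(rlt_illumination, dist_fixture_rlt, dist_fixture_artwork):
    """Step 2: scale to the artwork position via a point-source
    (inverse-square) assumption about the fixture geometry."""
    return rlt_illumination * (dist_fixture_rlt / dist_fixture_artwork) ** 2

# RLT with 90 % diffusive reflectance, mounted at the same distance from the
# fixture as the artwork, so both receive the same illumination.
e_rlt = illumination_at_rlt(measured_reflected=45.0, rlt_reflectance=0.9)
e_art = illumination_at_artwork(e_rlt, dist_fixture_rlt=2.0, dist_fixture_artwork=2.0)
```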
  • the function used to calculate the illumination of the artwork 140 as a function of the (estimated) illumination of the RLT 700 may also be determined as a function of the dataset, e.g. by training a machine learning algorithm, such as an artificial neural network.
  • the calibration phase takes also into account the optical transfer function of the light sensor 120, such as the wavelength dependence of the camera, e.g. for the lens and sensor elements, the angle dependence of the camera, e.g. for the lens, the sensor elements, or the iris, if the camera has a strong field distortion or field curvature, etc.
  • a camera 120 will likely measure both light 702 reflected by the RLT 700 and light 600 reflected by the artwork 140. For this reason, it is convenient to perform a plurality of measurements for different settings of the light fixture(s) 110 and/or to vary the position of the light sensor 120.
  • a data processing unit of the lighting system 100 may thus receive the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700 and determine via the mathematical function or dataset the global and/or the plurality of local light intensities at the artwork 140.
  • the data processing unit may first estimate the illumination of the RLT 700 and then calculate the illumination of the artwork 140 as a function of the geometrical position of the RLT 700, the artwork 140 and the light fixture 110.
  • the intensity of the reflected light may thus be measured continuously or periodically, e.g. every 5 minutes or every hour.
  • the measurement may be performed for the overall illumination of the artwork 140/RLT 700, i.e. the illumination provided by the light fixture(s) 110 and background illumination, and/or the measurement may be performed only for the background illumination (i.e. with the light fixture(s) 110 switched off).
  • the background illumination may be measured when the exposition area 160 is closed or when no visitors in the exposition area 160 are detected.
  • the opening hours of the exposition area 160 may be stored, e.g. in the exposition area database 204.
  • a RLT 700 and a light sensor 120 may also be used to measure only background light, e.g. by placing a RLT 700 in a position which is not illuminated by the light fixture(s) 110.
  • various operations may be performed as a function of the global and/or plurality of local light intensities at the artwork (possibly for a plurality of wavelengths or wavelength ranges).
  • the global and/or local intensities may be compared with one or more thresholds, and the control system 130 may adapt the brightness of the light fixture accordingly.
  • the exposition area may comprise means (e.g. an actuator) configured to vary the background illumination, e.g. by using roller shutters to reduce the background light from outside of a window 164.
  • the control system 130 may also adapt the brightness of the ambient light as a function of the global and/or local intensities at the artwork 140. Generally, a similar operation may also be performed in any other lighting system 100 described herein.
  • the lighting system 100 may also comprise a plurality of reference luminance targets 700 placed around the same object 140, allowing interpolation of the light intensity at the object 140. For example, this may be useful when the radiation pattern of the light fixture(s) is unknown.
  • the RLT 700 may have a non-null specular reflectivity component, i.e. be a diffusive reflector with a controlled specular component in addition to the Lambertian one, in order to monitor the source from a very specific direction;
  • the RLT 700 may also comprise surfaces of several materials with different features.
  • the RLT could also have a non-flat spectral reflectivity, which could be used to enhance or reduce the reflectivity at a specific wavelength or in a specific wavelength range.
  • various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a reference luminance target is installed in proximity of the artwork, whereby the reference luminance target is illuminated with the light emitted by the one or more light fixtures, and wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the reference luminance target for at least one wavelength or wavelength range.
  • the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork and/or at the reference luminance target for at least one wavelength or wavelength range, and measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target; during a training phase, determining a mathematical function and/or a dataset adapted to estimate the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity values of the light reflected by the reference luminance target; and during a normal operation phase, measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target and estimating the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity of the light reflected by the reference luminance target.
  • paintings and other pieces of artwork are illuminated by natural and/or artificial light sources in order to present the artwork, e.g. according to a given requested illumination having a good color rendition for a viewer.
  • an artwork 140 may be illuminated with various kinds of light sources at the same time under different irradiating angles and beam diameters.
  • a light fixture may encompass preset or adjustable light sources 117 (intensity, color, orientation of irradiation, etc.), a variety of optical elements 115 (e.g. lenses, diffusers, color filters), sensors 120 (such as temperature, humidity, light intensity and color sensors, as well as sensors for people tracking, e.g. using IR radiation (emission and sensing)), lighting control units (drivers and controllers for light sources and light/color sensors, as well as for actuators for movement and orientation of light sources and optical elements), and the like.
  • a light fixture 110 may comprise a control unit 113 and a driver 116 for a light source 117, such as one or more LEDs.
  • the control unit 113 and the driver 116 may be configured to adjust the light spectrum of the light emitted by the light sources 117 to a certain value.
  • the light fixture 110 may be configured (via a suitable selection of light sources 117, and configuration of driver 116 and data processing unit 113) to adjust the color temperature of the combined light emitted by the light sources 117, so that the color point lies preferably on or near the Planck Curve (CIE color diagram).
  • the light fixture 110 may be configured to adjust and keep the resulting color temperature on and along the Planck Curve, for example, in the range between 1800 K and 6000 K with a deviation from the Planck Curve of not more than 2 McAdam Ellipses.
  • the light fixture may as well be configured to adjust color coordinates within a color space area that is not limited to a Planck Curve.
  • the light fixtures 110 are configured to provide a tunable light output, in particular a tunable white light output.
  • a color regulation function may be implemented by using light sources 117 emitting light with different colors and regulating/dimming the relative brightness of the light sources 117.
  • Such a dimming operation may be performed by using a PWM or other dimming methods.
  • the color regulation may be performed via a feedback/closed loop control, e.g. by using a sensor providing a measure of the color temperature of the light emitted by the light fixture 110, or via a feed-forward/open loop control, e.g. by using a look-up table having stored the dimming levels for the light sources 117 for a given requested color temperature.
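The feed-forward (open-loop) variant with a look-up table can be sketched as below; the table entries and the two-channel warm/cool mix are invented for illustration, not calibrated fixture data:

```python
# Look-up table: dimming levels of a warm-white and a cool-white LED channel
# for given color temperatures; intermediate requests are linearly interpolated.
CCT_TABLE = [   # (CCT in K, warm-white duty, cool-white duty)
    (1800, 1.00, 0.00),
    (3000, 0.70, 0.30),
    (6000, 0.00, 1.00),
]

def dimming_levels(cct):
    for (c0, w0, k0), (c1, w1, k1) in zip(CCT_TABLE, CCT_TABLE[1:]):
        if c0 <= cct <= c1:
            t = (cct - c0) / (c1 - c0)
            return (w0 + t * (w1 - w0), k0 + t * (k1 - k0))
    raise ValueError("requested CCT outside table range")

warm, cool = dimming_levels(2400)  # halfway between the 1800 K and 3000 K entries
```

The returned duties would then drive the PWM dimming of the respective light sources 117; the closed-loop variant would instead correct these values from a color sensor reading.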
  • paintings and other artworks 140 may also undergo ageing effects.
  • ageing effects of color-pigments of artworks may include darkening or a color shift, e.g. due to temperature changes and/or increased levels of humidity, and/or color bleaching and/or textile (canvas) damage, e.g. due to natural and artificial light with a high blue or even UV content.
  • a pigment is a material that changes the color of reflected or transmitted light as the result of wavelength-selective absorption. Most materials selectively absorb certain wavelengths of light. Usually the pigments are selected to have a high tinting strength relative to the materials they color. Usually, once the pigments have been applied, they should be stable in solid form at ambient temperatures.
  • Pigments that are not permanent are called fugitive. Fugitive pigments fade over time, or with exposure to light, while some eventually blacken. Pigments are used for coloring paint, ink, plastic, fabric, cosmetics, food, and other materials. Most pigments used in manufacturing and the visual arts are dry colorants, usually ground into a fine powder. For use in paint, this powder is added to a binder (or vehicle), a relatively neutral or colorless material that suspends the pigment and gives the paint its adhesion. A distinction is usually made between a pigment, which is insoluble in its vehicle (resulting in a suspension), and a dye, which either is itself a liquid or is soluble in its vehicle (resulting in a solution).
  • a colorant can act as either a pigment or a dye depending on the vehicle involved.
  • a pigment can be manufactured from a dye by precipitating a soluble dye with a metallic salt. The resulting pigment is called a lake pigment.
  • the term biological pigment is used for all colored substances independent of their solubility.
  • the lighting system 100 is also configured to determine and monitor possible ageing effects of the artwork(s) 140 and to adapt the settings of the light fixture(s) to provide optimized lighting conditions.
  • Figure 38 shows an embodiment of a lighting system 100 configured to monitor ageing effects of artworks 140.
  • a piece of artwork 140 may be illuminated by natural and/or artificial light as provided by lighting fixtures 110.
  • lighting fixtures 110 may employ a variety of light sources 117 and optical components 115 like lenses and filters.
  • the combined light 500 emitted by the light fixture 110 may be characterized, e.g., by spectral distribution, intensity, beam spread, and orientation (incidence of light).
  • the lighting fixtures 110 may differ from each other, and so may their incidence of lighting. This means that the lighting conditions may vary from one illuminated artwork 140 to another, making it difficult to compare the effect they have on the degradation of the illuminated painting.
  • the total time of illumination, i.e. the integral amount of light per day or per period, also plays a role, adding further complexity to the matter.
  • in order to determine the ageing of an artwork, the artwork 140 is illuminated with a given (reference) lighting condition and a graphical image is obtained via a suitable camera 120i providing pixelized image data, where each (usually color-filtered) pixel or subpixel of the image provides information about intensity and color coordinates of the optically related sensed area of the painting.
  • the pixels are optically related to small areas (cells) of the painting. So, each pixel data is related to a reflected cell spectrum.
  • the reflected light 600 of each area (cell) of the painting 140 is measured with the camera 120i, such as a CMOS or CCD camera, using (standardized) color filters in front of the sensor chips, for example RGB in a Bayer setting, or other color filter variants like RGEB (red, green, emerald, blue) or variants that employ two kinds of green filters, and the like.
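The per-cell measurement described above might be sketched as follows, assuming an already demosaiced RGB image; the grid size and function name are illustrative assumptions:

```python
# Sketch: average the RGB values of a demosaiced camera image over a grid of
# cells, so that each cell yields one color measurement of the optically
# related area of the painting.

def cell_colors(image, cell_h, cell_w):
    """image: 2D list of (R, G, B) tuples; returns 2D list of mean cell colors."""
    rows, cols = len(image), len(image[0])
    out = []
    for y0 in range(0, rows, cell_h):
        row = []
        for x0 in range(0, cols, cell_w):
            # collect every pixel falling inside this cell
            pixels = [image[y][x]
                      for y in range(y0, min(y0 + cell_h, rows))
                      for x in range(x0, min(x0 + cell_w, cols))]
            n = len(pixels)
            # per-channel mean over the cell
            row.append(tuple(sum(p[c] for p in pixels) / n for c in range(3)))
        out.append(row)
    return out
```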
  • see Figure 19 for a more detailed description of how to monitor an artwork via a camera.
  • image taking may be done under various lighting situations and/or under various positions and angles with respect to the illuminated artwork 140.
  • the image measurement is functionally related to the artwork illumination (ambient and artificial) and the reflectivity features of the artwork 140.
  • Such measurements will result in a large number of (digitized) camera pixel data of each painting 140.
  • the measurements depend also on the image measurement characteristics, i.e. the properties of the camera 120i.
  • a CCD/CMOS camera needs to have filter segments placed in front of the sensor chips, for example RGB filters in a Bayer configuration/setting, in order to allow for color perception and respective measurement.
  • the filter segments, CCD/CMOS chips and signal processing will show some kind of variation that needs to be taken into account.
  • the optical characteristics of the camera may be expressed via an Image Transfer Function or Optical Transfer Function.
  • the measurements may be represented by a (possibly multi-dimensional) Digital Data Set taking all these conditions into account.
  • the acquired image data set is stored in a database.
  • the evolution of given characteristics of the data set may be analyzed, e.g. via image analysis.
  • the stored data sets represent historical data of the evolution (ageing) of the color data of the artwork 140.
  • the historical data of a plurality of artworks 140 are stored to a remote database, e.g. in a cloud.
  • the lighting system may comprise: a remote control system 130R connected to a Wide Area Network WAN, e.g. the Internet, wherein the remote control system 130R comprises a remote data processing unit 133R and a remote data storage 132R; and a local control system 130L (i.e. located in the exposition area 160 or in the vicinity of the exposition area 160) configured to manage the operation of the light fixture(s) 110 in the exposition area 160.
  • the local control system 130L is purely optional, because the light fixture(s) 110 may also be connected directly to the Wide Area Network WAN.
  • the historical data of a plurality of artworks 140 may be stored to the remote, e.g. cloud-based, data storage 132R.
  • the data processing unit 133R is configured to compare the historical data of a given artwork 140 with the historical data of other artworks 140. Specifically, in various embodiments, due to the fact that the artworks 140 are different, given features may be extracted from the historical data sets. Generally, as an alternative to or in addition to storing the image data sets, the image data may also be pre-processed (e.g. in order to extract the features) and the processed image data may be stored to the database.
  • the pixel-data may be grouped so that they represent color-coded information of a contiguous larger area of a painting, i.e. a larger cell area.
  • the pixels may be grouped when they have a similar color, e.g. a color which is between a given upper and lower threshold with respect to the respective reference color.
  • this permits determining images having similar colors, irrespective of whether the positions and dimensions of the respective areas are similar.
  • the dimension of the respective areas of pixels may be stored, which, e.g., permits selecting artworks having a similar overall color usage.
  • such data may be relevant in order to determine similar artworks.
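The grouping of pixels into reference-color areas with upper/lower thresholds might be sketched as follows; the reference colors, tolerance value and function name are hypothetical:

```python
# Sketch: assign each pixel to a reference color when every channel lies
# within a lower/upper threshold of that reference; the relative area of each
# group can then be stored as a compact color-usage feature of the artwork.

def group_by_reference(pixels, references, tol=16):
    """pixels: list of (R, G, B); references: dict name -> (R, G, B).

    Returns the fraction of pixels falling into each reference-color group.
    """
    areas = {name: 0 for name in references}
    for p in pixels:
        for name, ref in references.items():
            # pixel matches when all three channels are within the tolerance
            if all(ref[c] - tol <= p[c] <= ref[c] + tol for c in range(3)):
                areas[name] += 1
                break
    total = len(pixels)
    return {name: count / total for name, count in areas.items()}
```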
  • one or more of the following characteristics may be stored for each artwork: the artist, the epoch, the type of artwork (old master, modern art, etc.), the type of pigments used (e.g. for each reference color) or even the ingredients used for the pigments.
  • such data may already be stored in the artwork database 204.
  • the actual and historical conditions of the artwork 140 may be taken into account, such as actual and/or historical data of the environment in which the artwork 140 has been exposed (which may also relate to a sequence of different exposition areas in case the artwork 140 is relocated), such as one or more of the following data, which may be provided by one or more sensors 1202 installed in the exposition area 160:
  • the illumination (possibly for a plurality of wavelengths or wavelength ranges) of the artwork 140 which may include instantaneous and/or cumulative values; possibly correlated with circadian lighting conditions, which may be measured or estimated, e.g. as a function of the position of the exposition area 160 (see also the respective description of light sensors).
  • the following data are stored to the central database: data identifying the ageing of the pigments of the artworks 140 (as determined as a function of the images of the artwork 140); data identifying the characteristics of the artworks 140, which e.g. permits determining similar artworks 140; and data identifying the historical data of the exposition of the artworks 140, in particular with respect to the illumination of the artwork 140.
  • the camera data (e.g. type of sensors, color filters, optical components, etc.) may be stored for each artwork 140 in the central database.
  • the remote processing system 130R may receive the following data, which may e.g. be already stored in one or more of the databases 200: data identifying the artwork 140, preferably including pigment identification data, wherein the pigment identification data preferably comprise a matrix containing pigment identification data for bi-dimensional positions of the artwork 140; data identifying a requested illumination for the artwork 140, such as respective light intensity values for a plurality of wavelengths or wavelength ranges, or a color temperature; optionally data identifying the characteristics of the camera 120i; optionally data identifying the light fixtures 110.
  • the remote processing system 130R may receive, from the local processing systems 130 (or directly from the sensors 120i and 1202), the following measured data:
  • the image data or the processed image data provided by the camera 120i which are indicative of the ageing of the pigments of the artwork 140, such as the values of the reflectivity of given areas (associated with given pigments) of the artwork 140;
  • the other measured data provided by the sensors 1202, such as the light intensity values for a plurality of wavelengths or wavelength ranges, and optionally one or more of: temperature, humidity, oxygen level of the air, location of the artwork.
  • the above data thus permit defining the (integral) environment (lighting, temperature and so on) of the artworks 140.
  • the data stored to the central database are processed via a machine learning method, in particular a Deep Learning / Artificial Intelligence (AI) program.
  • using such analysis based on supervised or non-supervised AI programs permits determining functional relationships between a plurality of parameters that are derived from the big data set (BDS).
  • the data may be processed in order to determine the best lighting conditions for each piece of artwork. Best lighting conditions may mean less damaging lighting conditions, or less damaging lighting conditions with a still acceptable or user-defined tolerance range of color rendition of an artwork.
  • Figure 39 shows in this respect an embodiment of the operation of the remote processing system 130R.
  • the database 132 R has stored one or more datasets for each of a plurality of already monitored artworks 140.
  • the processing system 130 R may receive one or more datasets for each of a plurality of artworks 140.
  • each dataset comprises: data identifying a list of pigments of the respective artwork 140; data identifying the illumination of each pigment of the list of pigments during a given time period; data identifying the ageing of each pigment of the list of pigments during the given time period.
  • the data identifying a list of pigments of an artwork 140 may be a bi-dimensional matrix having stored data identifying the pigment in a given horizontal and vertical position of the artwork.
  • the data may identify at least one of: a pigment color, a pigment type, a pigment material, etc.
  • a list of pigments of an artwork 140 may be inserted manually or determined by analyzing an image of the artwork 140.
  • the color of given areas of the artwork 140 may be determined via image processing, and the color of a given area may then be associated with a given pigment type.
  • the pigment type for a given color may be entered manually or determined automatically, e.g. based on a table having stored pigment types for given artwork types, such as Old Master, Modern Art, etc., or an artist name.
  • since pigment type data are per se static, these data may be stored once for each artwork 140, and the data identifying the list of pigments of the artwork 140 may simply identify the respective artwork 140, e.g. via a respective univocal artwork code.
  • the data identifying the illumination of each pigment of the list of pigments during a given time period may be determined by determining the global illumination of the artwork or determining the local illumination of given areas of the artwork. Possible solutions for determining a global or local illumination of an artwork 140 have already been described for the previous light sensors, such as based on the power supply parameters of the light sources 117, the light emitted by the light sources 117, the light received at the artwork 140, the light reflected by the artwork 140 or the light reflected by a reference surface.
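The illumination data per time period described above could be accumulated from periodic light-sensor samples; the following is a minimal sketch (band names, units and the trapezoidal summation are assumptions, not the patent's prescribed method):

```python
# Sketch: accumulate the illumination dose per wavelength band from periodic
# light-sensor samples via a trapezoidal sum over time; the cumulative dose,
# not only the instantaneous intensity, is what drives pigment ageing.

def cumulative_dose(samples):
    """samples: list of (timestamp_s, {band: W/m^2}); returns {band: J/m^2}."""
    dose = {}
    for (t0, s0), (t1, s1) in zip(samples, samples[1:]):
        dt = t1 - t0
        for band in s0:
            # trapezoidal rule between consecutive samples
            dose[band] = dose.get(band, 0.0) + 0.5 * (s0[band] + s1[band]) * dt
    return dose
```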
  • the data identifying the ageing of the pigments may be determined by acquiring images of the artwork 140 and by processing the image via an image analysis, e.g. by determining the variation of the pixel data of two images taken at the beginning and the end of the time period.
  • other methods for monitoring the variation of given properties of the pigments may be used to determine/estimate the ageing of the pigments.
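The image-based ageing estimate (variation of the pixel data between two images taken at the beginning and the end of the time period) might look like this minimal sketch, assuming the two images are registered, i.e. pixel-aligned; the scalar index and function name are illustrative:

```python
# Sketch: a simple ageing index as the mean absolute per-channel change
# between two registered images of the same artwork, taken at the beginning
# and the end of the monitored period.

def ageing_index(img_before, img_after):
    """Both images: flat lists of (R, G, B); returns the mean channel deviation."""
    assert len(img_before) == len(img_after)
    total = sum(abs(a[c] - b[c])
                for a, b in zip(img_after, img_before)
                for c in range(3))
    return total / (3 * len(img_before))
```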
  • the processing system 130R receives at a step 1002 data identifying the list of pigments of the artwork 140 to be illuminated.
  • the data may already be stored to a database, such as the artwork database 204 (which may also be stored in the remote database 132R) and the data may comprise a univocal artwork code identifying the artwork 140 to be illuminated.
  • the processing system 130R may also receive the data identifying the illumination of each pigment and the respective ageing for the artwork 140 to be illuminated and store these data to the database 132R. Accordingly, the database 132R may add to the historical database also the data of the artworks to be illuminated.
  • the processing system 130R determines a maximum illumination threshold for the illumination of the artwork 140 as a function of the list of pigments of the artwork and the datasets stored in the database 132R.
  • the processing system 130R selects at a step 1006 a pigment of the list and uses the historical data of the artworks 140 stored to the data storage 132R in order to determine the best light spectrum that minimizes the ageing of the pigment, i.e. the maximum illumination for the respective pigment.
  • the method may perform the following operations at the step 1006: processing the datasets of the artworks 140 in order to determine the datasets having similar pigments (as mentioned before, for this purpose the pigments may be classified according to color, epoch, material, etc.); and processing the data of the selected datasets in order to determine correlations between the selected pigment, the illumination of the pigments (e.g. the respective spectrum of the light illuminating it) and the changes occurring to the pigments over a certain amount of time.
  • a feature selection is performed in order to select a set of most relevant features, which are linked to the ageing of the selected pigment.
  • the set of features includes at least data identifying the spectrum of the light used to illuminate the selected pigment.
  • these features may include light intensity values for a plurality of wavelengths or wavelength ranges.
  • the features may also include one or more other data identifying the historical data of the exposition of the artworks 140, such as the temperature and humidity data.
  • the most influencing features may be selected, in particular a set of wavelengths or wavelength ranges is determined which is correlated strongly to the ageing of a given pigment.
  • ageing is also influenced strongly by temperature and humidity, i.e. the selected features may include the temperature and/or humidity in the exposition area or directly of the artwork 140.
  • this step is purely optional, because the method may also use a fixed set of features, such as a pre-selected set of features, e.g. the set of wavelengths or wavelength ranges may include a set of predetermined wavelengths or wavelength ranges, possibly ranging from IR to UV.
  • the data of the respective features may be processed in order to determine maximum threshold values for the illumination of the selected pigment with these wavelengths or wavelength ranges, which do not generate (a significant) ageing of the artwork 140.
  • the data are processed via a machine learning method, which receives at input the data of the selected features and provides at output the ageing index.
  • the machine learning method generates an ageing model of the selected pigment based on the historical data stored to the central database.
  • the machine learning method may be an Artificial Neural Network (ANN) or a Support Vector Machine.
  • the method may simulate the ageing of the selected pigment, and select a combination of input values, which does not result in a significant ageing of the selected pigment.
  • the method may vary the input of the light intensity values for the selected set of wavelengths or wavelength ranges, and select a combination of values, wherein the ageing index at the output of the machine learning method is below a given maximum threshold value.
  • the method may also be used to adapt the illumination of the artwork to different ambient conditions, e.g. a different temperature or humidity in the exposition area 160, because the actually measured temperature and humidity values may be used as input for the ageing model.
  • the relative values of the intensity values are linked.
  • it is sufficient to vary (e.g. increase) a single intensity value, calculate the other intensity values, monitor the respective ageing value at the output of the machine learning method, and select the set with the highest intensity values for which the ageing value is still below a given maximum value.
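The search described above can be sketched as follows; the ageing model here is a stand-in linear function (in the described system this would be the trained machine learning model, e.g. an ANN), and all names, coefficients and thresholds are illustrative assumptions:

```python
# Sketch of the search: the relative spectrum is kept fixed, a single scale
# factor is increased step by step, and the highest scale whose predicted
# ageing index stays below the maximum threshold is selected.

def max_safe_spectrum(relative_spectrum, ageing_model, max_ageing,
                      step=0.05, limit=10.0):
    """relative_spectrum: {band: relative intensity}; returns absolute values."""
    best = None
    scale = step
    while scale <= limit:
        candidate = {b: v * scale for b, v in relative_spectrum.items()}
        if ageing_model(candidate) >= max_ageing:
            break  # first scale that ages too fast; keep the previous one
        best = candidate
        scale += step
    return best

# Illustrative stand-in model: blue light ages the pigment faster than red.
model = lambda s: 0.02 * s["red"] + 0.10 * s["blue"]
```

For example, `max_safe_spectrum({"red": 1.0, "blue": 0.5}, model, max_ageing=0.35)` scales both bands together and stops just below the scale at which the predicted ageing index reaches the threshold.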
  • the same operation may then be performed for the other pigments of the artwork 140, which is schematically shown via a verification step 1008. Specifically, in case not all pigments/areas of pigments have been processed (output “N” of the verification step 1008), the method selects a next pigment/area of pigments and returns to the step 1006. Conversely, in case all pigments/areas of pigments have been processed (output “Y” of the verification step 1008), the method proceeds to a step 1010.
  • the processing system 130R determines the best light spectrum that minimizes the overall changes/ageing of all the pigments of an artwork 140. For example, for this purpose, the processing system 130R may select the minimum values of the (maximum) values for the various pigments as selected at the step 1006.
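Selecting the minimum of the per-pigment maximum values can be sketched as a band-wise minimum; the data layout and function name are illustrative:

```python
# Sketch: the overall spectrum for the artwork is the band-wise minimum of
# the per-pigment maximum intensity values selected at the step 1006, so that
# no pigment is illuminated above its own safe limit.

def artwork_spectrum(per_pigment_maxima):
    """per_pigment_maxima: list of {band: max intensity}; band-wise minimum."""
    bands = set().union(*per_pigment_maxima)
    return {b: min(p.get(b, float("inf")) for p in per_pigment_maxima)
            for b in bands}
```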
  • the method may transmit at a step 1012 the selected maximum values to the control system 130L or directly the light fixture(s) 110 in order to adapt the light emitted by the light fixture(s) 110.
  • the lighting system 100, e.g. the control system 130L, may change the setting of the light fixture(s) 110 either in a feed-forward configuration or in a feedback configuration by monitoring the illumination of the artwork 140 via the light sensor 1202.
  • the above data are stored to a central, e.g. cloud-based, database 132R.
  • the control system 130 or directly the light fixtures 110 may be configured to communicate with the central database and the AI program in order to automatically receive the respective lighting settings.
  • the described methods, the application of AI programs and the subsequent optimization of local lighting are not just limited to paintings but can be applied to all pieces of artwork.
  • the pigment data, the (requested and historical) illumination data and the aging data may indeed relate to areas of the surface of the artwork 140.
  • the respective data may be mapped on the surface of a respective three-dimensional model of the artwork 140.
  • the 3D model of the surface of the artwork 140 may be obtained, e.g. via a 3D scanner, or may be calculated based on a depth map determined as a function of the images acquired by the camera 120i.
  • such depth maps may be determined based on images acquired via a stereo camera (which may also be implemented with the same camera 120i being placed in a plurality of different positions) and/or by illuminating the object with a light pattern.
  • various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with at least one light fixture. Specifically, in various embodiments, the method comprises:
  • each dataset comprising: o data identifying a list of pigments of the respective artwork; o data identifying the illumination of each pigment of the list of pigments during a given time period; o data identifying the ageing of each pigment of the list of pigments during the given time period;
  • Example 1.1 An illumination system comprising a light fixture 110, an optional sensor 120 and a control system 130 to illuminate an object 140 or a person 150.
  • Example 1.2 The illumination system of Example 1.1 providing the illumination locally to highlight certain aspects of an object 140 being less than an entirety of the object 140.
  • Example 1.3 The illumination system of Example 1.1 or 1.2 containing a light fixture 110 which provides pixelized light, the pixelized light produced by at least one of an LED matrix; a liquid crystal; and a Digital Micromirror Device (DMD).
  • Example 1.4 The illumination system of any of the Examples 1.1 - 1.3 combining its illumination with ambient illumination to create a homogenous illumination of an object
  • Example 1.5 The illumination system of any of the Examples 1.1 - 1.4 comprising a user interface 134 which allows controlling the illumination system 110, such as by gestures or third-party downloadable software applications.
  • Example 1.6 The illumination system of any of the Examples 1.1 - 1.5 wherein the sensor 120 is configured to detect degradation of an object 140 (e.g. its color) and adjust the light quality (intensity, spectrum) to reduce the damaging of the object 140.
  • Example 1.7 The illumination system of any of the Examples 1.1 - 1.6 comprising a sensor system which measures the light parameters of the illumination either directly or indirectly.
  • Example 1.8 The illumination system of any of the Examples 1.1 - 1.7 comprising sensors 120 measuring environmental parameters such as one or more of a temperature, a humidity, and a chemical compound.
  • Example 1.9 The illumination system of any of the Examples 1.1 - 1.8 comprising a data processing unit 113, 123 and/or 133 which can process the data using artificial intelligence and/or machine learning.
  • Example 1.10 The illumination system of any of the Examples 1.1 - 1.9 comprising a data storage device 112 and/or 132 which stores different lightings for different settings.
  • Example 1.11 The illumination system of any of the Examples 1.1 - 1.10 comprising a data storage device 112 and/or 122 to collect data relating to an object over its lifetime.
  • Example 2.1 A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, the method comprising the steps of: obtaining data identifying requested spectral characteristics (208), obtaining data identifying a viewer’s eye characteristics (210),
  • Example 2.2 The method according to Example 2.1, wherein the characteristics of the light emitted by the one or more light fixtures (110) comprise one or more of: light intensity, frequency/color, polarization, direction and/or beam spread.
  • Example 2.3 The method according to Example 2.1 or Example 2.2, wherein the requested spectral characteristics comprise a requested color and a requested brightness level.
  • Example 2.4 The method according to Example 2.3, wherein the requested color is specified via a color temperature or a color coordinate, such as in a CIE 1931 color space.
  • Example 2.5 The method according to Example 2.3 or Example 2.4, wherein the requested spectral characteristics comprise a selection, assortment or sequence of a plurality of requested colors and respective requested brightness levels.
  • Example 2.6 The method according to Example 2.5, wherein the sequence of a plurality of requested colors and respective requested brightness levels is stored in a light scenario matrix comprising data for a plurality of viewing locations.
  • Example 2.7 The method according to Example 2.6, comprising: obtaining data identifying a viewer’s position in the exposition area, and
  • Example 2.8 The method according to any of Examples 2.3 to 2.7, wherein the generating one or more control commands comprises:
  • the one or more control commands in order to vary the color of the light emitted by the one or more light fixtures as a function of the data identifying requested spectral characteristics and the data identifying a viewer’s eye characteristics.
  • Example 2.9 The method according to any of Examples 2.3 to 2.8, wherein the data identifying a viewer’s eye characteristics identify the sensitivity of the viewer’s eyes for a plurality of colors.
  • Example 2.10 The method according to Example 2.9, wherein the generating one or more control commands comprises:
  • Example 2.11 The method according to Example 2.10, wherein the generating data identifying modified spectral characteristics comprises: - increasing the intensity of one or more colors as a function of the data identifying the sensitivity of the viewer’s eyes for a plurality of colors.
  • Example 2.12 The method according to Example 2.10 or Example 2.11, wherein the generating data identifying modified spectral characteristics comprising: changing or altering one or more colors as a function of the data identifying the sensitivity of the viewer’s eyes for a plurality of colors.
  • Example 2.13 The method according to Example 2.12, wherein the one or more colors are shifted along the Planck-curve.
  • Example 2.14 The method according to Example 2.12 or Example 2.13, wherein the one or more colors are shifted as a function of data identifying one or more MacAdam ellipses.
  • Example 2.15 The method according to any of Examples 2.11 to 2.14, wherein the one or more colors are selected from: a first color in the red color spectrum, i.e. between 625 and 740 nm, a second color in the blue color spectrum, i.e. between 435 and 500 nm, and a third color in the green color spectrum, i.e. between 520 and 565 nm.
  • Example 2.16 The method according to any of Examples 2.3 to 2.15, comprising: determining spectral characteristics of a natural and/or artificial ambient light in the exposition area (160), and
  • the one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures (110) also as a function of the determined spectral characteristics of a natural and/or artificial ambient light in the exposition area (160).
  • Example 2.17 The method according to Example 2.16, wherein the spectral characteristics of a natural and/or artificial ambient light in the exposition area are determined via a light sensor (120) installed in the exposition area (160).
  • Example 2.18 The method according to Example 2.10 and Example 2.16, wherein the generating data identifying modified spectral characteristics comprises:
  • Example 2.19 The method according to any of the previous Examples 2.1 to 2.18, wherein the obtaining data identifying requested spectral characteristics comprises:
  • Example 2.20 The method according to any of the previous Examples 2.1 to 2.19, wherein the data identifying a viewer’s eye characteristics are stored in a database (210) or on a portable memory support, such as a memory card or smartcard (220) or a smartphone (220).
  • Example 2.21 The method according to any of the previous Examples 2.1 to 2.20, wherein the data identifying a viewer’s eye characteristics are stored in a viewer’s eye database (210) comprising a plurality of profiles, wherein each profile comprises a univocal viewer code identifying a respective viewer and the data identifying the respective viewer’s eye characteristics and/or viewer’s preferences, and wherein the obtaining data identifying a viewer’s eye characteristics comprises:
  • Example 2.22 The method according to any of the previous Examples 2.1 to 2.21, wherein the obtaining data identifying a viewer’s eye characteristics comprises:
  • Example 2.23 The method according to Example 2.22, wherein the data identifying a viewer’s eye characteristics identify the sensitivity of the viewer’s eyes for a plurality of colors, and wherein the determining the data identifying a viewer’s eye characteristic as a function of the viewer’s age comprises: specifying via the data identifying a viewer’s eye characteristic a higher intensity for red and/or green and/or blue light with an increasing viewer’s age.
  • Example 2.24 The method according to any of the previous Examples 2.1 to 2.23, wherein the data identifying a viewer’s eye characteristics identify preferred illumination settings of the viewer.
  • Example 2.25 The method according to Example 2.24, wherein the preferred illumination settings of the viewer are varied in real-time in order to vary the spectral characteristics of the light emitted by the one or more light fixtures.
  • Example 2.26 The method according to any of the previous Examples 2.1 to 2.25, wherein the data identifying a viewer’s eye characteristics (210) represent default eye characteristics (210), and wherein the method comprises: acquiring via a camera (230) an image (240) of the artwork illuminated with the light emitted by the one or more light fixtures (110); obtaining data identifying a remote viewer’s eye characteristics (210),
  • Example 2.27 The method according to Example 2.26, wherein the image is modified by the remote display device (250) or before transmission to the remote display device (250).
  • Example 2.28 The method according to Example 2.26 or Example 2.27, wherein the generating a modified image comprises:
  • Example 2.29 The method according to any of Examples 2.26 to Example 2.28, wherein the generating a modified image comprises:
  • Example 2.30 The method according to Example 2.29, wherein the receiving data (212) identifying the spectral characteristics of the remote display device (250) comprises:
  • a display database comprising data identifying the spectral characteristics for a plurality of display device types, in order to obtain the data identifying the spectral characteristics associated with the received display device type.
  • Example 2.31 The method according to Example 2.30, wherein the display device type is selected from the group of: a projector model, a display type, such as an LCD display, an AMOLED display, a display model, a virtual reality or augmented reality glass model.
  • Example 2.32 The method according to any of Examples 2.26 to 2.31, wherein the display device is in a viewer’s location, and wherein the generating a modified image comprises: determining spectral characteristics of a natural and/or artificial ambient light in the viewer’s location, and
  • Example 2.32 The method according to Example 2.31, wherein the remote display device (250) is integrated in a mobile device comprising a camera, and wherein the spectral characteristics of a natural and/or artificial ambient light in the viewer’s location are determined via the camera of the mobile device.
  • Example 2.33 A control system (130) for a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics in order to illuminate an artwork (140) in an exposition area (160), wherein the control system (130) is configured to send control commands to the one or more light fixtures (110) in order to vary the characteristics of the light emitted by the one or more light fixtures (110), wherein the control system (130) is configured to implement the method according to any of the previous Examples 2.1 to 2.32.
  • Example 2.34 The control system of Example 2.33, comprising a database (200) and/or a communication interface (131) in order to receive at least one of: data (202) identifying the characteristics of the one or more light fixtures (110); data (204) identifying the characteristics of the exposition area; data (206) identifying at least one artwork; data (210) identifying artist’s eye characteristics, such as an artist code; data (208) identifying requested spectral characteristics; data (210) identifying viewer’s eye characteristics, such as a viewer code, or a viewer’s age,
  • Example 2.35 The control system of Example 2.34, wherein the communication interface (131) comprises at least one of: a communication interface for connection to a local area network, a communication interface for connection to a wide area network, such as Internet, a communication interface for short range wireless communication, such as a Bluetooth® communication interface.
  • Example 2.36 The control system of any of Examples 2.33 to 2.35, comprising a memory (200) having stored data (206) identifying requested spectral characteristics for a plurality of artworks.
  • Example 2.37 A lighting system (100) comprising: one or more light fixtures (110) configured to emit light with variable characteristics in order to illuminate an artwork (140) in an exposition area (160), and a control system (130) according to any of Examples 2.33 to 2.36.
  • Example 2.38 The lighting system of Example 2.37, comprising: at least one light sensor (120) configured to be installed in the exposition area (160).
  • Example 2.39 The lighting system of Example 2.37 or Example 2.38, comprising: at least one mobile device (250) comprising a display.
  • Example 2.40 The lighting system of any of Examples 2.36 to 2.39, wherein the mobile device (250) is a smartphone or a tablet.
  • Example 2.41 The lighting system of any of Examples 2.36 to 2.40, comprising: a plurality of portable memory supports, such as memory cards or smartcards (220), or smartphones (222), each portable memory support having stored data identifying at least one viewer’s eye characteristics.
  • Example 2.42 A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 2.1 to 2.32.
  • Example 2.43 A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 2.1 to 2.32.
  • Example 3.1 A method of selecting at least one light fixture (110) comprising: obtaining data (206) identifying characteristics of an artwork (140), obtaining data (204) identifying characteristics of an exposition area (160), determining a set of light fixtures (110) and/or operating settings for a set of light fixtures (110) as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160).
  • Example 3.2 The method according to Example 3.1, wherein at least one light fixture (110) of said set of light fixtures supports a plurality of operating settings having different characteristics, and wherein the method comprises: selecting one of said plurality of operating settings of said at least one light fixture (110) as a function of said data identifying characteristics of said artwork (140) and said data identifying characteristics of said exposition area.
  • Example 3.3 The method according to Example 3.2, comprising generating control information for said light fixtures (110) of said determined set of light fixtures as a function of said selected operating setting.
  • Example 3.4 The method according to any of the previous Examples 3.1 to 3.3, wherein said determining a set of light fixtures comprises: accessing a database (202) of light fixtures (110), said database (202) of light fixtures (110) comprising light fixtures installed in said exposition area (160) and/or installable in said exposition area (160); and selecting amongst said installed and/or installable light fixtures (110) a set of light fixtures as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160).
  • Example 3.5 The method according to Example 3.4, wherein said database (202) of light fixtures (110) comprises: a first spotlight with a first light intensity level, and a second spotlight with a second light intensity level, said second light intensity level being greater than said first light intensity level.
  • Example 3.6 The method according to Example 3.4 or Example 3.5, wherein said database (202) of light fixtures (110) comprises: a light fixture with a fixed beam angle, and a light fixture with a variable beam angle.
  • Example 3.7 The method according to any of the previous Examples 3.4 to Example 3.6, wherein said database (202) of light fixtures (110) comprises: a light fixture with a framer, and/or a light fixture with a gobo.
  • Example 3.8 The method according to any of the previous Examples 3.4 to Example 3.7, wherein said database (202) of light fixtures (110) comprises: a light fixture with variable spectral characteristics.
  • Example 3.9 The method according to any of the previous Examples 3.4 to 3.8, wherein said database (202) of light fixtures (110) comprises data identifying characteristics of the respective light fixture (110), said characteristics of said light fixtures comprising one or more of the following data:
  • - brightness data such as data identifying a minimum brightness level and a maximum brightness level adapted to be emitted by the respective light fixture; spectral data identifying light colors adapted to be emitted by the respective light fixture; optics data, identifying a light transfer function of one or more optical elements of the respective light fixture, such as a reflector, diffusor, lens, shutters and/or framers; data identifying at least one of a spectral distribution, a color location, a Color Rendering Index, a beam direction, a beam spread angle, and a light polarization of the light emitted by the light fixture (110).
  • Example 3.10 The method according to Example 3.9, wherein said spectral data are a color temperature or a color coordinate or a color rendering index.
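The per-fixture characteristics listed in Examples 3.9 and 3.10 can be pictured as a database record. The field names and units below are assumptions made for this sketch, not identifiers from the source.

```python
from dataclasses import dataclass

@dataclass
class LightFixture:
    """Illustrative record mirroring the per-fixture data of Example 3.9."""
    fixture_id: str
    min_brightness_lm: float       # minimum brightness level
    max_brightness_lm: float       # maximum brightness level
    color_temperatures_k: tuple    # spectral data, e.g. CCT range (Example 3.10)
    beam_angle_deg: float
    variable_beam: bool = False    # Example 3.6: fixed vs variable beam angle
    optics: tuple = ()             # e.g. ("reflector", "lens", "framer")

def supports_brightness(fixture, required_lm):
    """A fixture qualifies if the required level lies within its range."""
    return fixture.min_brightness_lm <= required_lm <= fixture.max_brightness_lm
```

A fixture-selection step (Example 3.4) could then filter the database by predicates such as `supports_brightness`.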
  • Example 3.11 The method according to any of the previous Examples 3.4 to 3.10, wherein at least one light fixture (110) supports a plurality of configuration conditions having different characteristics, and said data identifying the characteristics of said light fixtures (110) comprise data for said plurality of configuration conditions.
  • Example 3.12 The method according to any of the previous Examples 3.1 to 3.11, wherein said exposition area (160) is a room, and wherein said data identifying characteristics of said exposition area comprise a room height, the distance between the floor and a truss used to mount light fixtures and/or a configuration of said room.
  • Example 3.13 The method according to any of the previous Examples 3.1 to 3.12, wherein said data identifying characteristics of said exposition area (160) comprise data identifying characteristics of natural and/or artificial light in said exposition area (160), such as:
  • - brightness data such as data identifying a minimum brightness level and a maximum brightness level; data identifying at least one of a spectral distribution, a color location, a Color Rendering Index, a beam direction, a beam spread angle, and a light polarization of the light in said exposition area (160).
  • Example 3.14 The method according to Example 3.13, wherein said determining a set of light fixtures comprises: determining said set of light fixtures also as a function of said data identifying characteristics of a brightness level of natural and/or artificial light in said exposition area (160).
  • Example 3.15 The method according to any of the previous Examples 3.1 to 3.14, wherein said data (206) identifying characteristics of said artwork (140) indicate the type of said artwork, such as a drawing, a print-out, a photograph, a textile, an Old Master painting, a modern art painting, a statue or a 3D-object.
  • Example 3.16 The method according to any of the previous Examples 3.1 to 3.15, wherein said data (206) identifying characteristics of said artwork (140) indicate the type of said artwork (140), said type of said artwork (140) being selected as a function of at least one of a canvas material, color pigments, or a frame material.
  • Example 3.17 The method according to any of the previous Examples 3.13 to 3.16, wherein said determining a set of light fixtures comprises: determining said set of light fixtures as a function of said type of said artwork and a respective room height, distance between the floor and a truss used to mount light fixtures and/or a room configuration.
  • Example 3.18 The method according to any of the previous Examples 3.13 to 3.15, wherein said determining a set of light fixtures comprises: accessing a data set, such as a table, said data set comprising a plurality of light fixture sets associated with a respective type of said artwork and a respective room height, and selecting one or more of said light fixture sets as a function of said type of said artwork and said room height.
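The table lookup of Example 3.18 (light fixture sets keyed by artwork type and room height) can be sketched as follows. The table entries, height bands, and the 4 m threshold are hypothetical values chosen for illustration.

```python
# Hypothetical lookup table: (artwork type, height band) -> fixture sets.
FIXTURE_TABLE = {
    ("old_master_painting", "low"):  ["warm_spot_narrow"],
    ("old_master_painting", "high"): ["warm_spot_wide", "framer_spot"],
    ("statue", "high"):              ["cool_spot_wide"],
}

def height_band(room_height_m, threshold_m=4.0):
    """Collapse a room height into the coarse bands the table is keyed by."""
    return "high" if room_height_m >= threshold_m else "low"

def select_fixture_sets(artwork_type, room_height_m):
    """Example 3.18: pick fixture sets as a function of artwork type
    and room height; an empty list means no stored set matches."""
    return FIXTURE_TABLE.get((artwork_type, height_band(room_height_m)), [])
```

A real system would likely key the table on finer-grained exposition-area data (Example 3.12), but the lookup structure stays the same.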
  • Example 3.19 The method according to any of the previous Examples 3.1 to 3.18, wherein said obtaining data (206) identifying characteristics of said artwork (140) comprises: showing on a screen a request to insert data (206) identifying characteristics of an artwork (140), and
  • Example 3.20 The method according to any of the previous Examples 3.1 to 3.19, wherein said obtaining data identifying characteristics of said artwork comprises: accessing a database of artworks (202), said database of artworks (202) having stored characteristics of a plurality of artworks (140); selecting at least one of said plurality of artworks and obtaining the respective characteristics stored in said database of artworks.
  • Example 3.21 The method according to Example 3.20, wherein said database of artworks (206, 216) has stored for each of said plurality of artworks (140) a respective digital representation (216), and wherein said selecting one of said plurality of artworks comprises: obtaining a digital representation (240) of said artwork (140); selecting one of said plurality of artworks (140) by comparing said obtained digital representation (240) with the digital representations stored in said database of artworks (206, 216).
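The comparison of digital representations in Example 3.21 can be sketched with a deliberately simple distance measure. Equally sized 2D grayscale grids and mean-absolute-difference matching are assumptions standing in for a real image-matching method.

```python
def closest_artwork(captured, stored):
    """Example 3.21 sketched: return the id of the stored artwork whose
    digital representation is closest to the captured one.

    `captured` is a 2D grid of grayscale values; `stored` maps artwork
    id -> equally sized grid. Sum of absolute pixel differences is a
    stand-in for a proper image descriptor comparison.
    """
    def distance(a, b):
        return sum(abs(pa - pb)
                   for row_a, row_b in zip(a, b)
                   for pa, pb in zip(row_a, row_b))
    return min(stored, key=lambda art_id: distance(captured, stored[art_id]))
```

A production system would normalize for scale, perspective and lighting before comparing, which this sketch omits.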
  • Example 3.22 The method according to Example 3.20 or Example 3.21, wherein said database of artworks (206) has stored for each of said plurality of artworks (140) a respective univocal artwork code, and wherein said selecting one of said plurality of artworks comprises: obtaining a univocal artwork code; selecting one of said plurality of artworks (140) by comparing said obtained univocal artwork code with the univocal artwork codes stored in said database of artworks (206).
  • Example 3.23 The method according to Example 3.22, wherein said obtaining a univocal artwork code comprises: scanning an alphanumeric string, a barcode, a bi-dimensional barcode, such as a QR code, a magnetic support, communicating with a short-range wireless transmitter, such as an RFID or NFC transponder or a Bluetooth® transceiver,
  • Example 3.24 The method according to any of the previous Examples 3.1 to 3.23, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises: showing on a screen a request to insert data identifying characteristics of an exposition area, and
  • Example 3.25 The method according to any of the previous Examples 3.1 to 3.24, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises: accessing a database of exposition areas (204), said database of exposition areas (204) having stored characteristics of a plurality of exposition areas (160), such as rooms of a museum; selecting one of said plurality of exposition areas (160) and obtaining the respective characteristics stored in said database of exposition areas (204).
  • Example 3.26 The method according to any of the previous Examples 3.1 to 3.25, wherein said characteristics of said artwork (140) comprise one or more of the following data: descriptive data, such as the name of the artwork, the name of the artist, the period or creation year of the artwork, the type of the artwork, dimensional data of said artwork,
  • color data for said artwork, such as color analysis data, spectral data, reflectance and/or image pixel data, damage data, such as a local or global damage matrix,
  • Example 3.27 The method according to Example 3.26, wherein said obtaining data identifying characteristics of said artwork comprises: - taking a digital image (240) of said artwork (140), and extracting one or more of said data identifying characteristics of said artwork (140) from said digital image (240) of said artwork (140).
  • Example 3.28 The method according to any of the previous Examples 3.1 to 3.27, wherein said characteristics of said exposition area (160) comprise one or more of the following data: dimensional data of said exposition area, such as a room height, room width and room length,
  • - brightness data such as a brightness level of natural and/or artificial light and/or brightness profile of natural and/or artificial light and/or color temperature of natural and/or artificial light during at least one 24-hour day
  • - position data of said exposition area such as GPS position data, which may be used to estimate brightness data as a function of local time
  • Example 3.29 The method according to Example 3.28, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises:
  • Example 3.30 The method according to Example 3.28 or Example 3.29, wherein said brightness data are determined via at least one light sensor installed in said exposition area.
  • Example 3.31 The method according to any of the previous Examples 3.1 to 3.30, wherein said obtaining data (206) identifying characteristics of an artwork comprises obtaining a graphic representation of said artwork (140), a similar artwork or a generic artwork, and wherein the method includes:
  • Example 3.32 The method according to Example 3.31, wherein said rendering an image of said artwork (140) comprises rendering said image of said artwork (140) as a function of said data (204) identifying characteristics of said exposition area (160).
  • Example 3.33 The method according to Example 3.31 or Example 3.32, wherein said rendering an image of said artwork comprises rendering said image of said artwork (140) as a function of data identifying characteristics of a default exposition area.
  • Example 3.34 The method according to any of the previous Examples 3.31 to 3.33, wherein said rendering said image of said artwork (140), comprises
  • Example 3.35 The method according to any of Examples 3.31 to 3.34, comprising:
  • Example 3.36 The method according to any of Examples 3.31 to 3.35, comprising: obtaining data identifying characteristics of said display device.
  • Example 3.37 The method according to Example 3.36, wherein said characteristics of said display device comprise one or more of the following data:
  • Example 3.38 The method according to Example 3.36 or Example 3.37, wherein said rendering said image of said artwork (140) and/or said displaying said rendered image on said display device, comprises
  • Example 3.39 The method according to any of the previous Examples 3.1 to 3.38, wherein said determining a set of light fixtures (110) comprises: determining a plurality of sets of light fixtures and/or operating settings as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160), displaying data identifying said plurality of sets on a display device,
  • Example 3.40 The method according to any of the previous Examples 3.1 to 3.39, wherein said determining said set of light fixtures (110) and/or said operating settings comprises: acquiring a training database of a plurality of reference illumination conditions comprising data identifying characteristics of a respective artwork, data identifying characteristics of a respective exposition area, a respective selected set of light fixtures and/or respective operating settings,
  • Example 3.41 A device comprising a display device, a user interface and at least one processing unit configured to implement the method according to any of the previous Examples 3.1 to 3.40.
  • Example 3.42 A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 3.1 to 3.40.
  • Example 3.43 The device according to Example 3.41 or the system according to Example 3.42, wherein said display device and said user interface are implemented with a touchscreen.
  • Example 3.44 The device according to Example 3.41 or the system according to Example 3.42, wherein said device is a smartphone, a tablet or a personal computer.
  • Example 3.45 A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 3.1 to 3.40.
  • Example 3.46 A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 3.1 to 3.40.
  • Example 4.1 A method of selecting at least one light sensor (120) for a lighting system (100) used to illuminate at least one artwork (140) in an exposition area (160) via one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, the method comprising the steps of: obtaining a digital model of said exposition area (160), said digital model including: o exposition area data (204) comprising data identifying the dimension of said exposition area (160); o artwork data (204, 206, 208) comprising data identifying the position of said at least one artwork (140) within said exposition area (160); o light fixture data (202, 204) comprising data identifying the position, orientation and illumination characteristics of said one or more light fixtures (110); and o background illumination data (204) comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within said exposition area (160); executing a plurality of illumination simulations (402, 404, 406) of said digital model of said exposition area by varying the illumination characteristics
  • Example 4.2 The method according to Example 4.1, wherein said digital model is a 2D model.
  • Example 4.3 The method according to Example 4.2, wherein said data identifying the dimension of said exposition area (160) comprise data identifying one or more widths and one or more lengths of said exposition area (160).
  • Example 4.4 The method according to Example 4.1, wherein said digital model is a 3D model.
  • Example 4.5 The method according to Example 4.4, wherein said data identifying the dimension of said exposition area (160) comprise data identifying one or more widths, one or more lengths and one or more heights of said exposition area (160).
  • Example 4.6 The method according to any of the previous Examples 4.1 to 4.5, wherein said exposition area data (204) further comprise data identifying at least one of: the reflectivity of one or more surfaces of said exposition area (160) and the position and dimension of obstacles within the exposition area (160), such as 3D objects and/or visitors at an expected/estimated position when observing a given artwork (140).
  • Example 4.7 The method according to any of the previous Examples 4.1 to 4.6, wherein said exposition area data (204) are stored in an exposition area database.
  • Example 4.8 The method according to any of the previous Examples 4.1 to 4.7, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said exposition area data (204) as a function of said images of said exposition area (160).
  • Example 4.9 The method according to any of the previous Examples 4.1 to 4.8, wherein said artwork data (204, 206, 208) further comprise data identifying at least one of: the dimension of said artwork (140), the reflectivity of said artwork (140) and a graphical representation of said artwork (140).
  • Example 4.10 The method according to any of the previous Examples 4.1 to 4.9, wherein said artwork data (204, 206, 208) further comprise data identifying at least one of: a requested target illumination of said artwork (140) and a maximum illumination for said artwork (140).
  • Example 4.11 The method according to Example 4.10, wherein said artwork data (204, 206, 208) further comprise data identifying a type of said artwork, and wherein the method comprises determining said requested target illumination of said artwork (140) and/or said maximum illumination for said artwork (140) as a function of said type of said artwork.
  • Example 4.12 The method according to Example 4.10 or Example 4.11, wherein said requested target illumination and/or said maximum illumination comprise a plurality of values for different colors.
  • Example 4.13 The method according to any of the previous Examples 4.1 to 4.12, wherein said artwork data (204, 206, 208) are stored in an artwork database.
  • Example 4.14 The method according to any of the previous Examples 4.1 to 4.13, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said artwork data (204, 206, 208) as a function of said images of said exposition area (160).
  • Example 4.15 The method according to any of the previous Examples 4.1 to 4.14, wherein said light fixture data (202, 204) further comprise data identifying a beam spread or range of beam spreads for the light adapted to be emitted by said one or more light fixtures (110).
  • Example 4.16 The method according to any of the previous Examples 4.1 to 4.15, wherein said illumination characteristics of said one or more light fixtures (110) identify a light intensity and/or a color, or a range of light intensities and/or colors adapted to be emitted by said one or more light fixtures (110).
  • Example 4.17 The method according to any of the previous Examples 4.1 to 4.16, wherein said light fixture data (202, 204) are stored in a light fixture database.
  • Example 4.18 The method according to any of the previous Examples 4.1 to 4.17, wherein said background illumination data (204) comprise data identifying the position and dimensions of apertures in said exposition area (160), such as windows (164) and/or doors (165).
  • Example 4.19 The method according to any of the previous Examples 4.1 to 4.18, wherein said illumination characteristics of said other natural and/or artificial light sources comprises at least one of: a light intensity and/or a color, or a range of light intensities and/or colors adapted to be emitted by said other natural and/or artificial light sources; the direction or range of directions of the light emitted by said other natural and/or artificial light sources.
  • Example 4.20 The method according to any of the previous Examples 4.1 to 4.19, wherein said background illumination data (204) are stored in an exposition area database.
  • Example 4.21 The method according to any of the previous Examples 4.1 to 4.20, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said background illumination data (204) as a function of said images of said exposition area (160).
  • Example 4.22 The method according to any of the previous Examples 4.1 to 4.21, wherein said executing a plurality of illumination simulations of said digital model of said exposition area by varying the illumination characteristics of said one or more light fixtures (110) and/or the illumination characteristics of said other natural and/or artificial light sources comprises at least one of: executing an illumination simulation of said digital model of said exposition area when said one or more light fixtures (110) are switched off; executing an illumination simulation of said digital model of said exposition area by varying the light intensity and/or color of said one or more light fixtures (110); and executing an illumination simulation of said digital model of said exposition area by varying the light intensity and/or color of said other natural and/or artificial light sources (164, 165).
  • Example 4.23 The method according to Example 4.22, wherein said executing a plurality of illumination simulations of said digital model of said exposition area by varying the illumination characteristics of said one or more light fixtures (110) and/or the illumination characteristics of said other natural and/or artificial light sources comprises at least one of: executing illumination simulations of said digital model of said exposition area for a minimum and a maximum value of the light intensity of said one or more light fixtures (110); and executing an illumination simulation of said digital model of said exposition area for a minimum and a maximum value of the light intensity of said other natural and/or artificial light sources (164, 165).
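Examples 4.22 and 4.23 amount to sampling the simulation at corner cases: fixtures off, minimum and maximum fixture output, each combined with minimum and maximum ambient light. A minimal sketch, where `simulate` is an assumed callback returning per-artwork illumination for given fixture and ambient levels:

```python
def simulate_extremes(simulate, fixture_levels, ambient_levels):
    """Run the illumination simulation at the corner cases of
    Examples 4.22/4.23: fixtures off/min/max crossed with ambient
    min/max, collecting per-artwork results per combination.

    `simulate(fixture_lvl, ambient_lvl)` is a hypothetical callback
    returning a dict {artwork_id: illuminance}.
    """
    results = {}
    for f in (0.0, min(fixture_levels), max(fixture_levels)):
        for a in (min(ambient_levels), max(ambient_levels)):
            results[(f, a)] = simulate(f, a)
    return results
```

The resulting per-artwork minima and maxima are exactly the inputs the sensor-selection step of Example 4.24 analyzes.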
  • Example 4.24 The method according to any of the previous Examples 4.1 to 4.23, wherein said determining a set of light sensors (120) comprises for each artwork (140): analyzing the respective data identifying the expected illumination of said artwork (140) in order to determine a minimum and/or maximum light intensity value, and/or minimum and/or maximum light intensity values for a plurality of colors; and determining data identifying whether a light sensor (120) should be used or not used to monitor the artwork (140) as a function of said minimum and/or maximum light intensity value, and/or said minimum and/or maximum light intensity values for said plurality of colors.
  • Example 4.25 The method according to Example 4.24, wherein said determining a set of light sensors (120) comprises the following steps for each artwork (140): calculating the difference between said maximum and minimum light intensity value, and/or the differences between said maximum and/or minimum light intensity values for said plurality of colors, said difference or differences being indicative of the variability of the illumination of the respective artwork due to background illumination; comparing said difference or differences with at least one threshold; storing data identifying that a light sensor (120) should be used to monitor the artwork (140) when said difference or differences are greater than said at least one threshold; and optionally storing data identifying that a light sensor (120) should not be used to monitor the artwork (140) when said difference or differences are smaller than said at least one threshold.
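The decision rule of Examples 4.24 and 4.25 (compare the illumination spread and peak against thresholds) can be sketched per artwork; the threshold values below are illustrative assumptions.

```python
def needs_light_sensor(min_lux, max_lux, variability_threshold_lux=50.0,
                       max_allowed_lux=None):
    """Examples 4.24/4.25 sketched: decide whether an artwork should be
    monitored by a light sensor.

    A sensor is recommended when the simulated illumination varies by
    more than a threshold (background light dominates), or when the
    simulated maximum could exceed a conservation limit. The 50 lx
    variability threshold is a hypothetical default.
    """
    if (max_lux - min_lux) > variability_threshold_lux:
        return True   # illumination too variable: monitor it
    if max_allowed_lux is not None and max_lux > max_allowed_lux:
        return True   # risk of exceeding the maximum illumination
    return False
```

Per Example 4.24, the same test would be repeated per color channel when per-color minima and maxima are available.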
  • Example 4.26 The method according to Example 4.24 or Example 4.25, wherein said determining a set of light sensors (120) comprises the following steps for each artwork (140): comparing said maximum light intensity value and/or said maximum light intensity values for said plurality of colors with at least one maximum threshold; storing data identifying that a light sensor (120) should be used to monitor the artwork
  • Example 4.27 The method according to any of the previous Examples 4.1 to 4.26, comprising determining (410) the type of each light sensor (120) in said set of light sensors (120).
  • Example 4.28 The method according to Example 4.27, wherein each light sensor (120) is selected from a list comprising at least two of the following light sensors (120): a light sensor positioned in proximity of the artwork (140) to be monitored, thereby measuring the light received at the artwork (140); a light sensor positioned in proximity of the light fixture (110) used to illuminate the artwork (140) to be monitored, thereby measuring the light emitted by the light fixture (110), which makes it possible to calculate the light received at the artwork (140) to be monitored as a function of geometrical data specifying the position of the artwork (140) with respect to the light fixture (110); a light sensor, such as a camera, configured to measure the characteristics of the light reflected by the artwork (140) to be monitored; and a light sensor, such as a camera, configured to measure the characteristics of the light reflected by a reference surface positioned in proximity of the artwork (140) to be monitored.
  • Example 4.29 The method according to Example 4.27 or Example 4.28, wherein said determining (410) the type of each light sensor (120) comprises acquiring data identifying the position of already installed light sensors (120) in said exposition area (160).
  • Example 4.30 The method according to Example 4.27 or Example 4.28, comprising determining (410) the type of each light sensor (120) as a function of the type and/or characteristics of the respective artwork (140), and/or the characteristics of the exposition area (160).
  • Example 4.31 The method according to any of the previous Examples 4.1 to 4.30, comprising:
  • modifying said digital model of said exposition area (160) in order to include light sensor data (204, 218) comprising data identifying the position and characteristics of at least one light sensor (120) in said set of light sensors (120); and executing at least one illumination simulation (412) of said digital model of said exposition area, and determining for each illumination simulation data identifying a respective expected illumination of each of said at least one artwork (140) and a respective expected illumination of said at least one light sensor (120) in said set of light sensors (120).
  • Example 4.32 The method according to Example 4.31, comprising: determining an expected measurement value provided by said at least one light sensor (120) in said set of light sensors (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120).
  • Example 4.33 The method according to Example 4.31 or Example 4.32, wherein said artwork data (204, 206, 208) further comprise data identifying a requested target illumination of said artwork (140) and/or a maximum illumination for said artwork (140), and wherein the method comprises: determining a target measurement value for said measurement value provided by said at least one light sensor (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120), the requested target illumination of said artwork (140) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120); and/or determining a maximum measurement value for said measurement value provided by said at least one light sensor (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120), the maximum illumination of said artwork (140) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120).
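The derivation of a target (or maximum) measurement value in Example 4.33 amounts to rescaling the simulated sensor reading; a minimal sketch, assuming a simple linear sensor characteristic (all names are illustrative):

```python
def target_measurement(expected_sensor_lux, expected_artwork_lux,
                       target_artwork_lux, sensor_gain):
    # Scale the simulated illuminance at the sensor by the ratio of the
    # requested to the simulated artwork illuminance, then apply the
    # sensor's counts-per-lux characteristic. The same formula yields a
    # maximum measurement value when fed the maximum artwork illuminance.
    return (expected_sensor_lux
            * (target_artwork_lux / expected_artwork_lux)
            * sensor_gain)
```

For instance, if the simulation predicts 200 lx at the artwork and 100 lx at the sensor, a 150 lx target with a 2 counts/lx sensor yields a target reading of 150 counts.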
  • Example 4.34 The method according to Example 4.33, comprising:
  • control system (130) being configured to receive said measurement values from said light sensors (120) in said set of light sensors (120) and verify said measurement values and/or control the light fixtures (110) of said lighting system (100).
  • Example 4.35 The method according to any of the previous Examples 4.1 to 4.34, comprising: obtaining data (206) identifying characteristics of an artwork (140), obtaining data (204) identifying characteristics of an exposition area (160), determining a set of light fixtures (110) and/or operating settings for a set of light fixtures (110) as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160), wherein said light fixture data (202, 204) comprise data identifying the position, orientation and illumination characteristics of the light fixtures (110) in said set of light fixtures (110).
  • Example 4.35 may also comprise the features of any of the previous Examples 3.2 to 3.40.
  • Example 4.36 A device comprising a display device, a user interface and at least one processing unit configured to implement the method according to any of the previous Examples 4.1 to 4.35.
  • Example 4.37 A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as the Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 4.1 to 4.35.
  • Example 4.38 The device according to Example 4.36 or the system according to Example 4.37, wherein said display device and said user interface are implemented with a touchscreen.
  • Example 4.39 The device according to Example 4.36 or the system according to Example 4.37, wherein said device is a smartphone, a tablet or a personal computer.
  • Example 4.40 A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 4.1 to 4.35.
  • Example 4.41 A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 4.1 to 4.35.
  • Example 5.1 A lighting system (100) configured to monitor the irradiation of an object (140) with light generated by a light fixture (110), comprising:
  • the light fixture (110) comprising one or more light sources (117), which together are configured to emit light (500) with a spatial radiation characteristic;
  • a data processing unit (113, 123, 133) connected to the light fixture (110) and configured to obtain information on an intensity of the light (500) emitted by the light sources (117); and
  • a first memory (202, 204) connected to the data processing unit (113, 123, 133), in which information about the spatial positioning of the light fixture (110) with respect to a surface (142) of the object (140) is stored;
  • wherein the data processing unit (113, 123, 133) is configured to calculate and output, for a plurality of positions on the surface (142) of the object (140), a local intensity of the light incident at the respective position as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning.
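The local-intensity calculation required of the data processing unit (113, 123, 133) in Example 5.1 can be sketched as follows, assuming a point-source model with a rotationally symmetric radiation characteristic (the function names and geometry conventions are illustrative):

```python
import math

def local_intensity(i0, beam_profile, positions, fixture_pos):
    """Estimate incident intensity at surface positions.

    i0: luminous intensity on the optical axis; beam_profile(angle):
    relative intensity versus off-axis angle (the spatial radiation
    characteristic); positions/fixture_pos: 3-D points, with the
    optical axis assumed to point along -z.
    """
    out = []
    fx, fy, fz = fixture_pos
    for (x, y, z) in positions:
        dx, dy, dz = x - fx, y - fy, z - fz
        d = math.sqrt(dx * dx + dy * dy + dz * dz)
        # Off-axis angle between the fixture-to-point direction
        # and the optical axis (0, 0, -1).
        angle = math.acos(-dz / d)
        # Inverse-square law scaled by the radiation characteristic.
        out.append(i0 * beam_profile(angle) / d ** 2)
    return out
```

A fixture of 400 cd on-axis intensity mounted 2 m above a point delivers 400/2² = 100 lx there under this model.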
  • Example 5.2 The lighting system according to Example 5.1, further comprising a light sensor (120i) arranged within or adjacent to the light fixture (110) and configured to measure the intensity of the light emitted from the one or more light sources (117) in the light fixture (110), wherein the data processing unit (113, 123, 133) is connected to the light sensor (120i) to receive the information on the measured light intensity.
  • Example 5.3 The lighting system according to Example 5.1, further comprising a time measuring device (504) configured to determine and output an operating time of the light sources (117) in which the light sources (117) have been operated since they were put into operation for irradiating the object (140), a current and/or voltage measuring device (116k) configured to measure a current and/or voltage with which the one or more light sources (117) are operated, and a third memory (202), in which a function or table is stored, with which values of a light intensity are respectively assigned to a combination of a current value and/or a voltage value, and an operating time of the one or more light sources (117), wherein the data processing unit (113, 123, 133) is connected to the time measuring device (504), the current and/or voltage measuring device (116k) and the third memory (202) and is configured to receive the measured values for the current and/or voltage and the operating time respectively, and to calculate the information on the measured light intensity on the basis of the function or the table.
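The table lookup of Example 5.3 replaces a physical light sensor with stored aging data; a nearest-neighbour sketch (the table contents and the crude mixed-unit distance metric are illustrative — a real implementation would interpolate in properly normalized units):

```python
def intensity_from_table(table, current_a, hours):
    """Look up the emitted intensity for the nearest stored
    (drive current, operating time) combination."""
    key = min(table, key=lambda k: abs(k[0] - current_a) + abs(k[1] - hours))
    return table[key]

# Illustrative aging table: (current in A, operating hours) -> intensity.
aging_table = {
    (0.35, 0): 1000.0,      # fresh LED at 350 mA
    (0.35, 10000): 930.0,   # ~7 % lumen depreciation after 10 000 h
    (0.70, 0): 1900.0,
}
```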
  • Example 5.4 The lighting system according to any of the previous Examples 5.1 to 5.3, wherein the data processing unit (113, 123, 133) is configured to obtain sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).
  • Example 5.5 The lighting system according to Example 5.4, wherein the data processing unit (113, 123, 133) is configured to compare the calculated local intensity for at least one of the plurality of positions with a limit value in the sensitivity information for that position and to output a signal determined as a function of the comparison.
  • Example 5.6 The lighting system according to Example 5.5, wherein the signal is a control signal transmitted to a data processing unit (113) configured to receive the signal, and to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110) as a function of the control signal.
  • Example 5.7 The lighting system according to any of the previous Examples 5.4 to 5.6, wherein the information obtained by the data processing unit (113, 123, 133) about an intensity of the light (500) emitted by the one or more light sources (117) includes respective information about light intensity for a plurality of different predetermined wavelength ranges, and wherein the sensitivity information obtained by the data processing unit (113, 123, 133) for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.
  • Example 5.8 The lighting system according to Example 5.7, wherein the data processing unit (113, 123, 133) is configured to calculate a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and to compare the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.
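The per-wavelength-range comparison of Examples 5.7 and 5.8 can be sketched as follows (the band labels and unit are illustrative):

```python
def check_limits(local_intensity, limits):
    """Compare calculated per-band intensities against per-band limits.

    Both dicts map a wavelength-range label to an intensity (e.g. W/m^2);
    returns the bands whose limit is exceeded. Bands without a stored
    limit are treated as unrestricted.
    """
    return [band for band, value in local_intensity.items()
            if value > limits.get(band, float("inf"))]

# UV exceeds its limit here; visible light stays within bounds.
exceeded = check_limits(
    {"UV": 0.08, "visible": 150.0, "IR": 20.0},
    {"UV": 0.05, "visible": 200.0})
```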
  • Example 5.9 The lighting system according to any one of the previous Examples 5.4 to 5.8, comprising a camera (508) configured to scan the surface (142) of the object (140) in order to obtain color and/or brightness values for positions on the surface (142), and wherein the data processing unit (113, 123, 133) is configured to receive the position-dependent color and/or brightness values from the camera (508) and to calculate a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.
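The fixed association between scanned color/brightness values and a sensitivity limit in Example 5.9 could, for instance, look like this (the linear mapping and its constants are assumptions for illustration only):

```python
def limits_from_scan(pixels, base_limit=250.0):
    """Derive a per-position lux limit from camera brightness values
    (0..255). This sketch assumes darker pigments are more
    light-sensitive and therefore assigns them a lower limit."""
    return {pos: base_limit * (0.2 + 0.8 * b / 255)
            for pos, b in pixels.items()}

# A white pixel keeps the full limit; a black pixel gets 20 % of it.
limits = limits_from_scan({(0, 0): 255, (0, 1): 0})
```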
  • Example 5.10 The lighting system according to any of the previous Examples 5.4 to 5.8, comprising a camera (508) or a reader device, such as a near-field communication device, configured to read an identification (510) attached to the object (140), and wherein the data processing unit (113, 123, 133) is configured to receive the identifier and obtain the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.
  • Example 5.11 The lighting system according to Example 5.9 or Example 5.10, wherein the light fixture (110), a control system (130) operatively connected to the light fixture, or a mobile unit operatively connected to the light fixture (110) comprises the camera (508).
  • Example 5.12 The lighting system according to Example 5.11, wherein the mobile unit is wirelessly connected to the light fixture (110).
  • Example 5.13 The lighting system according to Example 5.12, wherein the mobile unit is a smartphone or a tablet.
  • Example 5.14 The lighting system according to Example 5.13, wherein the smartphone comprises the data processing unit (133), and wherein the first, second and/or third memory is stored in the smartphone or in a cloud accessible by the smartphone.
  • Example 5.15 The lighting system according to any of the previous Examples 5.11 to 5.14, wherein the information on the spatial positioning of the light fixture (110) with respect to the surface (142) of the object (140) includes data identifying a distance (d) between the one or more light sources (117) and a reference point of the surface (142) of the object (140) and an angle of inclination (a) of the light fixture (110) with respect to a surface normal or a plane of the surface (142).
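Examples 5.15 to 5.17 supply exactly the two geometric quantities needed for the standard point-source illuminance law E = I·cos(a)/d²; a minimal sketch:

```python
import math

def surface_illuminance(i_cd, distance_m, tilt_deg):
    # Point-source approximation: inverse-square law combined with the
    # cosine of the inclination angle (a) between the beam direction
    # and the surface normal.
    return i_cd * math.cos(math.radians(tilt_deg)) / distance_m ** 2
```

At 2 m and normal incidence a 500 cd source gives 125 lx; tilting the fixture to 60° halves that.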
  • Example 5.16 The lighting system according to Example 5.15, wherein the light fixture (110) is associated with or comprises a distance sensor (120₃), preferably an ultrasonic sensor, configured to measure the distance (d) between the light fixture (110) and the surface (142) and to transmit the measurement result to the data processing unit (113, 123, 133).
  • Example 5.17 The lighting system according to any of the previous Examples 5.1 to 5.16, comprising an inclination angle sensor (120₂), which is preferably provided in the light fixture (110) or on the surface (142) of the object (140), wherein the inclination angle sensor (120₂) is configured to measure an angle of inclination (a) of the light fixture (110) with respect to a surface normal or a plane of the surface (142), and to transmit the measurement result to the data processing unit (113, 123, 133).
  • Example 5.18 The lighting system according to any of the previous Examples 5.1 to 5.17, wherein the information on the spatial radiation characteristic of the one or more light sources (117) or the light fixture (110) includes data with a two-dimensional distribution of intensities on one plane, or on a plurality of planes at different distances from the one or more light sources (117), and wherein the data processing unit (113, 123, 133) is configured to calculate the local intensity at the given positions on the surface (142) of the object (140) by means of mathematical projection or inter- or extrapolation as a function of the two-dimensional distribution of intensities on the one plane, or on the plurality of planes.
  • Example 5.19 The lighting system according to Example 5.18, wherein the plane or the plurality of planes are perpendicular to an optical axis (502) of the light (500) emitted by the one or more light sources (117) of the light fixture (110).
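The projection described in Examples 5.18 and 5.19 — mapping a two-dimensional intensity distribution measured on a reference plane perpendicular to the optical axis onto positions at another distance — can be sketched with similar triangles plus the inverse-square law (the integer-lattice grid and nearest-point rounding are illustrative simplifications of a proper interpolation):

```python
def project_intensity(grid, d0, x, y, d):
    """Project an intensity distribution measured on a plane at distance
    d0 from the source to the point (x, y) on a plane at distance d.

    Lateral coordinates scale by d0/d (similar triangles); magnitudes
    scale by (d0/d)^2 (inverse-square law). `grid[(x, y)]` holds the
    measured intensities on an integer lattice.
    """
    # Lateral position on the reference plane that maps to (x, y) at d.
    x0, y0 = round(x * d0 / d), round(y * d0 / d)
    return grid[(x0, y0)] * (d0 / d) ** 2

# Illustrative reference-plane measurements at d0 = 1 m.
ref = {(0, 0): 100.0, (1, 0): 80.0}
```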
  • Example 5.20 A corresponding method of monitoring the irradiation of an object (140), e.g. comprising:
  • Example 5.21 A corresponding method of monitoring the irradiation of an object (140) with light generated by one or more light sources (117) of a light fixture (110) having a spatial radiation characteristic, e.g. comprising the steps of:
  • Example 5.22 The method according to Example 5.21, comprising receiving sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).
  • Example 5.23 The method according to Example 5.22, comprising comparing the calculated local intensity values for at least one of the plurality of positions with a limit value in the sensitivity information for that position, and outputting a control signal determined as a function of the comparison.
  • Example 5.24 The method according to Example 5.23, wherein the control signal is configured to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110).
  • Example 5.25 The method according to any one of the previous Examples 5.21 to 5.24, wherein the information about an intensity of the light emitted by the one or more light sources (117) includes respective information about light intensity for a plurality of different predetermined wavelength ranges, and wherein the sensitivity information for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.
  • Example 5.26 The method according to Example 5.25, comprising: calculating a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and comparing the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.
  • Example 5.27 The method according to any of the previous Examples 5.22 to 5.26, comprising:
  • Example 5.28 The method according to any of the previous Examples 5.22 to 5.26, comprising:
  • Example 5.29 A device comprising a display device, a user interface and at least one data processing unit (113, 123, 133) configured to implement the method according to any of the previous Examples 5.21 to 5.28.
  • Example 5.30 A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as the Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 5.21 to 5.28.
  • Example 5.31 The device according to Example 5.29 or the system according to Example 5.30, wherein said display device and said user interface are implemented with a touchscreen.
  • Example 5.32 The device according to Example 5.29 or the system according to Example 5.30, wherein said device is a smartphone, a tablet or a personal computer.
  • Example 5.33 A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 5.21 to 5.28.
  • Example 5.34 A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 5.21 to 5.28.
  • Example 6.1 A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, wherein a light sensor (120) is installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range, the method comprising the steps of: during a calibration phase (610-618), obtaining a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range and measuring via said light sensor (120) the global and/or local light intensity values of the light (600) reflected by said artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light (600) reflected by said artwork (140); and during a normal operation phase (630-640), measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140), and estimating via said mathematical function or said dataset the global and/or the plurality of local light intensities at said artwork (140) as a function of the measured values.
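The training phase of Example 6.1 can be as simple as a least-squares fit relating the sensor's reflected-light readings to reference intensities obtained at the artwork during calibration; a minimal single-variable sketch (the data values are invented for illustration):

```python
def fit_linear(sensor_vals, artwork_vals):
    """Least-squares fit artwork ~ a * sensor + b from calibration pairs."""
    n = len(sensor_vals)
    mx = sum(sensor_vals) / n
    my = sum(artwork_vals) / n
    sxx = sum((x - mx) ** 2 for x in sensor_vals)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(sensor_vals, artwork_vals))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Calibration phase: paired (sensor reading, reference lux at the artwork).
a, b = fit_linear([10, 20, 30], [105, 205, 305])
# Normal operation phase: estimate the lux at the artwork from a reading.
estimate = a * 25 + b
```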
  • Example 6.2 The method according to Example 6.1, wherein an actuator (602) is configured to vary the position of said light sensor (120) with respect to the artwork (140).
  • Example 6.3 The method according to Example 6.1 or Example 6.2, comprising: during said calibration phase (610-618), varying the position of said light sensor (120) according to a given profile, and measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light reflected by said artwork (140) for said at least one wavelength or wavelength range; during said training phase, determining said mathematical function or said dataset as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light reflected by said artwork (140); during said normal operation phase (630-640), varying the position of said light sensor (120) according to said given profile, measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light reflected by said artwork (140), and estimating the global and/or the plurality of local light intensities at said artwork (140) as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light reflected by said artwork (140).
  • Example 6.4 The method according to any of the previous Examples 6.1 to 6.3, wherein said varying the position of said light sensor (120) comprises varying the distance and/or angle of said light sensor (120) with respect to said artwork (140).
  • Example 6.5 The method according to any of the previous Examples 6.1 to 6.4, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises:
  • Example 6.6 The method according to any of the previous Examples 6.1 to 6.4, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises: obtaining geometrical data identifying the distance and optionally orientation of said one or more light fixtures (110) with respect to said artwork (140);
  • Example 6.7 The method according to any of the previous Examples 6.1 to 6.6, comprising: during said training phase, calculating specular and/or diffusive reflectance of said artwork (140), and during said normal operation phase, calculating the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or plurality of local measured light intensity values of the light (600) reflected by said artwork (140) and said specular and/or diffusive reflectance of said artwork (140).
  • Example 6.8 The method according to any of the previous Examples 6.1 to 6.7, comprising: during said calibration phase, sending (618) control commands to said one or more light fixtures (110) in order to vary the characteristics of the light (500) emitted by said one or more light fixtures (110), and each time obtaining (612, 614) the global and/or plurality of local light intensities at said artwork (140) and measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140).
  • Example 6.9 The method according to Example 6.8, wherein said control command is configured to vary at least one of the following characteristics of the light (500) emitted by said one or more light fixtures (110): light intensity, frequency/color, polarization, direction and/or beam spread.
  • Example 6.10 The method according to any of the previous Examples 6.1 to 6.9, comprising: during said calibration (610-618) and/or training phase, storing said global and/or said plurality of local light intensities at said artwork (140) and the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140) in a data structure, such as a Look-up Table; during said normal operation phase (630-640), estimating the global and/or the plurality of local light intensities at said artwork (140) via interpolation of the data stored in said data structure.
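The interpolation over a Look-up Table in Example 6.10 can be sketched as a piecewise-linear, one-dimensional lookup (the stored calibration pairs are illustrative):

```python
def interpolate_lut(lut, measured):
    """Linearly interpolate the artwork-side intensity from a look-up
    table of (measured reflected value, intensity at artwork) pairs
    stored during calibration."""
    lut = sorted(lut)
    for (x0, y0), (x1, y1) in zip(lut, lut[1:]):
        if x0 <= measured <= x1:
            t = (measured - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("measured value outside calibrated range")

# Illustrative calibration data: reflected reading -> lux at artwork.
lut = [(10.0, 100.0), (20.0, 180.0), (30.0, 240.0)]
```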

Abstract

The invention relates to a method of illuminating an artwork (140) in an exposition area. The artwork (140) is illuminated with a lighting system comprising one or more light fixtures (110) configured to emit light (500) with variable characteristics as a function of a control command, wherein a light sensor (120) is installed in the exposition area in order to measure a global light intensity and/or a plurality of local light intensity values of the light (600) reflected by the artwork (140) for at least one wavelength or wavelength range. In particular, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork (140) for at least one wavelength or wavelength range and measuring via the light sensor (120) the global and/or local light intensity values of the light (600) reflected by the artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global intensity and/or the plurality of local light intensities at the artwork (140) as a function of the global value and/or the plurality of local measured light intensity values of the light (600) reflected by the artwork (140); and during a normal operation phase, measuring via the light sensor (120) the global value and/or the plurality of local light intensity values of the light (600) reflected by the artwork (140), and estimating via the mathematical function or the dataset the global light intensity and/or the plurality of local light intensities at the artwork (140) as a function of the global value and/or the plurality of local measured light intensity values of the light (600) reflected by the artwork (140).
PCT/EP2020/072406 2019-09-25 2020-08-10 Procédés d'éclairage d'un tableau WO2021058191A1 (fr)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201962905609P 2019-09-25 2019-09-25
US62/905,609 2019-09-25
EP19218417 2019-12-20
EP19218417.4 2019-12-20
EP20157270 2020-02-13
EP20157270.8 2020-02-13
EP20167044 2020-03-31
EP20167044.5 2020-03-31
EP20174216.0 2020-05-12
EP20174216 2020-05-12

Publications (1)

Publication Number Publication Date
WO2021058191A1 (fr) 2021-04-01

Family

ID=71948602

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/072406 WO2021058191A1 (fr) 2019-09-25 2020-08-10 Procédés d'éclairage d'un tableau

Country Status (1)

Country Link
WO (1) WO2021058191A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210345469A1 (en) * 2019-01-18 2021-11-04 Opple Lighting Co., Ltd. Measurement method and device of light source parameters, illumination system and terminal apparatus
DE102021117963A1 (de) 2021-07-12 2023-01-12 OSRAM Opto Semiconductors Gesellschaft mit beschränkter Haftung Verfahren zum betreiben eines optoelektronschen bauelements und optoelektronische anordnung
WO2023031085A1 (fr) 2021-09-02 2023-03-09 Signify Holding B.V. Rendu d'un effet de lumière multicolore sur un dispositif d'éclairage pixelisé sur la base d'une couleur de surface
CN116456199A (zh) * 2023-06-16 2023-07-18 Tcl通讯科技(成都)有限公司 拍摄补光方法、装置、电子设备及计算机可读存储介质
WO2023203553A1 (fr) * 2022-04-22 2023-10-26 Doorandish Mehdi Système de gestion d'éclairage intelligent
EP4325186A1 (fr) * 2022-08-19 2024-02-21 PSLab Holding Ltd Dispositif et procédé de détermination de la caractéristique phototechnique d'un luminaire
EP4325184A1 (fr) * 2022-08-19 2024-02-21 Bartenbach Holding GmbH Dispositif et procédé de détermination de la caractéristique phototechnique d'un luminaire
CN117794034A (zh) * 2024-02-06 2024-03-29 广州万锐照明设备有限公司 一种舞台灯智能联动控制系统及方法

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030072483A1 (en) 2001-08-10 2003-04-17 Stmicroelectronics, Inc. Method and apparatus for recovering depth using multi-plane stereo and spatial propagation
US20070258243A1 (en) * 2006-05-04 2007-11-08 Zary Segall Semantic light
US7796034B2 (en) 2004-08-13 2010-09-14 Osram Sylvania Inc. Method and system for controlling lighting
US8254667B2 (en) 2007-02-16 2012-08-28 Samsung Electronics Co., Ltd. Method, medium, and system implementing 3D model generation based on 2D photographic images
WO2013017287A1 (fr) 2011-08-02 2013-02-07 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Système de surveillance pour une pluralité d'objets
WO2016203423A1 (fr) 2015-06-16 2016-12-22 Sklaer Gmbh Dispositif d'éclairage, en particulier pour éclairer des objets d'exposition
US20170265267A1 (en) 2016-03-10 2017-09-14 Osram Gmbh Method for setting illumination light
WO2018189007A1 (fr) 2017-04-13 2018-10-18 Osram Gmbh Commande d'un dispositif d'éclairage comportant au moins deux sources de lumière électriques
US20180372537A1 (en) * 2015-12-16 2018-12-27 Shields Energy Services Limited Measurement and control of lighting
US10275945B2 (en) 2014-01-03 2019-04-30 Google Llc Measuring dimension of object through visual odometry


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
COLBY, KAREN M.: "A Suggested Exhibition / Exposure Policy for Works of Art on Paper", THE LIGHTING RESOURCE - MONTREAL MUSEUM OF FINE ARTS, 22 January 2019 (2019-01-22), Retrieved from the Internet <URL:http://www.lightresource.com/research-papers/A-Suggested-Exhibition-Exposure-Policy-for-Works-of-Art-on-Paper.pdf.>
THOMSON, GARY, THE MUSEUM ENVIRONMENT, 1994
UNITED STATES PATENT OFFICE MANUAL OF PATENT EXAMINING PROCEDURES, July 2010 (2010-07-01)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210345469A1 (en) * 2019-01-18 2021-11-04 Opple Lighting Co., Ltd. Measurement method and device of light source parameters, illumination system and terminal apparatus
DE102021117963A1 (de) 2021-07-12 2023-01-12 OSRAM Opto Semiconductors Gesellschaft mit beschränkter Haftung Method for operating an optoelectronic component, and optoelectronic arrangement
WO2023031085A1 (fr) 2021-09-02 2023-03-09 Signify Holding B.V. Rendering a multicolor light effect on a pixelated lighting device based on a surface color
WO2023203553A1 (fr) * 2022-04-22 2023-10-26 Doorandish Mehdi Smart lighting management system
EP4325186A1 (fr) * 2022-08-19 2024-02-21 PSLab Holding Ltd Device and method for determining the photometric characteristic of a luminaire
EP4325184A1 (fr) * 2022-08-19 2024-02-21 Bartenbach Holding GmbH Device and method for determining the photometric characteristic of a luminaire
CN116456199A (zh) * 2023-06-16 2023-07-18 Tcl通讯科技(成都)有限公司 Photographing fill-light method and apparatus, electronic device, and computer-readable storage medium
CN116456199B (zh) * 2023-06-16 2023-10-03 Tcl通讯科技(成都)有限公司 Photographing fill-light method and apparatus, electronic device, and computer-readable storage medium
CN117794034A (zh) * 2024-02-06 2024-03-29 广州万锐照明设备有限公司 Intelligent linkage control system and method for stage lights

Similar Documents

Publication Publication Date Title
WO2021058191A1 (fr) Methods for illuminating a painting
JP6895010B2 (ja) Flame-simulating light-emitting device and related methods
US20150035440A1 (en) Detector controlled illuminating system
CN104206025B (zh) Lighting method and apparatus with selectively applied face illumination component
WO2013111134A1 (fr) Detector-controlled lighting system
KR20190137076A (ko) Lighting fixture and method
US20170167675A1 (en) Linear pendant luminaire
US11913613B2 (en) Lighting assembly with light source array and light-directing optical element
US11729877B2 (en) Lighting fixture and methods
US11140757B2 (en) System for monitoring the irradiation of an object with light
CN106104375A (zh) Flash light with optimized spectral power distribution
CN106105393B (zh) Method and apparatus for calibrating light output based on reflected light
CN111712671A (zh) Skylight luminaire
CN111527384A (zh) Indicator of melanopic activity of light
Perrin et al. SSL adoption by museums: survey results, analysis, and recommendations
CN109973858A (zh) Illuminator for underwater dark-field imaging
Cerpentier et al. Adaptive museum lighting using CNN-based image segmentation
US20150319823A1 (en) Device for forming a light source
Bardsley et al. Solid-State Lighting R&D Plan-2015
Ragazzi Research on a bright light source: optics, technology and effects on humans
Vinh et al. Optimization and Characterization of LED Luminaires for Indoor Lighting
Huguet Ferran Algorithms for light applications: from theoretical simulations to prototyping
Ogunleye Optimization of Lightings in Commercial Premises
Miller et al. Demonstration of LED retrofit lamps at the Smithsonian American Art museum, Washington, DC
Pickering Towards a systematic methodology for the design, testing and manufacture of high brightness light emitting diode lighting luminaires

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20751170

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20751170

Country of ref document: EP

Kind code of ref document: A1