

Title:
METHODS OF ILLUMINATING AN ARTWORK
Document Type and Number:
WIPO Patent Application WO/2021/058191
Kind Code:
A1
Abstract:
A method of illuminating an artwork (140) in an exposition area is disclosed. The artwork (140) is illuminated with a lighting system comprising one or more light fixtures (110) configured to emit light (500) with variable characteristics as a function of a control command, wherein a light sensor (120) is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by the artwork (140) for at least one wavelength or wavelength range. Specifically, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork (140) for at least one wavelength or wavelength range and measuring via the light sensor (120) the global and/or local light intensity values of the light (600) reflected by the artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork (140) as a function of the global and/or the plurality of local measured light intensity values of the light (600) reflected by the artwork (140); and during a normal operation phase, measuring via the light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by the artwork (140), and estimating via the mathematical function or the dataset the global and/or the plurality of local light intensities at the artwork (140) as a function of the global and/or the plurality of local measured light intensity values of the light (600) reflected by the artwork (140).

Inventors:
ALFIER ALBERTO (IT)
HAAS NORBERT (DE)
ANGELINI MARCO (IT)
FRISON RENATO (IT)
VENTURATI CARLO (IT)
MORRA ANDREA (IT)
BRUDNJAK BENJAMIN (DE)
SUSIN INNA (DE)
ANGENENDT GUIDO (DE)
Application Number:
PCT/EP2020/072406
Publication Date:
April 01, 2021
Filing Date:
August 10, 2020
Assignee:
OSRAM GMBH (DE)
CLAY PAKY SPA (IT)
International Classes:
H05B45/22; G01J1/02; H05B47/11; G02B27/09
Domestic Patent References:
WO2016203423A12016-12-22
WO2018189007A12018-10-18
WO2013017287A12013-02-07
Foreign References:
US20070258243A12007-11-08
US20180372537A12018-12-27
US7796034B22010-09-14
IB2016055923W2016-10-04
US20030072483A12003-04-17
US8254667B22012-08-28
US10275945B22019-04-30
US20170265267A12017-09-14
Other References:
COLBY, KAREN M.: "A Suggested Exhibition / Exposure Policy for Works of Art on Paper", THE LIGHTING RESOURCE - MONTREAL MUSEUM OF FINE ARTS, 22 January 2019 (2019-01-22), Retrieved from the Internet
THOMSON, GARY, THE MUSEUM ENVIRONMENT, 1994
UNITED STATES PATENT OFFICE MANUAL OF PATENT EXAMINING PROCEDURES, July 2010 (2010-07-01)
Attorney, Agent or Firm:
OSRAM GMBH (DE)
Claims:
CLAIMS

1. A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, wherein a light sensor (120) is installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range, the method comprising the steps of: during a calibration phase (610-618), obtaining a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range and measuring via said light sensor (120) the global and/or local light intensity values of the light (600) reflected by said artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of local measured light intensity values of the light (600) reflected by said artwork (140); and during a normal operation phase (630-640), measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140), and estimating via said mathematical function or said dataset the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light (600) reflected by said artwork (140).

2. The method according to Claim 1, comprising an actuator (602) configured to vary the position of said light sensor (120) with respect to the artwork (140).

3. The method according to Claim 1 or Claim 2, comprising: during said calibration phase (610-618), varying the position of said light sensor (120) according to a given profile, and measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light reflected by said artwork (140) for said at least one wavelength or wavelength range; during said training phase, determining said mathematical function or said dataset as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light (600) reflected by said artwork (140); during said normal operation phase (630-640), varying the position of said light sensor (120) according to said given profile, measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140), and estimating the global and/or the plurality of local light intensities at said artwork (140) as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light (600) reflected by said artwork (140).

4. The method according to any of the previous claims, wherein said varying the position of said light sensor (120) comprises varying the distance and/or angle of said light sensor (120) with respect to said artwork (140).

5. The method according to any of the previous claims, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises:

- measuring (612) the global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range.

6. The method according to any of the previous claims, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises: obtaining geometrical data identifying the distance and optionally orientation of said one or more light fixtures (110) with respect to said artwork (140);

- measuring a global and/or a plurality of local intensities of light (500) emitted by said one or more light fixtures (110) for at least one wavelength or wavelength range; and calculating (614) the global and/or the plurality of local light intensities at said artwork (140) as a function of said measured global and/or local intensities of light (500) emitted by said one or more light fixtures (110) and said geometrical data.

7. The method according to any of the previous claims, comprising: during said training phase, calculating specular and/or diffusive reflectance of said artwork (140), and during said normal operation phase, calculating the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or plurality of local measured light intensity values of the light (600) reflected by said artwork (140) and said specular and/or diffusive reflectance of said artwork (140).

8. The method according to any of the previous claims, comprising: during said calibration phase, sending (618) control commands to said one or more light fixtures (110) in order to vary the characteristics of the light (500) emitted by said one or more light fixtures (110), and each time obtaining (612, 614) the global and/or plurality of local light intensities at said artwork (140) and measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140).

9. The method according to Claim 8, wherein said control command is configured to vary at least one of the following characteristics of the light (500) emitted by said one or more light fixtures (110): light intensity, frequency/color, polarization, direction and/or beam spread.

10. The method according to any of the previous claims, comprising: during said calibration (610-618) and/or training phase, storing said global and/or said plurality of local light intensities at said artwork (140) and the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140) in a data structure, such as a Look-up Table; during said normal operation phase (630-640), estimating the global and/or the plurality of local light intensities at said artwork (140) via interpolation of the data stored in said data structure.

11. The method according to any of the previous claims 1 to 9, comprising: during said training phase, training a machine learning algorithm, such as an artificial neural network; and during said normal operation phase (630-640), estimating the global and/or the plurality of local light intensities at said artwork (140) via said machine learning algorithm.

12. The method according to any of the previous claims, comprising: switching off said one or more light fixtures (110);

- measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140); and estimating a global and/or a plurality of local light intensities of ambient light at said artwork (140) as a function of the measured global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range.

13. The method according to Claim 12, comprising: detecting the presence of persons in said exposition area (160); and switching off said one or more light fixtures (110), when no persons have been detected in said exposition area (160).

14. The method according to Claim 12 or Claim 13, wherein natural light may enter into said exposition area (160) through an aperture (164) in said exposition area (160), wherein said aperture has associated means for varying the intensity of natural light entering through said aperture (164), wherein the method comprises: sending one or more control commands to said means for varying the intensity of natural light entering through said aperture (164) as a function of the estimated global and/or plurality of local light intensity of ambient light at said artwork (140).

15. The method according to any of the previous claims, comprising: sending one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensity at said artwork (140).

16. The method according to Claim 15, comprising: obtaining data (208) identifying requested global and/or plurality of local intensity characteristics for at least one wavelength or wavelength range; sending one or more control commands to said at least one light fixture (110) in order to vary the light intensity at said artwork (140), such that the global and/or plurality of local light intensities at said artwork (140) corresponds to said requested global and/or plurality of local intensity characteristics.

17. The method according to Claim 16, wherein said requested intensity characteristics comprise requested intensity values for a plurality of wavelengths/colors.

18. The method according to any of the previous claims, comprising: comparing said estimated global and/or plurality of local light intensities at said artwork (140) with at least one threshold value, and sending one or more control commands to said at least one light fixture in order to reduce said global and/or plurality of local light intensities at said artwork (140) below said at least one threshold value.

19. The method according to Claim 18, wherein said at least one threshold value comprises respective threshold values for a plurality of wavelengths/colors.

20. The method according to Claim 18 or Claim 19, comprising receiving sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).

21. The method according to Claim 20, comprising comparing the calculated local intensity values for at least one of the plurality of positions with a limit value in the sensitivity information for that position.

22. The method according to Claim 21, wherein the control signal is configured to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110).

23. The method according to any one of the previous claims 20 to 22, wherein the sensitivity information for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.

24. The method according to Claim 23, comprising: determining a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and comparing the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.

25. The method according to any of the previous claims 20 to 24, comprising:

- receiving position-dependent color and/or brightness values from a camera (508), and calculating a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.

26. The method according to any of the previous claims 20 to 24, comprising:

- receiving an identifier from a reader device, and obtaining the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.

27. The method according to any of the previous claims, wherein said light sensor (120) is configured to provide a plurality of local light intensity values for different wavelengths/colors.

28. The method according to any of the previous claims 1 to 26, wherein said light sensor (120) is a 2D light sensor providing pixel data, wherein the value of each pixel is indicative of a respective light intensity, wherein said measuring via said light sensor (120) a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) comprises: determining a subset of pixels comprising said artwork (140), and determining a global and/or a plurality of local light intensity values as a function of the values of the subset of said pixels comprising said artwork (140).

29. The method according to Claim 27 or Claim 28, wherein said light sensor (120) is a camera.

30. A lighting system (100) configured to illuminate an artwork (140) in an exposition area (160), said lighting system (100) comprising: one or more light fixtures (110) configured to illuminate said artwork (140) with light having variable characteristics as a function of a control command; a light sensor (120) configured to be installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range; and a control system (130) comprising a memory having stored a mathematical function or dataset adapted to estimate a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and wherein said control system (130), during a normal operation phase, is configured to: o measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), o estimate via said mathematical function or dataset the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), o send one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensities at said artwork (140).

31. The lighting system (100) according to Claim 30, wherein said control system (130) is configured to: during a calibration phase, measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and obtain the global and/or plurality of local light intensities at said artwork (140), and during a training phase, determine said mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and store said mathematical function or dataset to said memory.

32. The lighting system (100) according to Claim 31, wherein the control system (130) is configured to implement the steps of the method according to any of Claims 1 to 29.

33. A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Claims 1 to 29.

34. A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Claims 1 to 29.

35. A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) and a light sensor (120), wherein the light fixture (110) comprises a light module (118) configured to emit light and a first set of optical elements (1151) arranged in the light path of said light emitted by said light module (118), wherein said light fixture (110) is configured to emit light with variable characteristics as a function of a control command, wherein the method comprises the steps of:

- producing a translucent optical element (1152) for said light fixture (110), wherein said translucent optical element (1152) is configured to be mounted in a first plane (1106) at a first distance (d1) from said first set of optical elements (1151), wherein said translucent optical element (1152) is implemented with a translucent material (1150, 1152) comprising a first surface (1154) for receiving a light radiation (Φ1) and an opposite second surface (1156) for providing an attenuated second light radiation (Φ2), wherein said second surface (1156) is arranged at a given variable thickness (L) from said first surface (1154), wherein said producing said translucent optical element (1152) comprises: o sending one or more control commands to said light fixture (110) in order to operate said light module (118) with a first operating condition, o obtaining a first matrix of first light intensity values (Φ1), wherein each first light intensity value (Φ1) is associated with a respective area of said first surface (1154) and identifies the intensity of light expected to enter the respective area of said first surface (1154), wherein said obtaining said first matrix of first light intensity values (Φ1) comprises determining (1112, 1114) the beam pattern of the light provided by the first set of optical elements (1151) in said first plane (1106), when said light module (118) operates in said first operating condition, o obtaining a second matrix of second light intensity values (Φ2) having the same dimension as said first matrix, wherein each second light intensity value (Φ2) is associated with a respective area of said second surface (1156) and identifies the intensity of light requested to exit the respective area of said second surface (1156) when the expected intensity of light enters said first surface, wherein said obtaining said second matrix of second light intensity values (Φ2) comprises obtaining requested illumination values for said artwork (140), obtaining the position of said artwork (140) with respect to said light fixture (110) and calculating a requested beam pattern in said first plane (1106) via geometrical projection of the requested illumination values as a function of the position of said artwork (140) with respect to said light fixture (110), o calculating a matrix of light transmission ratios (T) having the same dimension as said first matrix and said second matrix, wherein each light transmission ratio (T) is calculated as a function of a respective first light intensity value (Φ1) and a respective second light intensity value (Φ2), o obtaining an attenuation factor of said translucent material (1150, 1152), o calculating a matrix of thickness values (L) having the same dimension as said matrix of light transmission ratios (T), wherein each thickness value (L) is calculated as a function of a respective light transmission ratio (T) and said attenuation factor of said translucent material (1150, 1152), and wherein said matrix of thickness values (L) identifies the requested thickness of said translucent material (1150, 1152) between said first surface (1154) and said second surface (1156) in order to obtain said intensity of light requested to exit said second surface (1156) when said expected intensity of light enters said first surface, and o producing said translucent optical element (1152) by shaping said translucent material (1150, 1152) as a function of said matrix of thickness values (L);

- mounting said translucent optical element (1152) at said first distance (d1) from said first set of optical elements (1151);

- generating modified illumination values by: o obtaining data identifying a viewer’s eye characteristics (210), and o modifying said requested illumination values for said artwork (140) as a function of said viewer’s eye characteristics (210); determining a global and/or a plurality of local light intensities at said artwork (140) via said light sensor (120), wherein said light sensor (120) is configured to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range, and wherein said determining a global and/or a plurality of local light intensities at said artwork (140) comprises: o during a calibration phase (610-618), obtaining a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range and measuring via said light sensor (120) the global and/or local light intensity values of the light (600) reflected by said artwork (140), o during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by said artwork (140), and o during a normal operation phase (630-640), measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light reflected by said artwork (140), and estimating via said mathematical function or said dataset the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by said artwork (140); and sending one or more control commands to said light fixture (110) in order to vary the characteristics of the light emitted by the one or more light fixtures (110), such that the global and/or the plurality of local light intensities at said artwork (140) correspond to the modified illumination values.

Description:
METHODS OF ILLUMINATING AN ARTWORK

TECHNICAL FIELD

The embodiments of the present description relate to illumination/lighting systems, such as illumination systems configured to illuminate artworks in an exposition area.

BACKGROUND

Illumination/lighting systems configured to illuminate objects and/or persons are well known in the art. In general, an illumination system comprises at least one light fixture comprising one or more light sources, such as light-emitting diodes (LEDs). For example, with LED-based lighting, whose color, color temperature, and intensity can be controlled to a great degree over a wide range of parameters, smart illumination has become possible. Smart illumination can be adapted to specific requirements of persons or objects, or it can be adapted to sensor data. For example, in the simplest case, the light sources can be switched on in the presence of persons and/or the light intensity may be varied based on the ambient light. However, smart illumination may also be used for more complex applications.

Thus, in general, smart illumination may be used in office buildings and factories, but also in museums (including art galleries, i.e. for the illumination of artworks) or in the entertainment sector, e.g. for effect lighting purposes. For example, United States Patent No. US 7,796,034 B2 may be cited in this context.

For example, Figure 1 shows an example of a typical exposition area 160 comprising one or more artworks 140, such as a painting, a picture, a sculpture, an assortment of various pieces of art, people and the like. An artwork may also encompass (at least in part) self-lit objects. For example, the exposition area 160 may be located in an art museum or an art gallery or an exhibition or somewhere else, both inside a building or outside.

For example, in Figure 1, the artworks 140, such as paintings, may be fixed to walls 163 of a room representing the exposition area 160. Such a room may thus also comprise a ceiling 161, a floor 162, an entrance and/or exit 165, such as a door, and optionally one or more windows 164. In the exposition area 160 may thus be arranged one or more light fixtures configured to illuminate the floor 162 and/or the walls 163, in particular the artworks 140. Similarly, the one or more light fixtures may be configured to illuminate a sculpture positioned at a given place in the room 160. Accordingly, in general, an artwork may be illuminated by natural light, e.g. entering through a window 164, and/or artificial light provided by the one or more light fixtures.

Illumination in such exposition areas may play a crucial role. For example, modern light fixtures for home or industrial applications often have variable/settable spectral characteristics. For example, such light fixtures may often be controlled remotely in order to set a brightness level and/or a color. However, when illuminating artworks, the light generated by the light fixtures often has to satisfy more stringent requirements. For example, pigments or other materials (e.g. textiles or canvas) in artworks can be sensitive to electromagnetic radiation, including visible light. Illuminating them with too high an intensity or with light of the wrong wavelength (e.g. UV light) may damage the artwork permanently. In addition, the illumination, i.e. the color temperature or the color, plays an important role in the presentation of the artwork, as the right illumination can highlight certain aspects of the artwork, whereas the wrong illumination can ruin the whole impression of the artwork.

SUMMARY

Accordingly, there is a need to provide improved solutions for illuminating an object, such as an artwork, or a person.

According to one or more embodiments, one or more of the above objectives is achieved by means of methods of illuminating an artwork. Embodiments moreover concern related lighting systems, and related light fixtures and light sensors, as well as computer-program products, loadable into the memory of at least one processor and comprising portions of software code capable of implementing the steps of the method when the product is run on at least one processor. Thus, as used herein, reference to such a computer-program product is understood to be equivalent to a reference to a computer-readable, non-transitory medium containing instructions for controlling the processing system for coordinating implementation of the method according to the invention. The reference to "at least one processor" is evidently intended to highlight the possibility that the present invention is implemented in a modular form and/or distributed.

The claims are an integral part of the technical teaching of the disclosure provided herein.

As mentioned before, various embodiments of the present disclosure relate to lighting systems and/or methods for illuminating an artwork in an exposition area with at least one light fixture. Generally, an illumination system comprises a light fixture, an optional sensor and a control system. Possible embodiments of such lighting systems are detailed at the following point "Example 1".

For example, various embodiments of the present disclosure relate to solutions for determining the illumination of an artwork. For example, a first aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the artwork for at least one wavelength or wavelength range. Specifically, in various embodiments, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork for at least one wavelength or wavelength range and measuring via the light sensor the global and/or local light intensity values of the light reflected by the artwork; during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork; and during a normal operation phase, measuring via the light sensor the global and/or the plurality of local light intensity values of the light reflected by the artwork, and estimating via the mathematical function or the dataset the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork.
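Purely by way of illustration (and not as the claimed implementation), the three phases of this first aspect can be sketched in Python as a simple least-squares fit; all data values and names below are hypothetical:

import numpy as np

# Calibration phase (hypothetical data): each sensor reading of the light
# reflected by the artwork is paired with the known light intensity at the
# artwork, for one wavelength range.
measured_reflected = np.array([[120.0], [240.0], [360.0]])  # sensor values
intensity_at_artwork = np.array([50.0, 100.0, 150.0])       # lux at the artwork

# Training phase: fit a linear estimator (the "mathematical function" of the
# claims could equally be a look-up table or a machine learning model).
A = np.hstack([measured_reflected, np.ones((len(measured_reflected), 1))])
coeffs, *_ = np.linalg.lstsq(A, intensity_at_artwork, rcond=None)

def estimate_intensity(reflected_value: float) -> float:
    """Normal operation phase: estimate the intensity at the artwork
    from a new measurement of the reflected light."""
    return coeffs[0] * reflected_value + coeffs[1]

print(estimate_intensity(180.0))  # -> about 75 lux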

Possible embodiments of this first aspect are detailed at the following point "Example 6".

A second aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a reference luminance target is installed in proximity of the artwork, whereby the reference luminance target is illuminated with the light emitted by the one or more light fixtures, and wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the reference luminance target for at least one wavelength or wavelength range. Specifically, in various embodiments, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork and/or at the reference luminance target for at least one wavelength or wavelength range, and measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target; during a training phase, determining a mathematical function and/or a dataset adapted to estimate the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity values of the light reflected by the reference luminance target; and during a normal operation phase, measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target and estimating the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity of the light reflected by the reference luminance target.
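Claim 10 mentions storing the calibration data in a data structure such as a look-up table and estimating via interpolation. A minimal sketch of that variant, here applied to readings of the reference luminance target (all values hypothetical):

import numpy as np

# Hypothetical calibration table: sensor readings of the light reflected by
# the reference luminance target vs. intensities measured at the artwork.
target_readings = np.array([10.0, 20.0, 40.0, 80.0])       # sorted sensor values
artwork_intensities = np.array([25.0, 48.0, 95.0, 180.0])  # lux at the artwork

def estimate_from_target(reading: float) -> float:
    """Normal operation phase: interpolate the stored calibration points
    (the 'dataset' of the claims) to estimate the intensity at the artwork
    from a reference-target measurement."""
    return float(np.interp(reading, target_readings, artwork_intensities))

print(estimate_from_target(30.0))  # -> 71.5 lux (linear interpolation)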

Possible embodiments of this second aspect are detailed at the following point "Example 7".

A third aspect of the present disclosure relates to a lighting system configured to monitor the irradiation of an object with light generated by a light fixture. In various embodiments, the lighting system comprises the light fixture comprising one or more light sources, which together are configured to emit light with a spatial radiation characteristic, a data processing unit connected to the light fixture and configured to obtain information on an intensity of the light emitted by the light sources, a first memory connected to the data processing unit, in which information about the spatial positioning of the light fixture with respect to a surface of the object is stored, and a second memory connected to the data processing unit, in which information about the spatial radiation characteristic of the one or more light sources or the light fixture is stored. In various embodiments, the data processing unit is configured to calculate and output, for a plurality of positions on the surface of the object, a local intensity of the light incident at the respective position as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture.

Possible embodiments of this third aspect are detailed at the following point "Example 5".
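By way of a purely illustrative sketch of this third aspect, the local incident intensity can be computed from the fixture position and output with an inverse-square law and a cosine incidence factor; this assumes a simple uniform point source, whereas the patent relies on a stored spatial radiation characteristic of the fixture:

import numpy as np

def local_intensity(source_pos, source_lumens, surface_points, surface_normal):
    """Estimate the incident intensity at several positions on the object
    surface (point-source assumption; all parameters hypothetical)."""
    results = []
    for p in surface_points:
        v = np.asarray(p, dtype=float) - np.asarray(source_pos, dtype=float)
        d2 = np.dot(v, v)  # squared distance from fixture to surface point
        cos_incidence = max(0.0, np.dot(-v / np.sqrt(d2), surface_normal))
        # Inverse-square law combined with the incidence angle:
        results.append(source_lumens / (4 * np.pi) * cos_incidence / d2)
    return results

# Fixture 2 m in front of a vertical artwork (surface normal pointing at it):
print(local_intensity([0, 0, 2], 1000.0, [[0, 0, 0], [0.5, 0, 0]], [0, 0, 1]))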

Various embodiments of the present disclosure relate also to embodiments of light fixtures. For example, a fourth aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a light fixture comprising a plurality of light sources, a driver circuit configured to provide an individually controllable power supply to each of the light sources as a function of one or more control signals, a data storage device having stored at least one preset configuration data item, and a data processing unit comprising a memory. Specifically, in various embodiments, the method comprises: reading a preset configuration data item from the data storage device and storing the preset configuration data item into the memory; and generating the one or more control signals as a function of the configuration data stored to the memory.
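A minimal sketch of this fourth aspect, assuming a JSON-encoded preset with one dimming level per individually controllable light source (the storage format, names and values are hypothetical):

import json

# Hypothetical preset read from the fixture's data storage device:
PRESET = json.loads('{"name": "gallery_warm", "levels": [0.8, 0.6, 0.0, 1.0]}')

def control_signals(preset):
    """Generate one normalized control value per light source from the
    preset configuration data stored in memory (a sketch; the actual
    driver interface is not specified here)."""
    return [max(0.0, min(1.0, level)) for level in preset["levels"]]

print(control_signals(PRESET))  # -> [0.8, 0.6, 0.0, 1.0]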

Possible embodiments of this fourth aspect are detailed at the following point "Example 8".

A fifth aspect of the present disclosure relates to a method of operating a light fixture comprising a light module comprising one or more light sources, a power supply circuit configured to provide a DC voltage, a regulated current generator configured to provide an output current to the one or more light sources as a function of a reference signal, a current sensor configured to provide a first measurement signal indicative of the output current, and a data processing unit operatively connected to the regulated current generator and the current sensor. Specifically, in various embodiments, the method comprises executing the following steps via the data processing unit:
- setting the reference signal as a function of data identifying a requested illumination to be generated by the one or more light sources;
- determining an upper and a lower current threshold as a function of the reference signal;
- obtaining the first measurement signal;
- verifying whether the first measurement signal is between the upper and the lower current threshold; and
- in case the verification indicates that the first measurement signal is not between the upper and the lower current threshold, generating an error signal.
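A minimal sketch of the verification step follows, assuming a hypothetical +/-10 % tolerance band around the reference current; the patent does not specify how the thresholds are derived from the reference signal:

def check_output_current(reference_ma, measured_ma, tolerance=0.10):
    """Derive an upper and a lower current threshold from the reference
    signal (here a +/-10 % band, an assumption) and verify the measured
    output current against them, generating an error signal otherwise."""
    lower = reference_ma * (1.0 - tolerance)
    upper = reference_ma * (1.0 + tolerance)
    if not (lower <= measured_ma <= upper):
        raise RuntimeError(
            f"LED current {measured_ma} mA outside [{lower}, {upper}] mA")
    return True

check_output_current(700.0, 695.0)    # within the window, no error
# check_output_current(700.0, 500.0)  # would raise the error signal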

Possible embodiments of this fifth aspect are detailed at the following point "Example 9".

A sixth aspect of the present disclosure relates to a method of producing a translucent optical element for a light fixture, wherein the translucent optical element is implemented with a translucent material comprising a first surface for receiving a light radiation and an opposite second surface for providing an attenuated second light radiation, wherein the second surface is arranged at a given variable thickness from the first surface. Specifically, in various embodiments, the method comprises the steps of: obtaining a first matrix of first light intensity values, wherein each first light intensity value is associated with a respective area of the first surface and identifies the intensity of light expected to enter the respective area of the first surface; obtaining a second matrix of second light intensity values having the same dimension as the first matrix, wherein each second light intensity value is associated with a respective area of the second surface and identifies the intensity of light requested to exit the respective area of the second surface when the expected intensity of light enters the first surface; calculating a matrix of light transmission ratios having the same dimension as the first matrix and the second matrix, wherein each light transmission ratio is calculated as a function of a respective first light intensity value and a respective second light intensity value; obtaining an attenuation factor of the translucent material; calculating a matrix of thickness values having the same dimension as the matrix of light transmission ratios, wherein each thickness value is calculated as a function of a respective light transmission ratio and the attenuation factor of the translucent material, and wherein the matrix of thickness values identifies the requested thickness of the translucent material between the first surface and the second surface in order to obtain the intensity of light requested to exit the second surface when the expected intensity of light enters the first surface; and producing the translucent optical element by shaping the translucent material as a function of the matrix of thickness values.
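The relation between transmission ratio, attenuation factor and thickness can be sketched with an exponential (Beer-Lambert-like) attenuation model T = exp(-a*L); this model is a common assumption, as the text only states that each thickness value is calculated as a function of the transmission ratio and the attenuation factor (all values illustrative):

import numpy as np

# Hypothetical matrices: expected entering intensity (first surface) and
# requested exiting intensity (second surface), per surface area element.
phi1 = np.array([[100.0, 100.0], [100.0, 100.0]])  # entering light
phi2 = np.array([[80.0, 50.0], [25.0, 10.0]])      # requested exiting light

T = phi2 / phi1  # matrix of light transmission ratios

# Invert T = exp(-a * L) for the thickness, with attenuation factor a:
a = 2.0             # per millimetre (illustrative)
L = -np.log(T) / a  # matrix of thickness values in millimetres

print(np.round(L, 3))  # thicker material where stronger attenuation is requested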

Possible embodiments of this sixth aspect are detailed at the following point "Example 11".

Various embodiments of the present disclosure relate also to embodiments of the operation of the control system. For example, a seventh aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with at least one light fixture. Specifically, in various embodiments, the method comprises:

- receiving one or more datasets for each of a plurality of artworks and storing each dataset in a database, each dataset comprising:
o data identifying a list of pigments of the respective artwork;
o data identifying the illumination of each pigment of the list of pigments during a given time period;
o data identifying the ageing of each pigment of the list of pigments during the given time period;
- receiving data identifying a list of pigments of the artwork to be illuminated;
- determining a maximum illumination threshold for the illumination of the artwork to be illuminated as a function of the list of pigments of the artwork to be illuminated and the datasets stored in the database; and
- controlling the illumination of the artwork to be illuminated in order to ensure that the illumination of the artwork corresponds to or is smaller than the maximum illumination threshold.
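A purely illustrative sketch of the threshold determination, assuming the stored datasets have already been condensed into a per-pigment illumination limit (pigments, values and units are hypothetical):

# Hypothetical per-pigment limits derived from the ageing datasets (lux):
PIGMENT_LIMITS_LUX = {
    "vermilion": 50.0,
    "prussian_blue": 150.0,
    "titanium_white": 300.0,
}

def max_illumination_threshold(pigments):
    """The artwork's maximum illumination threshold is governed by its
    most light-sensitive pigment (one possible decision rule)."""
    return min(PIGMENT_LIMITS_LUX[p] for p in pigments)

print(max_illumination_threshold(["prussian_blue", "vermilion"]))  # -> 50.0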

Possible embodiments of this seventh aspect are detailed at the following point "Example 10".

An eighth aspect of the present disclosure relates to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command. Specifically, in various embodiments, the method comprises the steps of:
- obtaining data identifying requested spectral characteristics;
- obtaining data identifying a viewer’s eye characteristics; and
- generating one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures as a function of the data identifying the requested spectral characteristics and the data identifying the viewer’s eye characteristics.
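A minimal sketch of one conceivable compensation scheme, assuming per-channel sensitivity values describing the viewer's eye (the actual correction model is not specified here, so this one is an assumption):

def adapt_to_viewer(requested_rgb, eye_sensitivity_rgb):
    """Boost each color channel inversely to the viewer's sensitivity,
    clipped to the fixture's maximum output of 1.0."""
    return [min(1.0, r / s) for r, s in zip(requested_rgb, eye_sensitivity_rgb)]

# A viewer perceiving blue at 70 % sensitivity (e.g. age-related lens
# yellowing) gets a compensated blue channel:
print(adapt_to_viewer([0.5, 0.5, 0.5], [1.0, 1.0, 0.7]))  # -> [0.5, 0.5, 0.714...]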

Possible embodiments of this eighth aspect are detailed at the following point "Example 2".

Various embodiments of the present disclosure relate also to solutions for selecting and/or operating light fixtures and/or light sensors. For example, a ninth aspect of the present disclosure relates to a method of selecting at least one light fixture by:
- obtaining data identifying characteristics of an artwork;
- obtaining data identifying characteristics of an exposition area; and
- determining a set of light fixtures and/or operating settings for a set of light fixtures as a function of the data identifying the characteristics of the artwork and the data identifying the characteristics of the exposition area.

Possible embodiments of this ninth aspect are detailed at the following point "Example 3".

A tenth aspect of the present disclosure relates to a method of selecting at least one light sensor for a lighting system used to illuminate at least one artwork in an exposition area via one or more light fixtures configured to emit light with variable characteristics as a function of a control command. Specifically, in various embodiments, the method comprises the steps of:
- obtaining a digital model of the exposition area, the digital model including:
o exposition area data comprising data identifying the dimension of the exposition area;
o artwork data comprising data identifying the position of the at least one artwork within the exposition area;
o light fixture data comprising data identifying the position, orientation and illumination characteristics of the one or more light fixtures; and
o background illumination data comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within the exposition area;
- executing a plurality of illumination simulations of the digital model of the exposition area by varying the illumination characteristics of the one or more light fixtures and/or the illumination characteristics of the other natural and/or artificial light sources, and determining for each illumination simulation data identifying a respective expected illumination of each of the at least one artwork; and
- determining a set of light sensors for monitoring the illumination of the at least one artwork as a function of the data identifying the expected illumination of the at least one artwork.
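A toy sketch of the simulation-and-selection loop, with a stand-in for the illumination simulation and one possible selection criterion (the candidate position observing the largest spread of expected illumination values); everything below is hypothetical:

import random

def simulate_illumination(fixture_level, daylight_level, position):
    """Stand-in for an illumination simulation of the digital model: returns
    the expected illumination of an artwork as seen from a candidate sensor
    position (entirely illustrative)."""
    return (fixture_level * position["fixture_gain"]
            + daylight_level * position["window_gain"])

candidates = [
    {"name": "near_window", "fixture_gain": 0.4, "window_gain": 0.9},
    {"name": "opposite_wall", "fixture_gain": 0.8, "window_gain": 0.1},
]

# Vary the fixture and daylight settings over several simulated scenarios:
random.seed(0)
scenarios = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(20)]
for c in candidates:
    values = [simulate_illumination(f, d, c) for f, d in scenarios]
    c["spread"] = max(values) - min(values)

# Select the placement that observes the most informative variation:
best = max(candidates, key=lambda c: c["spread"])
print(best["name"])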

Possible embodiments of this tenth aspect are detailed at the following point "Example 4".

BRIEF DESCRIPTION OF THE FIGURES

Embodiments of the present disclosure will now be described with reference to the annexed drawings, which are provided purely by way of non-limiting example and in which:

- Figure 1 shows an example of an exposition area comprising one or more artworks;

- Figure 2 shows an embodiment of an illumination system comprising at least one light fixture and a control system;

- Figure 3 shows an embodiment of an integrated illumination system comprising at least one light fixture and a control system;

- Figure 4 shows an embodiment of the illumination system of Figure 2 or Figure 3;

- Figure 5 shows an embodiment of a light fixture comprising a driver circuit and a lighting module;

- Figure 6 shows an embodiment of a lighting module;

- Figure 7 shows a first embodiment of a control system for an illumination system configured to control one or more light fixtures as a function of a viewer’s eye characteristics;

- Figure 8 shows a second embodiment of a control system for an illumination system configured to control one or more light fixtures as a function of a viewer’s eye characteristics;

- Figure 9 shows an embodiment of a method of selecting a set of light fixtures and displaying a preview of the illumination;

- Figures 10 and 11 show embodiments for selecting a set of light fixtures in the method of Figure 9;

- Figure 12 shows an example of a selected set of light fixtures;

- Figure 13 shows an embodiment of a method of selecting a set of light sensors for a given exposition area; and

- Figure 14 shows an example of a selected set of light sensors;

- Figure 15 shows a first embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture;

- Figure 16 shows an example of an intensity distribution on a surface of an object, as determined by the system of Figure 15;

- Figure 17 shows an example of a spatial position between a light fixture and an object;

- Figure 18 shows a flow chart of the operation of the system of Figure 15;

- Figure 19 shows a second embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture;

- Figures 20 and 21 show flow charts of the operation of the system of Figure 19;

- Figures 22 and 23 show a third embodiment of a system for monitoring the irradiation of an object with light generated by a light fixture;

- Figure 24 shows a flow chart of the operation of the system of Figure 23;

- Figure 25 shows an embodiment of a light fixture with individually controllable light sources;

- Figures 26, 28 and 29 show embodiments of light modules for the light fixture of Figure 25;

- Figure 27 shows an embodiment of optics adapted to be used with the light modules of Figures 26, 28 and 29; and

- Figure 30 shows a block diagram and Figure 31 a flowchart of the operation of the light fixture of Figure 25;

- Figure 32 shows a first embodiment of a light fixture comprising a regulated current generator and a data processing unit;

- Figure 33 shows a second embodiment of a light fixture comprising a regulated current generator and a data processing unit;

- Figure 34 shows a third embodiment of a light fixture comprising a regulated current generator and a data processing unit;

- Figure 35 shows a fourth embodiment of a light fixture comprising a regulated current generator and a data processing unit;

- Figure 36 shows a fifth embodiment of a light fixture comprising a regulated current generator and a data processing unit;

- Figure 37 is a flowchart showing embodiments of the operation of the data processing unit of Figures 32 to 36;

- Figure 38 shows a lighting system configured to monitor ageing of an artwork;

- Figure 39 shows an embodiment of the operation of the lighting system of Figure 38.

- Figures 40A, 40B and 40C show embodiments of the optics of a light fixture;

- Figures 41A, 41B and 41C show embodiments of the illumination of an artwork;

- Figure 42 shows an embodiment of the optics of a light fixture comprising a translucent optical element;

- Figure 43 shows an embodiment of a method of producing the translucent optical element of Figure 42;

- Figure 44 shows an example of the material of a translucent optical element; and

- Figures 45 A, 45B, 46A and 46B show embodiments of the translucent optical element of Figure 42.

DETAILED DESCRIPTION

In the following description, numerous specific details are given to provide a thorough understanding of embodiments. The embodiments can be practiced without one or several specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the embodiments.

The headings provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.

In the following Figures 2 to 46, parts, elements or components which have already been described with reference to Figure 1 are denoted by the same references previously used in such Figures; the description of such previously described elements will not be repeated in the following in order not to overburden the present detailed description.

As described in the foregoing, various embodiments of the present disclosure relate to illumination/lighting systems adapted to be used for illuminating artworks in an exposition area, such as a room of a museum or a gallery, or other rooms, e.g. of historical buildings, e.g. religious buildings, or even outdoor environments. Moreover, in general, the disclosed lighting systems may also be used to illuminate home or industrial areas and may also be used to illuminate other objects and/or persons.

In general, the term room refers to any area having a floor and one or more walls. Accordingly, the term room includes both closed environments (e.g. having four walls and an entrance/exit) and open environments, such as an open stage. Moreover, the term does not necessarily imply a room of a building, e.g. having brick or cement walls, but also refers to temporary installations, such as at a fair. Accordingly, a room may be either inside a building or outdoors. Finally, when referring to the room height, the respective height may correspond to the height of a ceiling of the room, or to infinity in the case of a room without a ceiling, such as an open stage, wherein light fixtures may be installed on separate support structures.

Figure 2 shows a first embodiment of a lighting system 100. In the embodiment considered, the illumination system 100 comprises a control system 130 and one or more (i.e. a given number n of) light fixtures 1101..110n.

In various embodiments, the illumination system 100 also comprises one or more (i.e. a given number m of) sensors 1201..120m. For example, the sensors 120 may be configured to measure parameters related to the light fixture, the illuminated object 140 (e.g. damage to or certain properties of the object), the environment, or the person 150 visiting the object 140.

In various embodiments, the control system 130 is configured to control one or more parameters of the light fixtures 1101..110n. For example, the control system 130 may be configured to switch on or switch off one or more of the light fixtures 1101..110n. In general, the control system 130 may be configured to modify the illumination, such as the intensity, color, color temperature, illumination pattern, etc., based on pre-defined rules, signals received from the sensors 1201..120m or configuration data received via a user interface and/or a communication interface. For example, the control system 130 may perform an automated calibration of the light emitted by the light fixtures 1101..110n, assist a user in configuring the light emitted by the light fixtures 1101..110n, or even perform an automated mesh-like configuration of a network of light fixtures 1101..110n. Moreover, during operation, the control system 130 may be configured to vary the light emitted by the light fixtures 1101..110n as a function of data received via the user interface and/or the communication interface and/or the sensors 120 in order to permit an interaction with a user.

Figure 3 shows an embodiment of an integrated lighting system 100, wherein the control system 130 may be integrated with a light fixture 110. In various embodiments, one or more sensors, e.g. a sensor 1201, may be integrated with the light fixture 110 and/or one or more sensors, e.g. a sensor 1202, may be external with respect to the light fixture.

Figure 4 shows an embodiment of the various blocks of the illumination systems 100 shown in Figures 2 and 3.

As mentioned before, the illumination system 100 comprises at least one light fixture 110, a control system 130 (possibly integrated with the light fixture) and optionally one or more sensors 120.

In various embodiments, in order to exchange data between the various circuits, the light fixture 110, the control system 130 and the optional sensor 120 comprise a respective communication interface 111, 131 and 121. In general, any wired or wireless interface may be used. Moreover, any digital or analog communication may be used for exchanging data between the various circuits.

For example, in an integrated system as shown in Figure 3, the control system 130 may measure an analog signal provided by a sensor and/or the control system 130 may vary an analog control signal of the light fixture 110, such as a voltage or current signal indicative of a requested brightness.

Possible digital wired communication interfaces 111, 121 and 131 may include a Digital Addressable Lighting Interface (DALI), an Ethernet interface, and/or a Controller Area Network (CAN) bus.

Possible wireless communication interfaces 111, 121 and 131 include Bluetooth®, Zigbee®, Wi-Fi (i.e. wireless communication according to one of the IEEE 802.11 standards), a mobile communication interface, such as a General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS) or Long-Term Evolution (LTE) communication module, and/or optical wireless communication, e.g. via a modulation of visible and/or infrared (IR) radiation.

In general, each of the communication interfaces 111, 121 and 131 may also comprise a plurality of communication interfaces.

Thus, in general the control system 130 may be connected to the light fixtures 110 and optionally to the sensors 120 via any suitable dedicated or shared communication channel. For example, in various embodiments, a communication network is formed between the interfaces 111, 121 and 131, wherein each of the communication interfaces 111, 121 and 131 has associated a respective address. In general, the network may comprise a Local Area Network (LAN) and/or a Wide Area Network (WAN), such as the Internet.

Accordingly, in various embodiments, the interfaces 111, 121, 131 are configured to connect light fixtures 110 with other light fixtures 110, sensors 120 and control systems 130. Alternatively or in addition, the interfaces 111, 121, 131 may be configured to connect sensors 120 with light fixtures 110, other sensors 120 and control systems 130. Further alternatively or in addition to either or both the preceding, the interfaces 111, 121, 131 may be configured to connect the control systems 130 with light fixtures 110, sensors 120 and other control systems 130.

In general, the communication interfaces 111, 121, 131 may be used only to exchange data, or the communication interfaces may also provide energy, e.g. to the light fixture 110 and/or the sensor 120.

As shown in Figure 5, in general, a light fixture 110 comprises at least one lighting module 118 comprising one or more light sources 117 for the illumination of an object 140 or a person 150. In general, the illumination provided by the light fixture 110 may be static or dynamic, i.e. the intensity or the color or the beam spread can change over time. In various embodiments, the light fixture 110 is configured to provide an illumination with a CRI (Color Rendering Index) of 93 or higher.

In various embodiments, the light fixture 110 may comprise a mounting feature to mount the light fixture to a wall 163 or ceiling 161, a track or a frame. It can be fixedly installed or portable. In various embodiments, the light fixture 110 may be produced so that its parts can easily be replaced; for example, it can be structurally configured in a modular arrangement.

In various embodiments, the one or more light sources 117 may be selected from the following group of light sources or a combination thereof: a light-emitting diode (LED), including a phosphor-conversion LED (pc-LED) using a fluorescent and/or phosphorescent substance for conversion, a laser diode (LD), a laser-activated remote phosphor (LARP) light source, an organic light source such as an OLED, or a quantum-dot-based light source.

In various embodiments, the one or more light sources 117 are configured to emit radiation in the wavelength range between UV and infrared, but preferably in the visible range, i.e. between 400 and 800 nm. In various embodiments, the one or more light sources 117 are of a single color, e.g. white. Preferably, the light module comprises a plurality of light sources having different colors, such as red, green and blue, to create white light. The light module 118 can also comprise white light sources of different color temperatures. In various embodiments, the light emitted by the light sources 117 may be combined by the optics 115, e.g. to form a homogenous beam of one color. For example, the optics 115 may include one or more of the following elements: a lens and/or reflector, means for reducing glare, a diffuser or diffusive layer, optical filters for color changing, and/or a framer or shutter.

For example, in various embodiments, the diffuser has an inhomogeneous distribution of diffusing strength along the diffusive area. For example, the diffusion may be homogenous on a certain area of the diffuser and stronger, i.e. more highly diffusing, on the side where the illumination of a first light fixture overlaps with the illumination of another light fixture. The other light fixture may have a diffuser with stronger diffusion on the overlapping edge as well, thus resulting in a more homogenous illumination in the overlapping area. The different diffusion strengths can be realized, e.g., by the form of the diffuser and/or the distribution of scattering materials in the diffuser.

In various embodiments, the light fixture 110 may also include one or more actuators 114. For example, the actuator 114 may comprise a motor which can modify the orientation of the light fixture 110 or the position of parts of the light fixture 110 like the optics 115 or the light source 117 to achieve a flexible beam angle.

Accordingly, the actuator(s) 114 are configured to change the position of the light fixture 110 or of parts of the light fixture 110, such as the optics 115. Thus, the optics may have a fixed position relative to the light module, or the position can be changed using an actuator. Accordingly, in various embodiments, the optics 115 may be configured to perform a beam shaping, and the actuators 114, such as motors, may be configured to modify the orientation of the light fixture 110 and/or the light sources 117, or the position, shape or properties of the optics 115. This can comprise changing the angle relative to a reference plane or moving the light fixture and/or the optics in a certain direction or to a certain spatial position. It can even be possible to change the form of the optics 115 and/or the light fixture 110 using an actuator 114.

For example, in various embodiments, an actuator 114 is configured to modify the form of a framer. A framer is understood in the art as a shutter system that lets pass the light which fits the contour of the artwork but shuts out the rest of the light. As an example, art lighting is conventionally provided by a spotlight and emits some form of circular illumination. However, certain artworks, especially two-dimensional artworks such as paintings, are commonly rectangular. Thus, shutters can be added at the edges of the luminaire that shut out the round edges of the illumination, resulting in a projected rectangular beam. The position of the shutters is not fixed, so they are adaptable to the size of the painting to be illuminated. In some more customizable embodiments, a framer can have any form, so that the contour of the artwork is met, such as e.g. the silhouette of a three-dimensional sculpture such as that of a person. A commercial embodiment of such a customizable framer is available from Osram GmbH (Munich, Germany) under the trade designation “Applaud” (available at www.art-centric-lighting.com/product/applaud/), which is an LED spotlight that has not only a “gobo” holder but also a beam shaper. Such a customizable framer is also referred to as a “gobo”. The term “gobo” is a term in the art, coined from the phrase “Go Between Optics”, referring to the location where the filter-like element is positioned in the light path of a lighting fixture. An exemplary gobo is a stenciled circular disc used in lighting fixtures to create a projected image or pattern. Conventional gobos are made of a variety of materials depending on their purpose: for example, made of sheet metal and referred to as “black and white”; or made of glass with a thin, etched layer of aluminum, which can be “colored” to function as a color filter; or made of plastic, typically for use with LED fixtures.

In various embodiments, the light module may be configured to generate a pixelized light (“digital gobo”) and may e.g. comprise an array of light sources. For example, each pixel of the array can include a single LED, more than one LED, or one or several mini-LEDs or micro-LEDs. Each pixel and/or each LED can be controlled independently. Alternatively, a liquid-crystal based system (e.g. LCoS, liquid crystal on silicon) can be used to generate a pixelized light. The liquid crystal system can be used in transmission or reflection. Furthermore, a device referred to in the lighting arts as a Digital Micromirror Device (DMD) could be used as well to create a pixelized light. For example, with a pixelized light it is possible to project a defined pattern on the object without using mechanical means, such as a mechanical framer, to limit the geometrical extent of the illumination pattern. The boundaries of the illumination can be set using sensors, or they can be set according to user definitions. Using a digital gobo can provide highly specific illumination for 3D objects. The form of the object to be illuminated can be provided using a depth camera (such as a time-of-flight (TOF) camera). For example, one of the sensors 120 may be such a depth camera.
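
By way of illustration only, the following Python sketch shows how a binary on/off mask for such a pixelized light could be derived from a depth image, under the assumption that the object to be illuminated is closer to the depth camera than the background; the function name and threshold value are illustrative and not part of the specification.

    import numpy as np

    def gobo_mask(depth_m: np.ndarray, max_object_distance_m: float) -> np.ndarray:
        # True where the pixel of the pixelized light should emit, i.e.
        # where the depth camera sees the (closer) object rather than the wall.
        return depth_m < max_object_distance_m

    # Assumed 4x4 depth map in metres: an object at ~1.5 m in front of a wall at ~3.0 m.
    depth = np.array([[3.0, 3.0, 3.0, 3.0],
                      [3.0, 1.5, 1.5, 3.0],
                      [3.0, 1.5, 1.5, 3.0],
                      [3.0, 3.0, 3.0, 3.0]])
    print(gobo_mask(depth, max_object_distance_m=2.0).astype(int))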

In various embodiments, the lighting module 118 may thus have associated a driver circuit 116 configured to control the power provided to the lighting module 118, and in general the operation of the lighting module 118. For example, based on the type of lighting module used, the driver 116 may provide a DC or AC current to the light source(s) 117. It can drive the light sources 117 using pulse width modulation (PWM) or any other pulse pattern modulation, or current modulation or a combination thereof. It can dim the light sources 117 individually and adjust the color temperature and the color of the light module or any other parameter relevant for the light quality, such as the CRI or the spectrum, in particular the spectral power distribution.

For example, Figure 5 shows an embodiment of a driver circuit 116 and a lighting module 118. Specifically, in the embodiment considered, the driver circuit 116 is an AC/DC or DC/DC electronic converter. Therefore, the electronic converter 116 includes two input terminals 116a and 116b for the connection to an AC or DC power supply, such as the mains, and two output terminals 116c and 116d for connection to one or more lighting modules 118. In general, based on the type of the lighting module, the electronic converter 116 may be either a voltage generator or a current generator. Similarly, the lighting module 118 may be configured to be supplied with a regulated voltage or current. Accordingly, the electronic converter 116 may receive at input, via the terminals 116a and 116b, e.g. an alternating voltage Vin,AC, such as 110 or 230 VAC, and supply at output, via a positive terminal 116c and a negative terminal 116d, a regulated voltage Vout, such as e.g. 12 or 24 VDC, or a regulated current iout.

Figure 6 shows an embodiment of a lighting module 118. Specifically, the lighting module 118 includes a positive input terminal 118a and a negative input terminal 118b, for the connection to the terminals 116c and 116d of the electronic converter 116. For example, the lighting module 118 may be connected, either directly or through a cable, to the electronic converter 116.

For example, in the embodiment considered, the lighting module 118 is a LED module including one or more LEDs (or laser diodes) 117, connected between the terminals 118a and 118b. For example, the module 118 may include a LED chain or string 117, wherein a plurality of LEDs 117₁ and 117₂ (or similarly laser diodes) are connected in series.

If the lighting module 118 is supplied with a regulated voltage, the lighting module 118 typically includes a current regulator 118c, connected in series with the LED string 117. In the simplest of cases, the current regulator 118c may be a resistor or a linear current regulator. The current regulator 118c may also be implemented by current mirrors or by a switched mode current source, typically including an inductor and an electronic switch.
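
As a purely illustrative worked example for the simplest case mentioned above, the series resistor may be sized as R = (Vout − n·Vf)/iout; the supply voltage, LED forward voltage and string current in the following Python sketch are assumed values, not values from the specification.

    def ballast_resistor(v_out: float, v_forward: float, n_leds: int, i_led: float) -> float:
        # R = (Vout - n * Vf) / iout for a simple resistive current regulator 118c.
        return (v_out - n_leds * v_forward) / i_led

    # Assumptions: 24 VDC supply, six LEDs with Vf = 3.1 V each, 350 mA string current.
    print(round(ballast_resistor(24.0, 3.1, 6, 0.35), 1))  # -> 15.4 (ohm)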

Generally speaking, also a plurality of lighting modules 118 may be connected to the electronic converter 116. For example, if a regulated voltage Vout is used, the lighting modules 118 may be connected in parallel to the terminals 116c and 116d. On the other hand, if a regulated current iout is used, the lighting modules 118 are typically connected in series between the terminals 116c and 116d.

The various topologies of either non-insulated (boost, buck, buck-boost, etc.) or of insulated (flyback, forward, resonant, etc.) electronic switching converters are well known.

For example, Figure 7 shows an embodiment of a generic switched mode power supply/electronic converter 116 for a lighting module. In the embodiment considered, the electronic converter/driver 116 includes a switching stage 116h and, in case of an AC/DC converter, also a rectification circuit 116f. Specifically, the input of the rectification circuit 116f, such as e.g. a diode bridge, is connected (e.g. directly) to the terminals 116a and 116b. Therefore, the rectification circuit 116f receives at input the input voltage Vin,AC and provides at output a DC voltage Vin,DC. In general, between the input terminals 116a and 116b and the rectification circuit 116f may also be provided an input filter circuit 116e, configured to filter the noise produced by the electronic converter 116. Moreover, between the rectification circuit 116f and the switching stage 116h may be provided a filter circuit 116g, such as e.g. a capacitor connected in parallel with the output terminals of the rectification circuit 116f. Therefore, in this case, the filter circuit 116g receives (e.g. directly) the voltage Vin,DC and provides at output a filtered voltage, typically called a bus voltage Vbus. In this case, therefore, the switching stage 116h receives at input the voltage Vbus. As an alternative, or in addition to the filter circuit 116g, there may be provided an electronic converter with power factor correction (PFC).

In various embodiments, the switching stage 116h includes one or more electronic converters, adapted to control the current flow through a reactive element R116. Typically, such a reactive element R116 is a resonant circuit, including one or more inductive elements L116, such as inductors, and one or more capacitive elements C116, such as capacitors. In various embodiments, the switching stage 116h is configured to apply an alternating voltage to the reactive circuit R116. For example, the switching frequency of the stage 116h may be in a range between 1 kHz and 500 kHz, preferably between 20 kHz and 200 kHz. In various embodiments, a further filter circuit 116i may be provided, which is connected between the switching stage 116h and the output of the converter 116.

In various embodiments, the switching stage 116h is driven by a control circuit 116m, i.e. the control circuit 116m is configured to generate one or more drive signals DRV116 for driving the switching stage 116h, so as to regulate the output voltage Vout or the output current iout to a desired value. The control circuit 116m may be any analog or digital circuit.

As shown in Figure 7, for this purpose the driver 116 may comprise a feedback circuit 116k configured to provide a feedback signal FB116 which is determined as a function of the output voltage Vout (for a voltage generator) or of the output current iout (for a current generator). For example, the operation of such an electronic converter is described in PCT/IB2016/055923, the content whereof is incorporated in the present specification by reference.

Accordingly, the control circuit 116m may be configured to generate the one or more drive signals DRV116 until the feedback signal FB116 corresponds to a requested value, e.g. indicative of a requested power supply to be provided to the lighting module 118.

Thus, by regulating the average power supply provided to the lighting module 118, the control circuit 116m may regulate the brightness of the light emitted by the lighting module 118. In various embodiments, the control circuit 116m may be configured to vary the average power supply by:

- varying the instantaneous power supply, e.g. by varying the reference value for the feedback signal, thereby performing an amplitude modulation of the current provided to the light source 117; and/or
- enabling/disabling the output of the electronic converter 116, e.g. by using a pulse width modulation (PWM) or any other pulse pattern modulation, wherein the pulse pattern modulation has a frequency which is smaller than the switching frequency of the switching stage 116h.
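
The following Python sketch illustrates, under idealized assumptions (lossless converter, thermal and resistive effects ignored), how both strategies yield the same average power; all names and values are illustrative.

    def avg_power_amplitude(v_out: float, i_ref: float) -> float:
        # Amplitude modulation: vary the current reference of the feedback loop.
        return v_out * i_ref

    def avg_power_pwm(v_out: float, i_nominal: float, duty: float) -> float:
        # Pulse pattern modulation: gate the converter output with duty cycle 0..1.
        return v_out * i_nominal * duty

    # Both reach ~4.2 W average for an assumed 24 V / 350 mA module dimmed to 50 %.
    print(avg_power_amplitude(24.0, 0.175))  # 4.2
    print(avg_power_pwm(24.0, 0.35, 0.5))    # 4.2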

In general, the driver circuit 116 may also comprise a plurality of switching stages (with respective feedback loops) for providing a different power supply to a plurality of light sources. For example, by individually varying the power supply provided to light sources having different colors, the driver circuit 116 is able to perform a color mixing operation. In various embodiments, the control circuit 116m may receive data identifying the requested power supply from the control system 130 (e.g. via the communication interfaces 111 and 131) or from a data processing unit (DPU) 113 of the light fixture 110, such as a digital processing unit, e.g. a microprocessor programmed via software instructions. In general, the data processing unit 113 may control the driver 116 in collaboration with the control system 130 or independently from the control system 130. Moreover, the control circuit 116m may also vary the requested power supply as a function of the data provided by a sensor 120, such as an ambient light sensor.

Accordingly, in various embodiments, the illumination generated by the light fixture 110 may be controlled by the control system 130 via the driver 116, wherein preferably the individual light sources 117 may be controlled independently.

As mentioned before, in various embodiments, the light fixture 110 may also include a data processing unit 113 and/or a data storage device (DSD) 112, e.g. having stored the software to be executed by a microprocessor 113. In general, the processing unit 113 may also be configured to implement (at least in part) the operation of the control circuit 116m. Similarly, in case of an integrated lighting system, the processing unit 113 may implement (at least in part) the control system 130.

Figure 4 also shows an embodiment of a sensor. Specifically, as mentioned before, the sensor 120 comprises an interface 121 for providing an analog signal or digital data to the control system 130 and/or the light fixture 110. As mentioned before, the sensor 120 may also be part of the light fixture 110, but it can also be separate from the light fixture 110 or it can be part of another device like a smartphone or a guiding-system used by a visitor in a museum. An example of a guiding-system used by a visitor in a museum is a virtual reality headset with transparent lenses such as that sold by Microsoft Corp. under the trade designation or trademark “HoloLens”. The sensor can be used to measure features of the light fixture. The sensors can also be used to monitor an object or a person.

Generally, any sensor 120 may be used, such as a resistive, a capacitive, an inductive, a magnetic, an optical (e.g. a camera or spectral sensor), an acoustic and/or a chemical sensor.

In various embodiments, the sensor 120 may be configured to process the information transmitted via the interface 121. For example, for this purpose, also the sensor 120 may comprise a data processing unit 123, such as a digital processing unit, such as a microprocessor programmed via software instructions, and optionally a data storage device 122, e.g. having stored the software to be executed by a microprocessor 123.

In various embodiments, the sensor 120 may comprise one or more actuators 124, such as motors, e.g. controlled by the processing unit 123. For example, an actuator 124 may be configured to change the position of the sensor 120 or of parts of the sensor 120. This can comprise changing the angle relative to a reference plane or moving the sensor in a certain direction or to a certain spatial position. Thus, in various embodiments, the control system 130 is configured to receive data from the sensor 120 and control the light fixture 110 and optionally the sensor 120. For example, the data can be analyzed by the data processing unit 133 and/or stored in the data storage device 132. Thus, in various embodiments, the sensors 120 may measure data indicative of an object 140 and/or a person 150. The control system 130 may then analyze the data and set the illumination of the light fixture 110 accordingly to illuminate the person 150 and/or the object 140.

For example, in various embodiments, the sensor 120 is configured to measure parameters of an object 140 and/or a person 150. One parameter could be the form of an object 140, so that the light fixture can be set to illuminate only this form. Another parameter could be the reflectance of the object 140, or of parts of the object 140, and its temporal change, in order to identify damage to the object.

In various embodiments, the sensor 120 is configured to detect the presence, behavior, gestures, voice or actions of a person 150. The signals might be used to modify settings of the light fixture (e.g. to increase the light intensity when a person approaches the object) or to provide a warning to security personnel (when a person comes too close to the object).

In various embodiments, the sensor 120 is configured to react to external stimuli (e.g. sound or music) so that the light fixture sets its illumination accordingly.

The settings for the light fixture 110 may also be derived from data or settings stored in the data storage device 112 and/or 132. For example, in various embodiments, the data are compared to values/parameters stored in the data storage device 112 and/or 132.

In various embodiments, the sensor 120 may comprise a photometric sensor or a camera to measure parameters like the intensity, the color temperature, or a 2-dimensional distribution of these parameters of the illumination system, the ambient light or the intensity of light reflected by the painting. The measurement may include maximum values and time-integrated values. For example, in this way, the sensor may be used to detect critical illumination that might damage the object.

In various embodiments, the sensor 120 is used to measure the on-time of the system, i.e. the overall illumination as well as the illumination at certain wavelengths. For example, for this purpose the sensor may use the processing unit to analyze the data and the data storage device 122 for storing the processed data.

In various embodiments, the sensor is used for predictive maintenance. For example, the sensor 120 may provide data which permit the system to detect or predict a failing light module or a failing light source. The term “failing” can be understood as an excessive excursion from a desired parameter. Failing in the context of the present embodiments can mean that the light intensity is reduced or the light spectrum has changed; it can also mean that the light intensity is undesirably increased, which could damage an object.

In various embodiments, the sensor 120 is configured to detect deteriorating, ageing or damaging of an object 140.

In various embodiments, the sensor 120 is configured to measure temperature, humidity and/or chemical components like pollutants or other compounds that could damage the artwork. The sensor can be connected to an HVAC system to control temperature and humidity, either directly or via the control system.

Figure 4 also shows an embodiment of the control system 130. The control system may comprise a data processing unit 133, such as a digital processing unit, such as a microprocessor programmed via software instructions.

In various embodiments, the control system comprises a data storage device 132, e.g. having stored the software to be executed by a microprocessor 133, i.e. the control system can comprise a computer program configured for performing a method of controlling and/or calibrating the light source 117 and/or optics 115 and/or lighting fixture 110 and/or sensor 120, wherein the computer program can be stored in the data storage device.

Accordingly, the control system 130 may be implemented with any processing system, possibly also including distributed processing systems, such as cloud-based systems, e.g. including a personal computer, a local and/or remote server, and/or a mobile device, such as a smartphone or tablet. Accordingly, the control system may be available locally, e.g. on a server or a smartphone, it can also be installed non-locally, i.e. in the cloud. Parts of the control system can be distributed among several units (e.g. cloud and smartphone) and interact via the interface(s) 131.

The data storage device 132 may also be used to store and retrieve settings and parameters for the light fixture 110, settings and parameters for the sensor, and data collected by the sensor. The data storage device can also contain pre-defined scenes for the illumination (e.g. settings for a certain artist, a certain object or a certain epoch). The pre-defined scenes can include settings for the overall color temperature of the light fixture or settings to highlight certain aspects of an object, e.g. when using a light fixture with a pixelized light. Highlighting can be understood as increasing the intensity on the object in a certain area; or using a marker, e.g. an arrow, to point to a certain area on the object; or providing explanatory commentary such as text or symbols.

In various embodiments, the control system 130 is configured to control the light fixture 110 and/or the sensor 120 and collect data measured by the sensor 120. As mentioned before, the control system 130 may comprise for this purpose a communication interface 131. For example, the interface 131 may connect the control system 130 with the light fixture 110, the sensor 120 and other control systems (which can include other users or control systems like HVAC).

In various embodiments, the control system 130 comprises a user interface 134, such as one or more visual, acoustic or haptic indicators, buttons, etc. A user can also be a visitor of a museum or a show. For example, in various embodiments, the user interface 134 may be implemented with a touchscreen.

Accordingly, the user interface 134 may be a graphical user interface; it can also use sensor data, e.g. to interpret gestures or behavior of a person, or voice commands. It can comprise a display to communicate with the user. The user interface may be implemented as an application for a mobile device, such as a smartphone, a tablet, a mobile computer or similar devices. As mentioned before, the control system 130 may control several light fixtures 110. These light fixtures 110 might illuminate different objects (i.e. each light fixture illuminates one object 140), or several light fixtures 110 may illuminate one object 140 (e.g. from different sides, or because the object is too large to be illuminated by one light fixture). The light of the light fixtures 110 can be aligned so that the illuminated areas do not overlap but also do not show non-illuminated borders between the illuminated areas. Alternatively, the illumination areas can overlap. In this case, the intensity of the LEDs in the overlapping area can be reduced so that the overall light intensity on the objects is homogenous and equal, for example as in the sketch below. The overlapping area can be detected by a sensor system, or the information can be provided by a user using a user interface.
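
A minimal sketch of such an intensity reduction, assuming a one-dimensional overlap region and a linear cross-fade between the two fixtures, is given below; the weighting scheme is an illustrative choice, not one prescribed by the specification.

    def blend_weights(x: float, overlap_start: float, overlap_end: float) -> tuple[float, float]:
        # Return (weight of fixture A, weight of fixture B) at position x;
        # the weights always sum to 1, keeping the total intensity homogenous.
        if x <= overlap_start:
            return 1.0, 0.0
        if x >= overlap_end:
            return 0.0, 1.0
        t = (x - overlap_start) / (overlap_end - overlap_start)
        return 1.0 - t, t

    for x in (0.0, 0.45, 0.5, 0.55, 1.0):   # positions across the two beams
        print(x, blend_weights(x, 0.4, 0.6))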

In various embodiments, the control system 130 is configured to exchange settings with other control systems 130, e.g. via the interface 131. For example, the settings for the illumination of a certain painting can be transferred to another museum if the artwork is lent to that museum, to ensure that it will be illuminated with the right parameters.

In various embodiments, the control system 130 is configured to adjust parameters of the light fixture 110 and the sensor 120 as a function of data received via the user interface 134 or the communication interface 131. The change can be done while the user is near the light fixture 110 or remotely, such as over the Internet. The change can, for instance, be done using an application running on a smartphone. In this way, a user can, for example, set the color temperature in Kelvin or the intensity in lumen, or modify the light spectrum of the light fixture. The user can also set the beam shape, and the control system will control the actuators accordingly. Accordingly, in various embodiments, the control system 130 is configured to set one or more spectral characteristics of the light emitted by the light fixture(s) 110. For example, in various embodiments, the control system 130 is configured to vary the spectrum considering an eye-sensitivity curve of a user and/or the age-dependent eye-sensitivity of a user.

In various embodiments, the control system 130 is configured to allow the user to define a schedule for the light fixture 110 and/or the sensor 120. For example, the light can be switched on at a pre-defined time, the color temperature can change over the day (considering that certain illuminations might damage an object, thus restricting the circadian illumination in intensity and/or with respect to certain wavelengths), or the light can be dimmed at certain times.

In various embodiments, the control system 130 is configured to receive via the interfaces 131 and/or 134 data identifying the artwork 140, such as the epoch, the object or the artist, and the control system 130 may propose settings for the light fixture 110 and/or the sensor 120. It can store these pieces of information in the data storage device for later usage. In another aspect, a camera can take a photograph of the object 140, and the control system 130 may analyze the image and identify the object, the artist and the epoch, and, on the basis of one or a combination of these sensed inputs, propose settings for the light fixture 110 and/or the sensor 120. Of course, the proposed settings can be overridden by the user using the user interface.

In various embodiments, the control system 130 is configured to take sensor data into account when setting the parameters for the light fixture 110. For example, the sensor 120 can measure the ambient light and the control system 130 can reduce the intensity of the light fixture so that the desired light intensity, considering either the overall intensity or the intensity at certain wavelengths, illuminates the object. Alternatively, the sensor 120 measures the spectrum of the light and the control system 130 sets the drivers 116 of the light fixtures 110 so that a desired overall color or color temperature is reached (either measured directly at the luminaire or measured at the picture, such as through its luminance).
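
A minimal Python sketch of this ambient-light compensation is given below; the target illuminance and the clamping to the fixture range are illustrative assumptions.

    def fixture_setpoint_lux(target_lux: float, ambient_lux: float,
                             max_fixture_lux: float) -> float:
        # The fixture only has to contribute the difference between the desired
        # illuminance on the object and the measured ambient contribution.
        required = target_lux - ambient_lux
        return min(max(required, 0.0), max_fixture_lux)

    # Assumed: 50 lx desired on a sensitive work, 18 lx ambient daylight measured.
    print(fixture_setpoint_lux(50.0, 18.0, 200.0))  # -> 32.0 lx from the fixture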

In various embodiments, the control system is configured to provide methods to minimize the possible damage to an object 140 due to the illumination by measuring and analyzing the direct light emitted by the light fixture 110, the light reflected by the object 140 or the ambient light, either in absolute values or measuring values relative to a defined target value. To achieve this, the control system 130 may e.g. analyze the sensor data (e.g. intensity overall or at certain wavelengths), it can store the critical parameters over time and integrate them, or it can calculate the intensity at the object using the light distribution of the light fixture and the geometry between light fixture and object. The control system 130 may compare actual data with data stored some time ago (e.g., days, weeks, months, years) in the data storage device. Preferably, the light exposure is monitored cumulatively, somewhat analogously to the concept in health physics and radiation protection known as “dosimetry” to assess absorption of radiation over time.
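
The cumulative monitoring may be illustrated by the following Python sketch, which integrates periodic illuminance samples into lux-hours, in the spirit of the dosimetry analogy; the class name and sampling interval are illustrative assumptions.

    class ExposureDosimeter:
        def __init__(self) -> None:
            self.lux_hours = 0.0

        def add_sample(self, lux: float, interval_h: float) -> None:
            # Integrate illuminance over the sampling interval (rectangle rule).
            self.lux_hours += lux * interval_h

    meter = ExposureDosimeter()
    for lux in (50.0, 50.0, 48.0, 52.0):   # one assumed sample per hour
        meter.add_sample(lux, interval_h=1.0)
    print(meter.lux_hours)                 # -> 200.0 lux-hours accumulated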

In various embodiments, the data relevant for the safety of the artwork may be stored in the data storage device 112, 122 or 132. Preferably, the data are stored in the data storage device 132 of the control system 130 and/or database which tracks the cumulative life-exposure of the object 140. In addition, other parameters related to possible effects on an object such as humidity, temperature and/or vibrations or shocks during transportation can also be stored in the data storage system and/or database. These data can be referenced by using one or a combination of the name of the object, some other reference provided by the user, or a signature derived from the object itself (e.g. colors and shapes measured by the camera), thus providing a digital fingerprint. The stored data can be passed on if the object should be illuminated with a different light fixture or at a different place. The digital fingerprint, especially a digital fingerprint created under normalized illumination conditions, can be used to identify the object.

In various embodiments, the control system 130 is configured to perform actions to protect objects 140 from being damaged by the illumination, e.g. by comparing the measured values and time-integrated values with threshold values for the respective object. The actions can comprise one or a combination of reducing the illumination intensity, switching the illumination off, or providing an alarm signal.
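
A minimal sketch of such a comparison is given below; the threshold values and the mapping of conditions to actions are illustrative assumptions.

    def protect(lux: float, lux_hours: float,
                max_lux: float, max_lux_hours: float) -> str:
        # Compare the instantaneous value and its time integral with the
        # per-object thresholds and select a protective action.
        if lux_hours >= max_lux_hours:
            return "switch_off"        # cumulative exposure budget exhausted
        if lux > max_lux:
            return "reduce_intensity"  # instantaneous level too high
        return "ok"

    print(protect(lux=62.0, lux_hours=1.2e5, max_lux=50.0, max_lux_hours=5.0e5))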

In various embodiments, the control system 130 is configured to perform actions based on user behavior. For example, the control system 130 may change the illumination level in response to the presence or absence of a user: when a person approaches the object, the intensity may be increased quickly, and the intensity may be reduced slowly when the person has left. The control system 130 may also provide a warning to the security personnel, e.g. when a person comes too close to the object.

In various embodiments, the control system 130 is configured to track how a person 150 is moving through the exposition area and/or a sequence of exposition areas, e.g. rooms of a building, thus e.g. analyzing which objects 140 are of more interest to the visitors of a place.

In various embodiments, the control system is configured to localize a person in an exposition area and provide additional information about an object 140, e.g. by projecting it on the wall 163, or through a downloadable software application (“app”) on a smartphone, or by using augmented or virtual reality.

In various embodiments, the control system 130 is configured to store the settings of the light fixture 110 either locally or in the cloud to restore them quickly after a power failure.

In various embodiments, the control system 130 can support the commissioning of the light fixtures 110 and/or the sensors 120. For example, in various embodiments, the user scans a bidimensional bar-code, such as a QR code, on the luminaire, assigns a name (instead of using an encryption code), and defines the position on a map of the building.

In various embodiments, the light fixture 110 is installed and a camera, which could be attached to a light fixture 110 or stand-alone, takes a picture; the object 140 is then identified, and the position of the light fixture 110 can be derived from the known position of the object 140.

In various embodiments, the control system 130 is configured to collect data about the lifetime of the light fixtures 110 and sensors 120 and/or detected damages. Artificial intelligence (“AI”) can be used to detect degradation of a light fixture based on the sensed data. The control system 130 may provide this information graphically to the user, e.g. by presenting the floor map and highlighting damaged fixtures or fixtures close to their end-of-life. This information could also be used by the supplier to provide a Light as a Service (LaaS) maintenance program to repair and/or replace lighting.

In various embodiments, the control system 130 is configured to distribute firmware updates to the light fixtures 110 and the sensors 120.

In various embodiments, the control system 130 provides an application programming interface (API) for controlling the illumination system by a third party and/or for third party data integration, e.g. settings for the light fixture or information about the artwork that can be presented to a user. The API can also be used to control the illumination system using software (e.g. an app) on a third-party device such as a smartphone.

In various embodiments, the control system 130 uses machine learning (also referred to as artificial intelligence) and/or data mining to optimize the illumination settings for a given object 140; e.g. the control system can analyze pictures taken by users, which users potentially have shared on social media, possibly together with commentary in text form describing their impressions, critiques or other feedback, to optimize the light settings.

Illumination of an artwork based on viewer’s eye characteristics

Figure 8 shows an embodiment of an illumination system 100 configured to change the illumination of an artwork based on a viewer’s eye characteristics.

As described with respect to Figure 4, a light fixture 110 may comprise pre-set or adjustable light sources 117, in particular with respect to light intensity and color, orientation of irradiation, etc. Optionally, the light fixture 110 may comprise a variety of optical elements 115, such as lenses, diffusers, color filters, etc., and/or sensors, such as temperature, humidity, light intensity or color sensors, as well as sensors for people tracking, such as by using IR radiation (emission and sensing). Operation of the light fixture is controlled by a driver circuit 116 and optionally a data processing unit 113 and/or a control system 130, which e.g. may be configured to monitor the operation of the light sources 117 and/or control the intensity and color of the light emitted by the light sources 117. Moreover, in order to control mechanical components of the light fixture, such components may have associated one or more actuators controllable by the control system 130 and/or the data processing unit 113. For example, one or more actuators 114 may be configured to control movement and orientation of the light sources 117 and/or the optical elements 115.

For example, in various embodiments, the light fixture 110 may have a control unit 113 and a driver 116 for a plurality of LED light sources 117 having different colors. Accordingly, the control unit 113 may be configured to adjust the resulting color temperature of the irradiated light to a certain value. For example, the requested value may be received from the control system 130.

Accordingly, in various embodiments, the light fixtures 110 may be configured to provide a tunable light output, in particular a tunable white light output.

In various embodiments, such a combination of LED light sources 117, LED driver 116 and control unit 113 is configured to adjust the color temperature of the total light output so that the color point lies on or in the vicinity of the Planck Curve, also called the Planckian locus, e.g. in a CIE color diagram, such as the CIE 1931 color diagram.

In particular, in various embodiments, the light sources 117 are selected (and the driver 116 is configured) such that the processing unit 113 may adjust and keep the resulting color temperature approximately on and along the Planck Curve, for example in the range between 1800 K and 6000 K. Specifically, the term approximately indicates that the deviation from the Planck Curve is at most 2 MacAdam Ellipses, preferably within 1 MacAdam Ellipse, further preferably within a circle with a diameter that equals the minor axis of such a MacAdam Ellipse and with the circle center being located on the Planck Curve.

As described in the foregoing, color mixing may be obtained by regulating the proportion between the intensity of light emitted by light sources having different colors. Such regulation may be obtained by controlling the average power supply provided to the light sources, e.g. by applying a PWM or other dimming methods to the light sources. In general, color mixing may be performed via feed-forward control, e.g. based on a look-up table, such as a PWM-LED-Color-Look-Up-Table, assigning given power-supply parameters to the light sources 117 based on a requested color temperature. Additionally or alternatively, a sensor 120 (e.g. integrated in the light fixture or arranged in the vicinity of the object to be illuminated) may be used to provide a feedback of the color, e.g. the color temperature, emitted by the light fixture. Specifically, in various embodiments, this color feedback loop is an outer/second feedback loop with respect to the inner/first feedback loop used to regulate the power supply to the light sources 117.
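
By way of illustration, the following Python sketch performs such feed-forward color mixing with a small look-up table and linear interpolation between its entries; the table values are illustrative and do not represent calibration data from the specification.

    import bisect

    CCT_TABLE = [           # (CCT in K, warm-white duty, cold-white duty)
        (1800, 1.00, 0.00),
        (2700, 0.75, 0.25),
        (4000, 0.40, 0.60),
        (6000, 0.00, 1.00),
    ]

    def duties_for_cct(cct: float) -> tuple[float, float]:
        ccts = [row[0] for row in CCT_TABLE]
        cct = min(max(cct, ccts[0]), ccts[-1])             # clamp to table range
        i = min(bisect.bisect_right(ccts, cct), len(ccts) - 1)
        (c0, w0, k0), (c1, w1, k1) = CCT_TABLE[i - 1], CCT_TABLE[i]
        t = (cct - c0) / (c1 - c0)                         # linear interpolation
        return w0 + t * (w1 - w0), k0 + t * (k1 - k0)

    print(duties_for_cct(3000.0))   # approx. (0.67, 0.33)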

As described in the foregoing, an artwork may be any object 140, such as a painting, a picture, a sculpture, an assortment of various pieces of art, people and the like. An artwork may even encompass self-lit objects. In general, the object 140 may be located in any kind of exposition area 160, such as a room of an art museum or an art gallery or exhibition or somewhere else (inside and outside of a building).

As already explained, a user/viewer is a person who wants to experience and see an artwork 140, be it a painting, a sculpture or something else. Such an experience can be on-site (i.e. live, in person) or remote (i.e. virtual). A user might want to see and experience an artwork as intended by the artist. However, the fulfillment of such a wish is not easily achievable, since many factors play a role.

In general, a user usually perceives an artwork 140 with the eyes, although other sensory inputs, like hearing, touch and smell, may be used as well. It is known from scientific research that human eyes degenerate over time due to a variety of causes, like macular degeneration, cataracts, arcus senilis, corneal changes, and decreasing performance of photoreceptors. All this may result in decreased visual acuity, declining sensitivity of the visual field, decreasing contrast sensitivity, and an increased dark adaptation threshold. One effect could be that the differentiation or distinction between a light blue color and a yellow-green color becomes more difficult for an elderly person. All this may affect the user experience when visiting an exposition area 160 or seeing a piece of artwork 140 remotely on a display.

For example, when a user performs an onsite viewing, an artwork 140 may be illuminated by natural and/or artificial light for viewing purposes. In this respect, an artwork 140 may be illuminated with various kinds of light sources 110 at the same time under different irradiating angles and beam diameters. In various embodiments, such illumination setting may be described by an Illumination Matrix (IM).

For example, in various embodiments, the control system 130 may have associated one or more databases 200, comprising a light fixture database 202 having stored data identifying the installed light fixtures 110 configured to illuminate a given artwork 140. For example, these data may include, for each light fixture in a given exposition area, data identifying light intensity, frequency/color, polarization, direction and/or beam spread.

Moreover, in various embodiments, an exhibition area database 204 may have stored the characteristics of the exhibition area 160, such as the position of the artworks, and/or an artwork database 206 may have stored the characteristics of the artwork 140, such as the dimension, respective color and reflectivity data, etc.
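
A minimal sketch of possible record structures for the databases 202, 204 and 206 is given below; the field names are illustrative assumptions, since the specification only names the kinds of data stored.

    from dataclasses import dataclass

    @dataclass
    class LightFixtureRecord:        # database 202
        fixture_id: str
        intensity_lm: float
        color_cct_k: float
        polarization: str
        direction_deg: tuple[float, float]          # pan, tilt
        beam_spread_deg: float

    @dataclass
    class ExhibitionAreaRecord:      # database 204
        area_id: str
        artwork_positions: dict[str, tuple[float, float, float]]

    @dataclass
    class ArtworkRecord:             # database 206
        artwork_id: str
        width_m: float
        height_m: float
        reflectivity_by_wavelength_nm: dict[int, float]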

Thus, knowing the characteristics of the light fixtures 110, the position of the artwork 140 and optionally the characteristics of the artwork 140, the control system 130 may calculate/estimate the illumination of an artwork 140. Additionally or alternatively, at least part of the actual illumination conditions of an artwork 140 may also be measured via one or more sensors 120.

In various embodiments, the light fixture database 202 may contain data specifying the range of each modifiable parameter of a light fixture 110, such as the possible light intensity and color range. These data may also be provided in addition to the data identifying the current illumination condition (e.g. in case of estimation).

In general, the database 200 may be a single database or a distributed database. Moreover, the database 200 or portions of the database 200 may be stored within the control system 130 or within another computer, such as a remote database server, accessible by the control system 130. In various embodiments, data compression algorithms may be used to reduce the dimension of one or more of the databases 200.

In the case of remote viewing, one or more photographs have to be taken and/or an image recording has to be performed. This is schematically shown in Figure 8, wherein a camera 230 acquires one or more images 240 of the artwork 140, which are transmitted to a remote device 250. For example, an image database 216 may be used to store the images 240 of the various artworks 140, which may be viewed remotely. In various embodiments, data compression algorithms may be used to reduce the dimension of the images 240.

However, also the image taking may be done under various lighting situations and under various positions and angles with respect to the illuminated artwork. This means that the image measurement (image taking) is functionally related to the artwork illumination (ambient and artificial), the reflectivity features of an artwork and the image measurement characteristics (see also the related description of the light fixture database 202 and the artwork database 206).

Moreover, also the camera 230, e.g. a CCD or CMOS camera, uses optics and photoelectrical sensor chips for pixelated image measurement, as well as software digitizing and transforming such data using representations in color spaces. Usually, a camera needs to have filter segments placed in front of the image-sensor chips, for example RGB filters in a Bayer configuration/setting, in order to allow for color perception and respective measurement. However, the filter segments, sensor chips and signal processing will influence the image acquisition.

Accordingly, in various embodiments, a camera database 214 may have stored the characteristic of the camera 230 used to acquire the image 240.

Finally, also the display device used to view the image may play a role. In fact, a display device has the ability to transform digital image contents into a visual representation, i.e. it has the ability to transfer digital image data (however complex) into electronic commands for display or projection pixel control. The display of such a display device may be of any kind of currently used (or anticipated for future use) display or displaying units, for example LCD displays, AMOLED displays, laser projection devices, plasma screens, augmented and virtual reality glasses, and the like. Each display has its own color representation possibilities, viewing angles, brightness range and related limitations. In this respect, often a display device (e.g. a laser projector) has some kind of control unit with data processing, data storing and data communication capabilities that is configured to select, calculate and apply Optical Transfer Functions (OTF) to a provided Digital Image Representation, so that a pixelated image can be properly displayed.

Accordingly, in various embodiments, a display device database 212 may have stored the characteristic of the display device 250.

All these transformations of the spectral characteristics of the illumination of the artwork should be taken into account in order to reproduce the intended visualization. Such transformations may be expressed via an Image Transfer Function (ITF) or an optical transfer function, which describes how an incoming light distribution is changed when passing through, or being deflected or reflected by, the next optical element, or when passing from one display medium to another. In this context, an optical transfer function describes how an Image Representing Digital Data Set (DDS) is changed when passed on to the next image displaying process in the chain.

This means that a process chain may have several optical transfer functions. For example, in various embodiments, the camera database 214 may have stored data identifying the optical transfer function of the camera and the display device database 212 may have stored data identifying the optical transfer function of the display device.
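
In a deliberately simplified Python sketch, the chain can be modeled as a composition of per-stage transformations, here reduced to per-channel RGB gains; real optical transfer functions are of course wavelength- and space-dependent, so the names and values below are purely illustrative.

    def apply_chain(rgb: tuple[float, float, float],
                    *stages: tuple[float, float, float]) -> tuple[float, float, float]:
        # Apply each stage's per-channel gain in order (camera, display, eye, ...).
        r, g, b = rgb
        for sr, sg, sb in stages:
            r, g, b = r * sr, g * sg, b * sb
        return r, g, b

    camera_otf  = (0.98, 1.00, 0.95)
    display_otf = (1.00, 0.97, 0.90)
    eye_otf     = (1.00, 0.95, 0.80)   # e.g. reduced blue sensitivity with age
    print(apply_chain((1.0, 1.0, 1.0), camera_otf, display_otf, eye_otf))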

However, also the viewer’s eye has an optical transfer function. It may therefore be appreciated if the characteristics of a viewer’s eyes are known, either from previous measurements or from actual on-site measurements (at least in regard to some visual aspects), and taken into account when selecting the best illumination of an artwork or when viewing a virtual representation of an artwork. Personal Eye Data (PED) may be based on a variety of measurements and testing procedures for visual perception. Of course, such an illumination may take into account the combined effect of natural (sunlight) and artificial lighting. Accordingly, in various embodiments, the database 200 may comprise a visitor’s eye database 210 having stored data identifying the optical transfer function of the visitor’s eye.

Accordingly, in various embodiments, an artist (i.e. any person, or even an artificial-intelligence computer, who wants to present an artwork; the artist may or may not correspond to the creator of the artwork) may specify preferred lighting settings for the artwork, for example with respect to the illumination condition, which may correspond, e.g., to the spectral characteristics of candle light, sunrise, midday or evening lighting scenarios, or lighting under a certain angle and beam spread, or changing colors according to a pre-defined or ad-hoc generated time table or schedule. In various embodiments, the artist may thus provide data identifying these lighting settings, e.g. by specifying a color temperature or color location, or based on reference lighting settings, a requested Image Representing Digital Data Set (DDS), spectral reference data or color tables. Accordingly, in various embodiments, the database 200 may comprise an artist illumination database 208 having stored the data identifying the requested lighting settings, such as an Artist Lighting Scenario Matrix (ALSM).

In various embodiments, the operator of the exposition area receives these settings (from the database 208) and transforms them into operating commands for each of the installed light fixtures 110, so that such input data and related lighting settings are adjusted and represented as best as possible.

For example, in order to experience and see an artwork, a user may interact with on-site lighting conditions or, being remote, with remote display functions in combination with off-site lighting conditions, both affecting color perception and spectral accuracy of what is to be seen. For example, as mentioned before, some of the visual-related conditions for on-site viewing of an artwork include: illumination of the artwork (natural and/or artificial; steady or changing), perception conditions of the viewer, the artist’s preferred lighting settings, etc. Conversely, some of the visual-related conditions for off-site/remote viewing of an artwork include the ambient lighting at the viewer’s site, the display type with its color setting, the viewer APP and GUI interfaces, the transfer function for digital image data into pixelated display data, etc.

In various embodiments, a viewer may also specify preferred settings, e.g. via a user interface (such as a smartphone APP or any other GUI), which permits setting preferred lighting conditions (at least within specified boundaries, so that the lighting does not affect an artwork in a harmful way). For example, these data may be stored in the database 210.

Thus, in various embodiments, a user may be a person with a given User Visual Perception (UVP) and optionally given user preferences for lighting conditions and color perception (UVPP). For example, in various embodiments, the user visual perception is specified via eye deficiency data, which are stored in the database 210.

Specifically, a human eye is an organ of perception. A human eye is a very complex biological product that finally transfers signals to the visual cortex area of the brain. Color perception is based on many influencing factors (both physiological and psychological). For example, in this context may be cited https://www.sciencedirect.com/topics/engineering/colour-perception.

A human eye, and therefore color perception, will degenerate over time due to biological shortcomings and degenerations. Therefore, each human eye has its own visual characteristics and limitations. This means that the eye performance is correlated to the age of a person.

A user may know the individual Eye Deficiencies Matrix (EDM) based on a variety of measuring methods. For example, for this purpose an eye testing device and method may be used. Such a device/method measures visual eye characteristics, like color perception. Due to complexity, and certainly also due to incomplete understanding of eye function and color recognition, only some aspects have so far been accessible for research and testing. Since every human is affected by ageing, certain eye deficiencies occur over time, like presbyopia (difficulty with near vision focus), cataracts, glaucoma and macular degeneration. Some of these conditions may be measured with standard test procedures like Ishihara plate tests, Holmgren tests and Farnsworth tests. Other people may be affected by color blindness or color deficiencies, like protanomaly (red weak), deuteranomaly (green weak) and tritanomaly (blue weak). Certain methods of detection of eye deficiencies have been described in literature. Some approaches have been developed to mitigate this problem by providing re-coloring techniques, especially by color enhancing or color change for computer-based vision. For example, in this context may be cited “Improving Discrimination in Color Vision Deficiency by Image Re-Coloring”, available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6567888/.

Accordingly, in various embodiments, such matrices (UVP, EDM) may be stored in the database 210. An on-site or off-site user (viewer) may be willing to provide the personal UVP and EDM data to the operator of the exposition area or to a display device provider and allow use of such data for changed or improved image display and lighting setting.

Specifically, in various embodiments, the eye characteristics of a plurality of visitors are stored in the database 210, wherein the respective visitor’s eye characteristics are associated with a univocal visitor code. In this case, a visitor may provide his/her univocal code to the operator of the exposition area, and the operator may obtain the respective eye data from the database 210. The code may be stored in any suitable manner on a support 224, such as an alphanumeric string, a barcode, a bi-dimensional barcode, such as a QR code, a magnetic support, or a short-range wireless transmitter, such as a Radio-Frequency Identification (RFID) or Near-Field Communication (NFC) transponder or a Bluetooth® transceiver.
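
A minimal sketch of resolving such a univocal visitor code to the eye data stored in the database 210 is given below; the code format and the data fields are illustrative assumptions.

    VISITOR_EYE_DB = {   # univocal visitor code -> stored eye deficiency data
        "V-2041": {"deuteranomaly": 0.3, "blue_sensitivity_loss": 0.15},
    }

    def eye_data_for_code(code: str) -> dict | None:
        # Return the visitor's eye data, or None for unknown/anonymous visitors.
        return VISITOR_EYE_DB.get(code)

    print(eye_data_for_code("V-2041"))
    print(eye_data_for_code("unknown"))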

Additionally or alternatively, the visitor’s eye data may also be stored on a portable memory support, such as a memory card 220, e.g. an SD card or chip card, or a smartphone 222. Accordingly, in various embodiments, an artist may store preselected lighting conditions, e.g. in the form of an Artist Lighting Scenario Matrix stored in the database 208, so that they can be used by: an operator of the exposition area for adapting the characteristics of artificial light emitted by one or more light fixtures 110, or a display device APP or GUI for proper transformation of digital image data (DDS) into pixelated display settings.

An artist may even want to include his own personal eye deficiency data into that database 208, which thus permits that a viewer perceives an artwork as the artist has perceived the object.

In fact, an artist creates an artwork, like a painting or a sculpture. An artist may work under various lighting conditions, like candle light (Michelangelo), natural sunlight in the morning, during midday or in the evening, or artificial light (fluorescent lamps, halogen lamps, LEDs). An artist (either the creator of the artwork or another artist) may thus want to specify the illumination and color perception or various other settings of illumination (angle of incoming light, beam spread, spectral distribution, etc.) that refer to how the artist wants the artwork to be viewed. In other words, an artist can specify illumination settings and communicate these for an exhibited artwork. Now, since the eyes of an artist also might have already undergone ageing or other detrimental effects, these conditions can be stored in the database 208. Specifically, knowing a viewer’s eye characteristics in regard to color perception and other optical characteristics can be used to improve visual reception by adjusting the lighting settings of the light fixtures in the exposition area. Knowing a user’s preferred lighting setting may further improve user experience.

For example, in various embodiments, once having obtained the requested light data from the database 208 and the user’s eye characteristics from the database 210 or the support 220/222, the control system 130 may be configured to vary the characteristics of the light emitted by the light fixtures, i.e. the control system 130 may send one or more control commands to the light fixtures 110. In this way, the control system 130 may adjust/optimize the illumination of an artwork when the user is there in person, thereby taking into account the viewer’s eye characteristics so that the perception of the artwork is improved, e.g. closer to what the artist experienced, thereby e.g. compensating for eye degradation. For example, an illumination unit may be adjusted to emit light with a higher red or green content (spectral intensity) or light with a shifted color location, for example along the Planck-curve or freely in color-space. Therefore, a piece of art may be - depending on the viewer’s data (PED) - illuminated with adjusted light characteristics, here called Preferred Viewer Illumination (PVI).
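
The following Python sketch illustrates one crude way to derive such adjusted light characteristics from the viewer’s data, by boosting each color channel inversely to the viewer’s sensitivity for it; this gain model is an illustrative assumption, not the method prescribed by the specification.

    def compensated_setpoints(base_rgb: tuple[float, float, float],
                              eye_sensitivity_rgb: tuple[float, float, float],
                              max_level: float = 1.0) -> tuple[float, ...]:
        # Boost each channel inversely to the viewer's sensitivity for it,
        # clamped to the fixture's maximum level.
        return tuple(min(level / max(sens, 1e-3), max_level)
                     for level, sens in zip(base_rgb, eye_sensitivity_rgb))

    # Assumed viewer with reduced blue sensitivity (0.8): the blue channel is raised.
    print(compensated_setpoints((0.5, 0.5, 0.5), (1.0, 1.0, 0.8)))  # (0.5, 0.5, 0.625)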

As mentioned before, this approach may include that a person stores his or her personal eye data electronically, e.g. on a chip card 220 or in a smartphone app 222, so that these data can be transferred to a lighting control unit 130 and used for proper (improved) illumination. This would then allow a user to transmit the eye characteristics upfront to a museum or art gallery, which may then store the data in the database 210. Once the viewer is identified as being in front of a picture, these data are used to provide personalized lighting (PVI) for an individualized viewing experience, i.e. the control system 130 may use a sensor 120 to determine which visitor is viewing a given artwork, obtain the respective eye data from the database 210 and then adapt the illumination of the artwork as a function of the obtained eye data.

Such a personalized lighting setting might even take into account the actual viewing position, i.e. the lighting setting is adjusted to the actual user position, which can, for example, be determined by using camera detection or face or body recognition (e.g. by a 3D-face infrared imaging method as currently used in certain smartphones).

In various embodiments, a user may also change the illumination conditions in real time, e.g. via the user interface 134 of the control system 130 or by using a smartphone APP. For example, in this way, a visitor may view an artwork under different lighting scenarios and select some (or all) of them according to preference.

Since many people may not be in a position to visit an exposition area in person (due to illness, age or other circumstances), and in order to make artwork more accessible to a wider audience (such as school classes, seniors, remotely living people, etc.), a virtual display of an artwork may be preferable.

However, as described in the foregoing, to perceive an artwork 140 as naturally as possible, the following factors may need to be taken into account: the illumination of the artwork during image taking, the representation in a chosen color space and the subsequent digital representation as image data 240; the sending of such digital information to a display device 250; and finally the lighting conditions at the user's site and the user's visual conditions (as already described above).

Moreover, as mentioned before, the controlling unit of such a chosen display device 250 (e.g. a smartphone, a tablet or a laser projection unit) may use its own display settings, as well as assumed or measured degradation characteristics and factors which influence the optical transfer function of the display device 250, i.e. the function transforming and applying digital data to a display device, such as AMOLED or LCD based displays, displays using micro-LEDs (µLEDs), laser projection devices, virtual displays, augmented reality glasses, etc.

For example, in various embodiments, once having obtained a nominal optical transfer function of the display device 250, this function may be adapted in order to determine/calculate a corrected optical transfer function taking into account the display settings and optionally degradation characteristics, which e.g. may be estimated based on an operation time of the display device. In various embodiments, the image data 240 may be adapted either prior to transmission (e.g. via the control system 130) or within the display device 250 itself. For example, this is shown schematically in Figure 8, wherein the control system 130 and/or the display device 250 may access the display device database 212.

In various embodiments, the display device 250 may receive the viewer's eye data, such as Eye Deficiency Matrix (EDM) data, which may be stored in the device 250, in the database 210 or on a portable memory 220/222.

In various embodiments, the display device 250 may receive preferred user settings via a user input and/or via an automated data exchange with the control system 130.

In various embodiments, external lighting conditions when viewing an artwork on a display may be taken into account as well. For example, in various embodiments, the display device 250 may comprise or have associated a sensor 120' configured to measure the ambient lighting (e.g. level of illumination, color temperature). For example, the sensor 120' may be a camera sensor of the device 250.

Thus, by knowing the viewer's eye data and optionally the preferred user settings and the ambient lighting conditions, the image data 240 may be adapted in order to show a graphical representation similar to the artist's preferred light setting. All this means that care should be taken that a representation of an artwork on a display comes as close as possible to the original perception (or the intended perception) of the artist. This also means that, when there are severely limiting factors (like display degradation), a best-fit approach (image transfer function) may be employed.
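For illustration only, the following Python sketch shows one possible way such an adaptation of the image data 240 could be implemented. The 3x3 matrix form of the EDM, the per-channel ambient gains and all names are assumptions made for the example and are not prescribed by the present description.

```
# Illustrative sketch only: a 3x3 linear RGB correction for the viewer's eye
# data and a simple per-channel gain for the ambient light are assumed here.
import numpy as np

def adapt_image(image_rgb, edm, ambient_gain):
    """Adapt image data 240 before display.

    image_rgb    -- H x W x 3 array of linear RGB values in [0, 1]
    edm          -- assumed 3x3 "Eye Deficiency Matrix" compensating the
                    viewer's color perception
    ambient_gain -- assumed per-channel gains compensating the ambient
                    lighting measured by the sensor 120'
    """
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3)
    # Apply the viewer-specific correction, then the ambient compensation.
    corrected = pixels @ edm.T * ambient_gain
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)

# Example: slightly boost red for a viewer with reduced red sensitivity.
edm = np.array([[1.2, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
image = np.random.rand(4, 4, 3)            # placeholder for image data 240
adapted = adapt_image(image, edm, np.array([1.0, 0.98, 0.95]))
```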

Accordingly, the described enhanced user experience method permits an improved, adjusted or individualized image perception for on-site or off-site viewers based on personal visual perception, the viewer's personal preferences, ambient light settings (both natural and artificial), the on-site lighting of an exposition area, off-site ambient lighting conditions, display or projection devices and other image or optical transfer functions, user interfaces, and artist preferred lighting settings.

Accordingly, various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command. Specifically, in various embodiments, the method comprises the steps of:

- obtaining data identifying requested spectral characteristics,

- obtaining data identifying a viewer's eye characteristics,

- generating one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures as a function of the data identifying requested spectral characteristics and the data identifying a viewer's eye characteristics.

Possible embodiments of this solution are detailed at the following point "Example 2".

Determining configuration of light fixtures

As described in the foregoing, the lighting systems 100 described herein may be used to illuminate art objects 140, such as classic and modern paintings, sculptures, drawings, photographs, textile compositions, various sorts of canvases etc., whereby the art objects 140 may be placed in an exposition area, such as an art gallery or museum, both indoor and outdoor, and in various compositions.

In this respect, a variety of different light fixtures with respective possible operating parameters may be available. For example, as described in the foregoing, the light fixture 110 may include a spotlight with high or low intensity; it can have a fixed or variable beam angle; the color temperature or color coordinates can be fixed or variable/changeable; and the light fixture can be equipped with a framer or gobo. For example, in various embodiments, the light fixtures 110 are equipped with a variety of light emitting diodes 117 that work either as direct emitting or as phosphor-converted LEDs. In various embodiments, each of the light fixtures 110 may be equipped with a variety of LEDs 117, including, for example, phosphor-converted LEDs that emit a whitish light with different color temperatures, or direct emitting red, green, blue, lime, or amber LEDs.

The inventors have observed that artworks 140 often should be illuminated with specific lighting conditions. This may thus influence the decision as to which light fixture 110 or combination of light fixtures 110 should be used to illuminate artworks 140 in an exposition area 160. For example, this decision should take into account the kind of artwork, e.g. paintings, photos, drawings, etc., its size, and environmental factors, such as, in the case of a room 160, the height of the room, the wall color and/or reflectivity, and/or the brightness/illumination in the room.

Typically, the decision which illumination system 100 should be used (in particular which light fixtures 110 and which configuration and operating parameters should be used) is made manually, e.g. by the operator of the exposition area. The responsible person usually takes such a decision based on recommendations given by light designers or light suppliers. However, providing these recommendations means additional effort and cost, which can be a significant cost factor, especially for smaller light installations.

In various embodiments, a method is thus proposed which (at least in part) automatically assesses and evaluates the exposition situation and provides a recommendation as to which kind of illumination (fixture type, fixture combination, operating parameters, placement of fixtures relative to the art object) should be used.

For example, such a method may be implemented with a software program to be executed by one or more computers. In various embodiments, such a computer comprises a data processing unit, a data storage and a display with a graphical user interface (GUI). For example, the method may be implemented with a web application executed by a web server and/or an app to be executed by a mobile device, such as a smartphone or tablet. In various embodiments, the computer may also comprise a photo-electric sensor, such as a camera, e.g. the camera of a smartphone or tablet. Various embodiments thus relate to a tool (device with implemented software programs and user interface) that comprises and uses the above-mentioned components and methods.

In various embodiments, the tool is configured to determine a recommendation for suited light fixtures 110 for a specific exposition area 160 based on some data input provided by a user. For example, in various embodiments, the parameters of light fixtures 110 already installed in the exposition area 160 and/or installable in the exposition area may be stored in a light fixture database, such as the database 202 of Figure 8.

For example, in various embodiments, the light fixtures 110 installed or to be installed may include (at least):

- a first spotlight corresponding to a high intensity spotlight with warm color, e.g. down to 1800 K,

- a second spotlight corresponding to a low intensity spotlight,

- a first luminaire corresponding to a luminaire with a framer or a gobo, and

- a second luminaire with a wide angle of illumination, like a wall-washer.

Accordingly, also in line with the description of Figure 8, the light fixture database 202 may comprise for each of these light fixtures 110 one or more of the following data:

- brightness data, such as data identifying a minimum brightness level and a maximum brightness level adapted to be emitted by the respective light fixture;

- spectral data identifying light colors adapted to be emitted by the respective light fixture, such as a color temperature or a color coordinate;

- optics data identifying a light transfer function of one or more optical elements of the respective light fixture, such as a reflector, diffusor, lens, shutters and/or framers.
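For illustration only, a hypothetical record layout for the light fixture database 202 may be sketched as follows; the field names and example values are assumptions made for the example, not part of the disclosure.

```
from dataclasses import dataclass, field

# Hypothetical record layout for the light fixture database 202.
@dataclass
class LightFixtureRecord:
    fixture_id: str
    min_brightness_lm: float        # minimum brightness level (lumen)
    max_brightness_lm: float        # maximum brightness level (lumen)
    color_temperatures_k: list = field(default_factory=list)
    optics: list = field(default_factory=list)  # e.g. ["reflector", "framer"]

# Example entry for the "first spotlight" (high intensity, warm color):
first_spotlight = LightFixtureRecord(
    fixture_id="spot-1",
    min_brightness_lm=50.0,
    max_brightness_lm=2000.0,
    color_temperatures_k=[1800, 2200, 2700],   # warm, down to 1800 K
    optics=["reflector"],
)
```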

As mentioned before, one or more light fixtures may also support a plurality of configuration conditions having different characteristics (based on the configuration settings). In this case, the data identifying the characteristics of these light fixtures preferably comprise data for the plurality of configuration conditions. In various embodiments, the tool is configured to show the lighting effects as they would appear in one or more scenarios, such as in a standardized environment (size, height, wall, floor and ceiling properties), or in the specified location (room), with or without the art objects. For this purpose, the tool may use pictures (graphical representations) of the art objects 140. Such graphical representations can be obtained from the taken picture (see below) or from a database of art objects 140. For example, in various embodiments, the tool may use for this purpose the exposition area database 204 and/or the artwork database 206 and/or the image database 216.

In various embodiments, the tool may thus show what the illuminated artworks 140 would look like using the recommended light fixture 110 or combination of light fixtures 110. In various embodiments, the tool is also configured to change this recommended configuration, e.g. in order to provide examples of what the artworks 140 would look like using different light fixtures or illumination settings, thus giving the responsible decider and/or designer the possibility to choose the requested illumination system 100 according to his or her preferences. As mentioned before, this selection may include both already installed light fixtures 110 and light fixtures which need to be installed.

In various embodiments, the tool is configured to then automatically determine the technical specification of the recommended and selected light fixtures and settings (operational, positional, orientation).

In various embodiments, this technical specification is then provided to the control system 130, which may generate one or more control commands for the light fixtures 110 in order to generate the requested illumination. Thus, the decider is not required to have any real technical understanding to create the technical specification and control commands for the chosen illumination system 100.

Figure 9 shows an embodiment of the operation of the tool.

After a start step 300, the tool receives at a step 302 data identifying characteristics of the exposition area 160 and the artwork 140, and determines at a step 304 a recommended configuration of light fixtures 110.

For example, in various embodiments, the tool may show at the step 302 a graphical user interface (GUI) for acquiring the characteristics of the exposition area 160 and the artwork 140. For example, in various embodiments, the tool shows questions (provided by the software program), e.g. requesting the insertion of the characteristics of the exposition area, such as room size, material of floor, walls, ceiling, and the kind of artwork(s) to be illuminated. Specifically, in various embodiments, the tool acquires at least a room height. Generally, this parameter may correspond to the actual height of a room representing the exposition area or to the distance between the floor and a truss used to mount light fixtures.

In various embodiments, these questions may be configured as a decision tree. For example, as shown in Figure 10, the tool may show in a step 320 a screen for selecting the type of artwork 140, such as:

- selection S1: painting, photograph, textile,

- selection S2: Old Master,

- selection S3: Modern art,

- selection S4: Statue,

- selection S5: other 3D objects.

Generally, the type of artwork 140 may also be selected based on other characteristics, such as the material of canvas, color pigments, frame material, etc.

In the embodiment considered, in case of a painting, photograph, textile (output "S1" of the step 320), the tool proceeds to a step 322 for selecting whether the room height is greater than a given threshold, e.g. 5 m. For example, in case the room height is smaller than the given threshold (output "<5m" of the step 322), the tool may select an installation configuration INST2, e.g. comprising (and preferably consisting in) the first luminaire mounted or to be mounted at the ceiling of the room. Conversely, in case the room height is greater than the given threshold (output ">5m" of the step 322), the tool may select an installation configuration INST1, e.g. comprising (and preferably consisting in) the first luminaire mounted on a support to lower the luminaire to a distance of 3 m with respect to the floor.

In case of an Old Master (output “S2” of the step 320), the tool selects an installation configuration INST5, e.g. comprising (and preferably consisting in) the first spotlight.

In case of modern art (output "S3" of the step 320), the tool proceeds to a step 326 for selecting whether the room height is greater than a given threshold, e.g. 5 m. For example, in case the room height is greater than the given threshold (output ">5m" of the step 326), the tool may select an installation configuration INST7, e.g. comprising (and preferably consisting in) the first spotlight. Conversely, in case the room height is smaller than the given threshold (output "<5m" of the step 326), the tool may proceed to a step 328 for selecting whether the room is a bright or dark room. For example, in case the room is a bright room (output "Y" of the step 328), the tool may select an installation configuration INST8, e.g. comprising (and preferably consisting in) the second luminaire and a second spotlight. Conversely, in case the room is a dark room (output "N" of the step 328), the tool may select an installation configuration INST9, e.g. comprising (and preferably consisting in) the second spotlight.

In case of a statue (output "S4" of the step 320), the tool selects an installation configuration INST6, e.g. comprising (and preferably consisting in) the first luminaire with a gobo.

In case of other 3D objects (output “S5” of the step 320), the tool proceeds to a step 324 for selecting whether the room height is greater than a given threshold, e.g. 5 m. For example, in case the room height is smaller than the given threshold (output “<5m” of the step 324), the tool may select an installation configuration INST4, e.g. comprising (and preferably consisting in) the second spotlight. Conversely, in case the room height is greater than the given threshold (output “>5m” of the step 324), the tool may select an installation configuration INST3, e.g. comprising (and preferably consisting in) the first spotlight.
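For illustration only, the decision tree of Figure 10 (steps 320-328) may be sketched in Python as follows; the function name and the input encoding are assumptions, while the 5 m threshold and the installation configurations follow the description above.

```
# Minimal sketch of the decision tree of Figure 10 (steps 320-328).
def recommend_installation(artwork_type, room_height_m, bright_room=False):
    if artwork_type in ("painting", "photograph", "textile"):   # S1, step 322
        return "INST1" if room_height_m > 5 else "INST2"
    if artwork_type == "old_master":                            # S2
        return "INST5"
    if artwork_type == "modern_art":                            # S3, steps 326/328
        if room_height_m > 5:
            return "INST7"
        return "INST8" if bright_room else "INST9"
    if artwork_type == "statue":                                # S4
        return "INST6"
    return "INST3" if room_height_m > 5 else "INST4"            # S5, step 324

print(recommend_installation("modern_art", 4.0, bright_room=True))  # -> INST8
```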

Figure 11 shows an alternative embodiment, wherein the selection of the installation is organized in a look-up table 330. For example, in the embodiment considered, the look-up table receives at input an artwork type "ATYPE" and an indication whether the room height is smaller or greater than a given threshold, e.g. 5 m. Accordingly, in the embodiment considered, the tool may determine the recommended installation as a function of the artwork type and the room height. For example, the look-up table may perform the following mapping:

- drawing (ATYPE1) and height < 5 m: first luminaire,

- drawing (ATYPE1) and height > 5 m: first luminaire mounted at a height of 3 m with respect to the floor,

- photograph (ATYPE2) and height < 5 m: first luminaire,

- photograph (ATYPE2) and height > 5 m: first luminaire mounted at a height of 3 m with respect to the floor,

- textile (ATYPE3) and height < 5 m: first luminaire,

- textile (ATYPE3) and height > 5 m: first luminaire mounted at a height of 3 m with respect to the floor,

- Old Master (ATYPE4) and height < 5 m: first spotlight,

- Old Master (ATYPE4) and height > 5 m: first spotlight,

- modern art (ATYPE5) and height < 5 m: second spotlight,

- modern art (ATYPE5) and height > 5 m: first spotlight,

- statue (ATYPE6) and height < 5 m: first luminaire with gobo,

- statue (ATYPE6) and height > 5 m: first luminaire with gobo,

- other 3D objects (ATYPE7) and height < 5 m: second spotlight, and

- other 3D objects (ATYPE7) and height > 5 m: first spotlight.
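For illustration only, the look-up table 330 of Figure 11 may be expressed as a Python dictionary as follows; the key encoding is an assumption, while the mapping itself follows the list above.

```
# The look-up table 330 as a dictionary keyed by (artwork type, height class).
LOOKUP_330 = {
    ("drawing",    "low"):  "first luminaire",
    ("drawing",    "high"): "first luminaire mounted at 3 m",
    ("photograph", "low"):  "first luminaire",
    ("photograph", "high"): "first luminaire mounted at 3 m",
    ("textile",    "low"):  "first luminaire",
    ("textile",    "high"): "first luminaire mounted at 3 m",
    ("old_master", "low"):  "first spotlight",
    ("old_master", "high"): "first spotlight",
    ("modern_art", "low"):  "second spotlight",
    ("modern_art", "high"): "first spotlight",
    ("statue",     "low"):  "first luminaire with gobo",
    ("statue",     "high"): "first luminaire with gobo",
    ("other_3d",   "low"):  "second spotlight",
    ("other_3d",   "high"): "first spotlight",
}

def lookup(atype, room_height_m, threshold_m=5.0):
    height_class = "high" if room_height_m > threshold_m else "low"
    return LOOKUP_330[(atype, height_class)]

print(lookup("modern_art", 6.0))   # -> first spotlight
```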

In various embodiments, also other parameters, such as the color temperature, soft or sharp edges, or the size of the artwork, may be included in the decision tree or look-up table. For example, the matrix/look-up table may also take into account the brightness of the room.

In various embodiments, the characteristics of the exposition area 160 and/or the artwork 140 may be obtained by acquiring via a camera of the device (or another device) an image of the exposition area 160 and/or the artwork 140, and extracting the respective characteristics from the image.

As mentioned before, images of the artworks 140 may also be stored already in an image database 216. Moreover, as mentioned before, the characteristics of the exposition area 160 and/or the artwork 140 may be stored in an exposition area database 204 and artwork database 206.

Accordingly, the tool may permit to select the image of the artwork, the characteristics of the exposition area stored in the database 204 and/or the characteristics of the artwork stored in the database 206. In various embodiments, the artwork database 206 and the image database 216 may also be linked, wherein selecting an artwork permits also to obtain a respective image of the artwork and vice versa. In general, these databases may be stored within the device or remotely. For example, the selection may be done by:

- performing a manual selection of an image/exposition area or artwork, or

- receiving a univocal code indicative of an image/exposition area or artwork and obtaining for the received univocal code the respective data.

For example, such a univocal code indicative of an artwork or an exposition area may be stored in any suitable manner on a support, such as an alphanumeric string, a barcode, a bi-dimensional barcode, such as a QR code, a magnetic support or a short range wireless transmitter such as an RFID or NFC transponder or a Bluetooth® transceiver. This support, which may be e.g. an adhesive applied to the artwork 140 or the exposition area 160, may then be inserted manually in the tool and/or read automatically, e.g. by using a camera, or an NFC or Bluetooth® transceiver of the device. In various embodiments, the univocal artwork code may also be obtained from a distributed blockchain ledger.

In various embodiments, the above described methods for obtaining the required information may also be combined in any suitable manner.

Accordingly, in various embodiments, the acquired information may include: the characteristics of the exposition area 160, such as the height of the ceiling and the color of a wall, e.g. whether it is a bright or dark color; the characteristics of the object 140 to be illuminated, e.g. whether it is an Old Master painting, a modern painting, a sculpture, a drawing, a photograph or a textile, and the size of the object; the effect needed, e.g. whether the edges of the artwork should be illuminated with sharp or soft edges, and whether just one artwork should be highlighted or several artworks should be illuminated at the same time; and optionally a graphical representation/image of the object 140. "Sharp edges" means that outside the art object the light intensity is abruptly reduced to zero, while soft edges provide a smooth transition from maximum to zero illumination.

For example, in various embodiments, the exposition area database 204 contains the following data:

- dimensional data of the exposition area, such as a room height, room width and room length,

- the color of the floor and/or ceiling and/or one or more walls of the exposition area,

- an optional brightness level of natural and/or artificial light and/or a brightness profile of natural and/or artificial light during at least one day.

The brightness data of the exposition area 160 are purely optional, because the exposition area database 204 may also comprise other data adapted to be used to estimate the brightness level or profile, such as position data of the exposition area 160, e.g. GPS position data, which may be correlated to a local time. Moreover, the brightness data may also be obtained via a light sensor 120 installed in the exposition area 160.

In general, the brightness data both of the exposition area 160 and of the light fixture 110 may also comprise further information, such as a spectral distribution and/or a color location, and/or a Color Rendering Index (CRI), and/or a beam direction and/or a beam spread angle and/or a light polarization of the light in the exposition area 160 or of the light emitted by the light fixture 110, respectively.

Conversely, in various embodiments, the artwork database 206 contains the following data:

- descriptive data, such as the name of the artwork, the name of the artist, the period or creation year of the artwork, and the type of the artwork,

- dimensional data of the artwork,

- global or local (e.g. pixel) color data for the artwork, such as color analysis data, spectral data and/or image pixel data (which may also be stored in a separate image database 216 linked to the artwork database 206),

- optional damage data, such as a local or global damage matrix,

- optional reflectivity data, such as local or global reflectivity data, e.g. a reflectivity matrix.

In various embodiments, the tool may also obtain data identifying the light fixtures already installed in the exposition area 160. As mentioned before, these data may be stored already in the light fixture database 202. In various embodiments, the exposition area database 204 and the light fixture database 202 may also be linked, wherein selecting an exposition area 160 permits also to obtain the respective data of the installed light fixtures 110 and vice versa. In various embodiments, e.g. for storing the respective data to the databases 202 and/or 204, the technical information of the available light fixtures 110 may also be provided as a manual input.

As mentioned before, based on the above information, the tool may determine a recommendation of a set of light fixtures 110 to be used for illuminating the artwork 140. As described with respect to Figures 10 and 11, the tool may use for this purpose a decision tree and/or a decision matrix, i.e. two or more parameters, like the kind of artwork or the height of the room, may be used to determine the recommended light fixture(s) to be used and optionally the recommended settings for the light fixture(s). Thus, in various embodiments, to provide the recommendation, the tool accesses the (local or remote) database of light fixtures 202, and the respective data are matched to one or more features of the exposition area 160 and/or the artwork 140.

In various embodiments, the tool may use artificial intelligence, i.e. a machine learning method based on a reference dataset comprising a plurality of reference installations for a respective artwork or artwork type in a given exposition area, and the respective used light fixtures and settings. Thus, by receiving at input the characteristics of the artwork and the exposition area, the machine learning method may provide at output the recommended light fixtures and settings.

For example, in various embodiments, the machine learning method may directly access the reference dataset in order to determine, based on the characteristics of the exposition area 160 and the artwork 140, one or more best-matching reference installations and then display the set of light fixture(s) used for these one or more best-matching reference installations.

Alternatively, the machine learning method may use the reference dataset during a training phase in order to generate a mathematical function configured to estimate a set of light fixtures and the respective settings as a function of the characteristics of an artwork and an exposition area, i.e. the method may comprise:

- acquiring a training database of a plurality of illumination conditions/reference installations comprising data identifying characteristics of a respective artwork, data identifying characteristics of a respective exposition area, a respective selected set of light fixtures and a respective selected configuration condition (settings),

- training a machine learning algorithm, such as an artificial neural network, by using the training database, and

- determining the recommended set of light fixtures via the machine learning algorithm as a function of the current data identifying the characteristics of the artwork 140 and the exposition area 160.
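For illustration only, the direct best-matching variant described above may be sketched as follows; the numeric feature encoding and the Euclidean distance metric are assumptions made for the example, and a trained neural network could be substituted for the distance ranking.

```
# Sketch of the best-matching approach: each reference installation is
# described by a numeric feature vector and the closest references are
# returned together with their light fixtures.
import math

REFERENCE_INSTALLATIONS = [
    # ([room height m, room brightness 0..1, artwork type id], fixtures)
    ([3.0, 0.8, 1], ["second luminaire", "second spotlight"]),
    ([6.5, 0.2, 1], ["first spotlight"]),
    ([4.0, 0.5, 2], ["first luminaire with gobo"]),
]

def best_matches(query, k=1):
    # Rank reference installations by Euclidean distance to the query.
    ranked = sorted(REFERENCE_INSTALLATIONS,
                    key=lambda ref: math.dist(ref[0], query))
    return [fixtures for _, fixtures in ranked[:k]]

print(best_matches([3.5, 0.7, 1]))  # -> [['second luminaire', 'second spotlight']]
```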

Accordingly, once having determined a recommended set of light fixtures and the respective settings, the tool may display at a step 306 the recommended set of light fixtures and optionally the respective settings.

In various embodiments, as an alternative or in addition, the tool may generate/render and display an image providing a visual impression of what the illuminated artwork 140 will look like using the recommended illumination system 100. For this purpose, the tool may use the graphical representation/image of the artwork 140, use a generic artwork which fits the category of artworks that will be displayed in the exposition area, or calculate a modified image as seen under the recommended illumination setting. For example, the use of a generic artwork might be necessary when the illumination system is installed for the first time, i.e. with no artworks yet available in the exposition area.

In various embodiments, the rendering operation may thus use the following data:

- the characteristics of the exposition area 160, e.g. the graphical representation/image of the exposition area 160 or other data, such as the wall color, etc., or a default exposition area,

- the characteristics of the artwork 140, e.g. the graphical representation/image of the artwork 140 or other data, such as the artwork type;

- the characteristics of the selected light fixture(s) 110, which may be used to estimate the illumination of the artwork 140.

Similar to the description of the device 250 of Figure 8, also in this case the rendering operation and/or the display operation of the rendered image may take into account the characteristics of the display device and/or the ambient illumination conditions of the device 250. For the respective details, reference may thus be made to the description of Figure 8.

In various embodiments, the tool may also show at a step 308 a screen which permits to select a different set of light fixtures and/or different settings for the light fixture(s). In this case, the tool may also generate/render and display an updated image providing a visual impression of what the illuminated artwork 140 will look like using the different illumination system 100. For example, in case the tool recommends a light fixture with a framer for drawings and a certain color temperature, the user may want to see the impression of the artwork using a simple spotlight or a different color temperature.

In various embodiments, the tool may then receive at a step 310 an input indicating which set of light fixtures and settings should be used (i.e. the recommended or a different setup). In various embodiments, before the method terminates at a stop step 314, the tool may then determine at a step 312, as a function of the selected setup (light fixtures and settings) and the light fixtures 110 already available in the exposition area, at least one of:

- the configuration parameters for the already installed light fixtures 110,

- the technical specification for the light fixtures to be installed, and optionally the configuration parameters for the light fixtures 110 to be installed.

The technical specification may thus include data which permit to provide and install the selected illumination system.

Accordingly, when performing the set-up of the illumination of an artwork/object 140, the tool permits to automatically select a set of light fixtures 110 and the respective (target) illumination characteristics. This may involve the use of already installed light fixtures 110, and/or the selection and installation of new light fixtures 110. As will be described in greater detail in the following, the tool may also automatically select a set of sensors 120 and, if supported, the respective sensor configuration. This may involve the use of already installed sensors 120, and/or the selection and installation of new sensors 120.

In various embodiments, the tool may also determine the installation position of the light fixtures 110, preferably within a 3D model of the exposition area 160. For example, such a 3D model of the exposition area 160 may already be stored in the exposition area database 204. Alternatively, the respective data may be inserted manually, e.g. via a user interface, or at least in part automatically. For example, in various embodiments, a camera, e.g. a camera of the device having installed the software tool, is used to acquire image data of the exposition area 160, which permits to calculate a 3D model of the exposition area 160. Additionally or alternatively, a 3D scanner, such as a Light Detection and Ranging (LIDAR) system, may be used.

For example, Figure 12 shows a top view of the model of a simple exposition area 160 in the form of a room having a given width, length and height. In general, 3D model reconstruction from a plurality of images, e.g. a video, is well known in the art, rendering a more detailed description herein superfluous. For example, in this context, documents US 2003/0072483 A1, US 8,254,667 B2 or US 10,275,945 B2 may be cited.

In various embodiments, the 3D model also includes the apertures of the exposition area 160, e.g. doors 165 and windows 164, through which natural or artificial light may enter. In various embodiments, the 3D model may also include color and/or reflectivity data of the surfaces of the exposition area 160, e.g. of the floor 162, the walls 163 and/or the ceiling 161 (see also Figure 1).

Similarly, the tool may acquire data identifying the position of the artworks 140 within the exposition area 160. For example, also in this case, data already stored in a database may be used, e.g. in the exposition area database 204 or the artwork database 206, or the data may be received via a manual input or at least in part automatically, e.g. by acquiring image data of the exposition area 160 and calculating the position and dimension of each artwork in the exposition area. In various embodiments, the artwork data may also include color and/or reflectivity data of the artwork 140. For example, Figure 12 shows three pictures/paintings 140₁, 140₂, 140₃ and a 3D sculpture 140₄.

Next, the selected light fixtures 110 are positioned in the 3D model. For example, Figure 12 shows four light fixtures 110₁, 110₂, 110₃ and 110₄.

Generally, for already installed light fixtures 110, such as the light fixtures 110₃ and 110₄, the respective installation position of the light fixture 110 is usually fixed. Accordingly, also the respective position data may already be stored in a database, e.g. the light fixture database 202 or the exposition area database 204. Alternatively, a light designer may position the already installed light fixtures 110 manually within the model of the exposition area 160.

In various embodiments, if supported by the already installed light fixtures 110, the tool may determine a (recommended) orientation of each already installed light fixture 110. For example, the tool may propose to orient the light fixtures 110₃ and 110₄ so as to point with the respective optical axis towards the center of the artworks 140₃ and 140₄, respectively.

In various embodiments, the tool is configured to determine a (recommended) position and orientation of each new light fixture 110 as a function of the characteristics of the light fixture 110 and the position of the artwork 140 within the exposition area 160.

Specifically, in various embodiments, each artwork 140 has associated respective target and/or maximum illumination data. For example, these data may be stored in the artwork database 206 and/or the artist illumination database 208, inserted manually, or determined automatically by classifying the artwork 140, e.g. as a function of an image of the artwork 140. Thus, the tool may use the 3D model of the exposition area 160 and the data of the artwork 140 and the selected light fixtures 110 in order to determine a (recommended) position and orientation for each new light fixture 110.

Specifically, for this purpose, the tool may determine an initial position for each new light fixture 110. For example, in case of 2D artworks having a substantially flat surface, such as the artworks 140₁ and 140₂, the tool may select the initial position by:

- determining a normal plane with respect to the surface of the artwork 140, wherein the normal plane is parallel to the vertical axis and preferably located at the center of the artwork 140;

- determining the mounting height, which (as mentioned before) may correspond (approximately) to the height of a ceiling of a room representing the exposition area 160 or to the mounting height of a support structure; and

- determining a given distance with respect to the artwork 140, e.g. selected as a function of the characteristics of the selected light fixture, e.g. as a function of the dimension of the artwork and the beam angle of the light fixture 110.

For example, the initial mounting position of a given light fixture may be determined by:

- determining within the normal plane a circle segment (around the center of the artwork) having a radius corresponding to the given distance to the artwork 140;

- determining within the plane a line parallel to the floor at the mounting height; and

- determining the point of intersection of the circle segment and the line.
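For illustration only, this geometric construction may be sketched as follows, assuming the normal plane is aligned with the x-axis; the symbols and function name are assumptions, while the construction itself (intersection of the circle of the given radius around the artwork center with the horizontal line at the mounting height) follows the list above.

```
# Geometry sketch: the fixture sits at the intersection of a circle of radius
# `distance` around the artwork centre (within the vertical normal plane) and
# a horizontal line at the mounting height.
import math

def initial_mounting_position(artwork_center, mounting_height, distance):
    """artwork_center: (x, y, z) with z the height of the artwork centre."""
    cx, cy, cz = artwork_center
    dz = mounting_height - cz
    if abs(dz) > distance:
        raise ValueError("circle and mounting line do not intersect")
    # Horizontal offset from the artwork within the normal plane.
    dx = math.sqrt(distance**2 - dz**2)
    return (cx + dx, cy, mounting_height)

# Artwork centre at 1.6 m, mounting height 3.0 m, 2 m distance to the artwork:
print(initial_mounting_position((0.0, 0.0, 1.6), 3.0, 2.0))
```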

For example, this is shown schematically for the light fixtures 110₁ and 110₂, which are mounted at a given distance from the center of the artworks 140₁ and 140₂.

Generally, instead of simple circle segments, the distance from the artwork may also depend on the mounting height (or similarly the vertical displacement with respect to the center of the artwork 140), because the inclination of the light fixture 110 with respect to the artwork 140 may also influence the dimension of the light spot on the artwork 140.

Accordingly, in various embodiments, the tool may determine the initial mounting position of a given light fixture 110 as a function of the position of (the center of) the artwork 140, the mounting height and the requested distance between the artwork 140 and the light fixture 110, which in turn may be determined as a function of at least one of: the characteristics of the light fixture 110, the dimension of the artwork 140, and the mounting height.

In various embodiments, the tool may then perform a light simulation of the 3D model of the exposition area 160 and determine the illumination of each artwork 140. Specifically, in various embodiments, this simulation may also include at least one of:

- the background light, e.g. by simulating windows 164 and doors 165 as additional (fixed) light sources with given (fixed or variable) light characteristics;

- the reflectivity data of the surfaces of the exposition area 160 and/or the artwork 140; and

- shadows generated by obstacles within the exposition area 160, such as 3D art objects or visitors at expected/estimated positions when observing a given artwork 140.

Once having determined the illumination of each artwork 140, the tool may determine whether the illumination of each artwork 140 corresponds to the target illumination and/or is smaller than the maximum threshold.

For example, in case the illumination exceeds the maximum threshold, the tool may reduce the light provided by the respective light fixture 110. Additionally, the tool may determine whether a given artwork 140 is also illuminated by another light fixture 110, and in this case vary the position of the respective light fixture 110.

Thus, in various embodiments, the tool may vary the characteristics of the set of light fixtures 110 and/or the respective mounting position via an iterative process of simulations until the illumination of each artwork 140 corresponds to the target illumination and/or is smaller than the maximum threshold.
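For illustration only, such an iterative process may be sketched as follows; the simulation function, the dimming step and the data layout are assumptions made for the example, not the specific algorithm of the present description.

```
# Illustrative iteration: dim each fixture until the simulated illuminance of
# its artwork no longer exceeds the maximum threshold. simulate_lux() stands
# in for the light simulation of the 3D model and is assumed to exist.
def adjust_until_compliant(fixtures, artworks, simulate_lux,
                           max_iterations=50, step=0.9):
    for _ in range(max_iterations):
        compliant = True
        lux = simulate_lux(fixtures)            # {artwork_id: illuminance}
        for art in artworks:
            if lux[art["id"]] > art["max_lux"]:
                # Reduce the dimming level of the fixture aimed at this artwork.
                fixtures[art["fixture"]]["dim"] *= step
                compliant = False
        if compliant:
            return fixtures                     # all artworks within limits
    raise RuntimeError("no compliant configuration found")
```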

Accordingly, various embodiments of the present disclosure relate to a method of selecting at least one light fixture by:

- obtaining data identifying characteristics of an artwork,

- obtaining data identifying characteristics of an exposition area,

- determining a set of light fixtures and/or operating settings for a set of light fixtures as a function of the data identifying characteristics of the artwork and the data identifying characteristics of the exposition area.

Possible embodiments of this solution are detailed at the following point "Example 3".

Light fixtures

As mentioned before, various embodiments of the present disclosure relate to a lighting system 100 comprising a control system 130, one or more light fixtures 110, and optionally one or more sensors 120. For a general description of these blocks, reference can be made, e.g., to the description of Figures 2 to 7. For example, in the context of the illumination of artworks, the terms "Art Lighting" or "Art Centric Lighting" are often used. In this type of application, the light fixture(s) 110 should be configured to emit light with high quality, i.e. the spectral characteristics of the light should be within given boundaries, e.g. in order to obtain a high CRI. Moreover, preferably, the spectral characteristics of the light emitted by the light fixture are settable/programmable. For example, as mentioned before, when using light fixtures 110 comprising a plurality of light sources 117 having different colors, the spectral characteristics of the light emitted by the light fixture 110 may be varied by varying the brightness of the light emitted by the light sources 117.

For example, as described with respect to Figures 5 to 7, and as shown in greater detail in Figure 25, the control system 130 or a data processing unit 113 of the light fixture 110 may vary the brightness of (i.e. perform a dimming of) the light emitted by the light sources 117 by varying/regulating the average power supply provided to the light sources 117.

In order to perform a dimming operation, each light source or set of light sources 117 may be powered via a separate electronic converter 116, a separate switching stage 116h or an additional output stage of the switching stage 116h (see also Figure 7). Conversely, Figure 25 shows an embodiment wherein four light sources or sets of light sources 117₁, 117₂, 117₃ and 117₄, such as LEDs or laser diodes, receive a power supply from the same driver 116. Specifically, in the embodiment considered, the driver 116 is configured as a current generator providing a (regulated) current i_out. Moreover, in the embodiment considered, the light sources or sets of light sources 117₁...117₄ are connected in series between the output terminals of the current generator 116. For example, in this case, the average current provided to the light source or set of light sources 117₁...117₄ may be varied via at least one of:

- by varying the amplitude of the current i_out provided by the driver 116, e.g. by setting a different reference value i_ref for the driver 116;

- by controlling the current provided to all light sources 117₁...117₄, e.g. via an electronic switch SW5 configured to switch on or off the power supply for all light sources 117₁...117₄; and

- by controlling the current provided to each light source or set of light sources 117₁...117₄, e.g. via a respective electronic switch SW1..SW4 configured to switch on or off the power supply for the respective light source or set of light sources 117₁...117₄.

For example, the electronic switches SW1..SW4 and/or the electronic switch SW5 may be driven via respective pulsed drive signals DSW1..DSW5, such as pulse width modulation signals. While Figure 25 refers to the case of a current generator 116, similar dimming operations may also be performed in case the driver 116 is configured as a voltage generator providing a (regulated) voltage V_out (see also Figure 6), e.g. by selectively connecting the light sources 117 to the voltage V_out and/or by controlling the operation of the current regulator 118c.

Accordingly, when the light sources 117₁...117₄ are configured to emit light with different colors, the spectral characteristics of the combined light emitted by the light fixture may be varied by performing an individual dimming operation of each separate light source or set of light sources 117₁...117₄.

For example, as described in the foregoing, the spectral characteristics of the light emitted by the light fixture(s) 110 may be varied as a function of, e.g., at least one of:

- requested spectral characteristics 208, e.g. as specified by an artist;

- data of the viewer, such as the above-mentioned eye characteristics and/or the Preferred Viewer Illumination;

- sensor data provided by one or more sensors 120, such as a light sensor configured to monitor the illumination of the artwork 140 and/or the ambient light in the exposition area 160; and

- maximum illumination values for the artwork 140.

For example, as schematically shown in Figure 25, in various embodiments, the control system 130 may send (via the interfaces 131 and 111) a control command CMD to the data processing unit 113 of the light fixture 110. In response to this command, the data processing unit 113 may vary the power supply of one or more light sources 117, e.g. by varying the reference signal of the driver 116 and/or controlling the operation of the switches SW1..SW5.

However, often the light fixture(s) 110 should also be configured to provide other features, such as an illumination which only (or at least mainly) illuminates the artwork 140 and/or the possibility to highlight certain aspects of an artwork 140, such as certain colors or parts of the artwork 140 (e.g. heads of persons in a painting).

For example, for this purpose, the light fixture 110 may comprise one or more framers 115 configured to limit the illuminated range of the light emitted by the light fixture 110 and/or other optical elements used to focus the light generated by the light sources 117. For example, a framer usually comprises one or more mechanical shades, which are moved into a position where they shade light that would illuminate areas outside of the artwork 140. While framers can easily produce rectangular borders, shading a circular object or an object with random borderlines (e.g. a sculpture) may be a complex task. Conversely, in order to highlight given zones of an artwork 140, usually additional spotlights are used. Accordingly, one or more additional spotlights may highlight certain aspects of an artwork 140.

Accordingly, as shown in Figure 25, in various embodiments, the light fixture 110 may also be configured to generate one or more drive signals D114 for an actuator 114 of the light fixture 110, such as one or more actuators associated with a framer and/or one or more actuators associated with optical elements 115 used to orientate and/or focus the light generated by the light fixture 110. Accordingly, the processing system 113 may be configured to vary a significant number of parameters of the light fixture in response to the control command CMD received from the control system 130, such as:

- the power supply/brightness level of each light source or set of light sources;

- the settings of the actuators 114, e.g. actuators associated with one or more optical elements 115 and/or one or more sensors 120 integrated in the light fixture 110 and/or the complete light fixture 110, e.g. in order to move (e.g. shift, rotate, pan or tilt) the light fixture 110.

For example, in various embodiments, the data processing unit 113 may be configured to determine for each light source or set of light sources 117₁...117₄ a relative power supply/brightness level, e.g. as a function of a requested color, such as a requested color temperature. Next, the data processing unit 113 may be configured to determine for each light source or set of light sources 117₁...117₄ an absolute power supply/brightness level as a function of the relative power supply/brightness level and a requested brightness of the light emitted by the light fixture 110. Accordingly, in this case, one or more commands CMD may comprise data identifying the requested color and a requested brightness.
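For illustration only, this two-step computation may be sketched as follows; the relative mixing levels are placeholder values (in practice they would be derived from the calibrated spectra of the channels), and the resulting absolute levels may e.g. be applied as PWM duty cycles for the switches SW1..SW4.

```
# Two-step computation: per-channel relative levels from the requested color,
# then absolute levels from the requested brightness.
RELATIVE_MIX = {            # assumed relative levels for channels 117_1..117_4
    2700: [1.00, 0.55, 0.10, 0.40],   # warm white
    4000: [0.60, 0.80, 0.45, 0.30],   # neutral white
}

def command_to_levels(requested_cct_k, requested_brightness):
    """Translate a command CMD (color temperature + brightness in [0, 1])
    into absolute per-channel levels, e.g. PWM duty cycles for SW1..SW4."""
    relative = RELATIVE_MIX[requested_cct_k]
    return [r * requested_brightness for r in relative]

print(command_to_levels(2700, 0.5))   # -> [0.5, 0.275, 0.05, 0.2]
```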

First embodiment of light fixture

Figure 26 shows a first embodiment of a light module 118 comprising a plurality of light sources 117.

As shown in Figure 27, in the embodiment considered, the light sources 117 are arranged on one side of a support, preferably a flat substrate, such as a printed circuit board.

For example, in various embodiments, the light module 118 is rectangular and the light sources 117 are arranged in a matrix having a given number of rows and columns. For example, Figure 26 shows 8 columns and 6 rows.

However, any other arrangement could also be used. For example, Figure 28 shows a circular light module 118, wherein the light sources 117 are arranged along parallel lines. Conversely, Figure 29 shows that a first set of light sources may be arranged along a circle, while a second set of light sources is arranged along parallel lines.

Thus, the light module 118 may have any form and the light sources 117 may be arranged in any suitable manner. However, in various embodiments, the light module 118 preferably has an axially symmetric form, such as a rectangular, circular or elliptical form, and/or the light sources 117 are arranged (preferably equidistant) along rectilinear lines and/or circle segments.

In various embodiments, the light sources 117 are selected from the group of: light emitting diodes, including phosphor-converted LEDs, laser diodes, organic light sources such as OLEDs, or quantum dot based light sources. For example, in various embodiments, all light sources 117 are LEDs, preferably mini-LEDs or micro-LEDs. Generally, the light sources 117 may comprise a single LED (e.g. a white LED) or a plurality of LEDs (e.g. red, green and blue, or red, green, blue and white). For example, in the embodiment shown in Figure 26, the LEDs may form the pixels of the matrix, wherein each pixel may be controlled individually and wherein each pixel may consist of a single LED or comprise a plurality of LEDs.

Generally, as shown in Figure 27, the light fixture 110 may also comprise one or more optical elements 115. For example, in the embodiment considered, the light fixture 110 comprises a first lens structure 115a configured as a collimator lens and a second lens structure 115b configured to focus the light towards the artwork 140. For example, the lens structure 115a may be implemented with micro-lenses, wherein one or more micro-lenses are arranged in correspondence with each light source 117. Conversely, the second lens structure 115b may comprise a convex lens mounted at a given distance D from the first lens structure 115a. In various embodiments, the distance D may also be variable (e.g. via an actuator 114 configured to move the lens 115b), thereby selectively varying the focal point of the light generated by the light fixture 110.

In various embodiments, the light module 118 is connected to a driver 116 configured to individually control the power supply of sub-sets of light sources, preferably the power supply of each light source 117, and more preferably (in case the light source 117 comprises a plurality of LEDs) the power supply of each LED. Accordingly, in various embodiments, the data processing unit 113 may be configured to control:

- the power supply of sub-sets of light sources, preferably the power supply of each light source 117, and more preferably (in case the light source 117 comprises a plurality of LEDs) the power supply of each LED; and

- optionally one or more actuators 114.

Accordingly, a significant number of possible configurations has to be managed by the control system 130 and the data processing unit 113. Accordingly, in various embodiments, and as also shown in Figure 25, the data storage device 112 of the light fixture 110 may store a data structure 800, such as a look-up table, containing a plurality of preset configurations, wherein a univocal code is associated with each configuration item. Generally, the association may be explicit, e.g. each configuration data item may have stored a respective univocal code, or implicit, e.g. the univocal code may be determined as a function of (e.g. may correspond to) the index of the configuration data item.

For example, as shown in Figure 30, the data processing unit 113 may comprise a digital processing unit 1130, such as a micro-processor programmable via software instructions, and a temporary memory 1132, e.g. implemented via registers or a Random-Access Memory (RAM). Specifically, as shown in Figure 31, after a start step 802, the data processing unit 113 may proceed to a step 804, where the data processing unit 113 waits for a new command CMD. Once a new command CMD has been received, the data processing unit 113 extracts from the command CMD a field comprising the univocal code of a preset configuration, and proceeds to a step 808. Specifically, at the step 808, the data processing unit 113 retrieves from the data structure 800 the configuration stored for the received univocal code and stores the respective configuration to the memory 1132. Accordingly, the data processing unit 113 may use the data stored in the memory 1132 to drive the driver 116 and optionally the one or more actuators 114. Additionally or alternatively, the data processing unit 113 may also receive at the step 804 data from one or more sensors 120. Accordingly, in various embodiments, the data processing unit 113 may also select a different preset configuration as a function of the sensor data. For example, the data structure 800 may comprise a plurality of preset configuration data items, wherein the light emitted by the light fixture has the same color characteristics but different brightness levels. Accordingly, by selecting a different preset configuration data item, a different brightness level (with the same color characteristics) may be used.

Additionally or alternatively, the data processing unit 113 verifies at a step 806 whether the received command or the measured data indicate that a different preset configuration has to be loaded from the data structure 800 or whether the data stored to the memory 1132 should be adapted.

In case a different preset configuration item has to be loaded from the data structure 800 (output PRE of the verification step 806), the data processing unit 113 proceeds to the previously described step 808.

In case the data stored to the memory 1132 should be adapted (output DYN of the verification step 806), the data processing unit 113 proceeds to the step 810, where the data processing unit 113 adapts the data stored to the memory 1132.

Accordingly, in this case, the driver 116 and/or the actuators are driven as a function of the adapted data stored to the memory 1132.

Once having obtained new preset data or having adapted the data stored to the memory 1132, the data processing unit 113 may thus return to the step 804 for receiving a new command from the control system 130 or sensor data from a sensor 120.
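For illustration only, the control flow of the steps 804-810 of Figure 31 may be sketched as follows; the message format and the adaptation rule are assumptions made for the example.

```
# Control-flow sketch of Figure 31: wait for a command or sensor data (step
# 804), then either load a preset from the data structure 800 into the
# working memory 1132 (step 808) or adapt the values held there (step 810).
def processing_loop(receive, presets_800):
    memory_1132 = {}
    while True:
        msg = receive()                      # step 804: command CMD or sensor data
        if msg["kind"] == "PRE":             # step 808: load preset configuration
            memory_1132 = dict(presets_800[msg["preset_code"]])
        elif msg["kind"] == "DYN":           # step 810: adapt the stored data
            memory_1132["brightness"] = msg["brightness"]
        elif msg["kind"] == "STOP":
            return memory_1132
        apply_to_driver(memory_1132)         # drive the driver 116 / actuators 114

def apply_to_driver(config):
    pass  # placeholder for the hardware access
```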

For example, in various embodiments, the data processing unit 113 may be configured to:

- receive a first command requesting the activation of a given preset configuration, such as data identifying for each light source 117 relative power supply data, e.g. in order to implement a color mixing operation or to generate a spotlight with a subset of the light sources 117;

- optionally receive a second command requesting an adaptation of the data stored to the memory 1132, e.g. in order to set a requested brightness level of the light fixture 110; and

- optionally receive data from a sensor 120 used to adapt the data stored to the memory 1132, e.g. in order to regulate the brightness level of the light fixture as a function of the actual illumination of an artwork 140 as monitored by a light sensor 120.

Preferably, the data structure 800 is stored in a non-volatile memory. Generally, the data structure 800 may be stored in a Read-Only Memory (ROM). However, a programmable non-volatile memory is preferable. For example, in this case, the control system 130 and the data processing unit 113 may be configured to perform an update operation of the preset configurations 800. For example, in this case, the control system 130 may:

- access a database, such as the light fixture database 202, having stored a plurality of preset configurations, and store via the data processing unit 113 only a selected subset of the preset configurations to the data structure; and/or

- determine one or more parameters of the preset configuration as a function of at least one of:

o requested illumination characteristics, or a sequence thereof, e.g. as stored in the artist illumination database 208;

o the characteristics of the artwork 140 to be illuminated, e.g. as stored in the artwork database 206 and/or as measured via one or more sensors;

o the characteristics of the exposition area, e.g. as stored in the exposition area database 204 and/or as measured via one or more sensors;

o data identifying characteristics of the viewer, e.g. as stored in the viewer's eye database 210.

For example, in various embodiments, the requested illumination characteristics are used to determine a plurality of preset configurations necessary to obtain the requested illumination characteristics. Conversely, in various embodiments, the sensor data and/or the viewer's eye characteristics are used by the data processing unit 113 to adapt the preset configuration data in order to obtain the requested illumination (as a function of the sensor data) and/or the requested perceived illumination (as a function of the viewer's eye characteristics).

Accordingly, in various embodiments, also only a single preset configuration data item may be stored to the data structure 800 (identifying a requested illumination), and the variation of the illumination may be performed by varying the data transferred to the memory 1132.

In various embodiments, the preset configuration data may also take into account the shape and/or dimension of the object 140 to be illuminated. For example, knowing the shape and dimension of the object 140 to be illuminated, the expected illumination of the object 140 may be determined (e.g. as a function of the optical characteristics of the light fixture 110 and the distance from the object 140), and the preset configuration data may indicate that given light sources 117 should be switched off (power supply disabled), thereby illuminating only the object, such as a painting or a statue. For example, for this purpose a sensor, such as a camera, may directly monitor the illumination of the artwork 140.

Alternatively, the control system 130 may also obtain data identifying the shape and/or dimension of the object 140 to be illuminated. For example, these data may be obtained via a camera or by reading respective data from the artwork database 206. For example, as mentioned before, each artwork 140 may have an associated univocal (artwork) code, such as a QR code applied to the artwork 140, and a reader device may be used to obtain the univocal code. Accordingly, the control system 130 may obtain the data identifying the shape and/or dimension of the object associated with the univocal (artwork) code. However, any other method described in the foregoing may also be used to obtain data associated with an artwork 140, such as image recognition or via a user interface.

Thus, once having obtained the preset configuration data (which as mentioned before may be determined, e.g., as a function of the characteristics of the artwork 140, the exposition area 160, a requested illumination, etc.), the control system 130 may transmit the selected set of preset configurations to the light fixture 110 for storage into the data structure 800.

In various embodiments, the preset configuration data may also be obtained/determined with the previously described tool used to determine the configuration of light fixtures. In fact, this tool may already be used to determine requested illumination settings for the light fixtures 110 installed and/or to be installed in the exposition area 160. Accordingly, the tool may determine the preset configuration data to be stored to the data structure 800 of the light fixture(s), and data identifying the control commands CMD to be sent by the control system 130 to the light fixture(s), e.g. the univocal code (or sequence thereof) to be sent to the light fixture(s) 110. Accordingly, in various embodiments, the data structure 800 may include preset configuration data, wherein each preset configuration data item identifies one or more of:
- requested illumination data for the light fixtures 110 associated with one or more specific artworks and/or respective artists and/or epochs;
- a global color temperature or local color temperatures for each light source of subsets of light sources 117;
- data identifying whether given light sources 117 are switched on or off, e.g. in order to implement a virtual framer or gobo, wherein the transition between the activated and the deactivated light sources 117 may be abrupt (e.g. one pixel is switched on and an adjacent pixel is switched off) or smooth (e.g. a given first pixel is switched on and a given second pixel is switched off, and one or more intermediate pixels have a reduced intensity); and
- data identifying a set of light sources 117 to be supplied with a higher power supply, thereby generating highlighted areas of the artwork 140.

Generally, the above data have in common that they relate to the power supply parameters of each light source or set of light sources 117. Accordingly, in various embodiments, each preset configuration data item may comprise a respective parameter indicating a requested power supply or brightness level for each light source or set of light sources 117. For example, when considering a matrix of 6 x 8 = 48 light sources, each preset configuration data item may comprise 48 power supply parameters. Generally, in order to implement a framer or gobo, or to highlight given areas with high resolution, the light module 118 should comprise a significant number of light sources, such as at least 1000 light sources.

The data of each preset configuration data item may be stored in several ways in the data storage device 112. For example, when having a matrix of light sources 117, the data associated with each light source 117 may be stored as pixel data, wherein each pixel is associated with a given horizontal and vertical position of the module. For example, in case each pixel corresponds to a LED configured to emit light with a given color, the pixel data may comprise data identifying the intensity/power supply for the respective LED, for example in a range between 0 and 255. Conversely, in case each pixel corresponds to a set of a plurality of LEDs configured to emit light with different colors, the pixel data may comprise:
- data identifying the intensity/power supply for each of the LEDs; or
- data identifying the intensity/power supply for the complete set of LEDs, and the color of the light to be emitted by the set of LEDs, which may thus be used to calculate the intensity/power supply for each of the LEDs.
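
Purely by way of illustration, the following Python sketch shows one possible way to organize such pixel data in software; the type names (Pixel, PresetItem) are assumptions introduced for the example and do not appear in the figures:

    # Illustrative sketch: one way to organize the pixel data of a preset
    # configuration data item for a matrix of light sources 117.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Pixel:
        intensity: int                                 # power supply level, e.g. 0..255
        color: Optional[Tuple[int, int, int]] = None   # RGB for multi-LED pixels

    @dataclass
    class PresetItem:
        rows: int                                      # e.g. 6
        cols: int                                      # e.g. 8
        pixels: List[Pixel]                            # rows * cols entries, row-major

    # Example: a 6 x 8 matrix of single-color LEDs, all at half intensity.
    preset = PresetItem(6, 8, [Pixel(intensity=128) for _ in range(6 * 8)])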

Instead of storing the power supply data for the light sources, each preset configuration data item may also comprise data specifying a requested illumination pattern, such as pixel data specifying the color and brightness of a matrix of pixels, wherein this illumination matrix may also have a different resolution than the matrix of the light sources 117. For example, the illumination matrix may be represented by an image, such as an RGB image, wherein each pixel specifies the requested intensity and color of the illumination of a given area of the artwork 140. In this case, the data processing unit 113 may be configured to map the illumination matrix on the light module 118 by scaling the image according to the distance of the artwork from the light fixture 110 and the focal distance (or more generally the spatial radiation characteristics) of the light fixture 110, which could also be variable, e.g. in order to generate a light beam having approximately the dimension of the artwork 140 to be illuminated.

In various embodiments, the distance between the object 140 and the light fixture 110 may be entered manually or detected automatically, e.g. via a distance sensor. Moreover, during the mapping operation, the data processing unit 113 may also take into account the mounting height and inclination/orientation of the light fixture with respect to the artwork 140.
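
As a minimal sketch of the resolution aspect of such a mapping, the following function resamples a requested illumination image onto the resolution of the light source matrix; the function name and the nearest-neighbour resampling are assumptions, and the geometric corrections for distance, mounting height and orientation discussed above are deliberately left out:

    # Illustrative sketch: resampling a requested illumination image, given at
    # its own resolution, onto the resolution of the light source matrix.
    def map_illumination(image, led_rows, led_cols):
        """image: list of rows of intensity values (0..255), any resolution."""
        img_h, img_w = len(image), len(image[0])
        led = [[0] * led_cols for _ in range(led_rows)]
        for r in range(led_rows):
            for c in range(led_cols):
                y = int((r + 0.5) * img_h / led_rows)   # nearest source row
                x = int((c + 0.5) * img_w / led_cols)   # nearest source column
                led[r][c] = image[y][x]
        return led

    # Example: a 12 x 16 requested pattern mapped on a 6 x 8 LED matrix.
    pattern = [[min(255, 16 * (x + y)) for x in range(16)] for y in range(12)]
    led_values = map_illumination(pattern, 6, 8)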

In various embodiments, the mapping of the light sources 117 on the artwork 140 may also be determined automatically via the light fixture 110. Specifically, in various embodiments, the data processing unit 113 may be configured to switch on given light sources 117 or sets of light sources 117, and monitor, e.g. via a camera, the illumination of the artwork 140, thereby permitting each light source 117 to be associated with a given area of the artwork 140. In various embodiments, in particular in case the light fixture 110 comprises variable optics 115, the data processing unit 113 may also perform a plurality of iterations, wherein the data processing unit 113 also drives one or more actuators 114 of the light fixture 110 in order to regulate the focal distance of the light fixture 110, e.g. in order to obtain the setting of the optics wherein the dimension of the light beam generated by the light module 118 corresponds (approximately) to the dimension of the artwork 140.

Once having mapped the illumination image on the light module 118 and knowing the positions and characteristics of the light sources 117, the data processing unit 113 may calculate the requested power supply parameters for the light sources 117. Accordingly, in this case, the memory 1112 may comprise the original illumination image data or already the calculated power supply parameters for the light sources 117.

For example, the use of such an illumination matrix/image of requested illumination values has the advantage that a light artist does not need to know the characteristics of the light fixture 110, but has only to specify the requested illumination profile for the artwork 140, independently of the resolution of the light matrix. Moreover, a sequence of images, such as a film, could also be used. Thus, in various embodiments, the requested illumination data may be stored with conventional picture or video formats, such as GIF, PNG or MPEG, preferably with lossless data compression.

As mentioned before, in various embodiments, these preset configuration data relate only to a base illumination, which may be adapted via control commands CMD and/or as a function of sensor data.

For example, in various embodiments, the dimension and/or the position of a highlighted area may be changed by dynamically adapting the power supply parameters at the step 810. For example, for this purpose, the control system 130 may send a sequence of commands for adapting the parameters stored to the memory 1112 in order to change the dimension of and/or move the highlighted area according to a predetermined sequence. Additionally or alternatively, the control system 130 and/or the data processing unit 113 may receive a user input, such as gestures detected via a camera or commands received via a visitor’s smartphone. For example, a human or automatic guide could explain certain features of an object 140, and to show a respective feature, the guide could move the highlighting spot across the object 140. In this respect, the light spot could move gradually, or it could jump from one feature to the next, simply by reducing the intensity of the respective pixels at the first spot and increasing the intensity of the respective pixels at the second spot, as sketched below. Similarly, a viewer could move the highlighting spot across the object 140 via a remote control, such as the visitor’s smartphone having installed a suitable application, via gesture recognition using a sensor (e.g. a camera) connected to the light fixture 110 or the control system 130, or by interacting with some user interface, such as buttons.
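
A minimal sketch of such a cross-fade between two highlight spots is shown below; the step count, the highlight boost of 100 intensity units and the data layout are assumptions made for the example:

    # Illustrative sketch: moving a highlight from one feature to the next by
    # cross-fading pixel intensities, as described above.
    def crossfade(base, spot_a, spot_b, steps=10):
        """Yield successive pixel maps that fade spot_a out and spot_b in.
        base: dict {(row, col): intensity}; spot_a/spot_b: sets of (row, col)."""
        for i in range(steps + 1):
            t = i / steps
            frame = dict(base)
            for p in spot_a:
                frame[p] = int(frame.get(p, 0) + (1 - t) * 100)  # fading out
            for p in spot_b:
                frame[p] = int(frame.get(p, 0) + t * 100)        # fading in
            yield frame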

Generally, by using a plurality of light sources 117, also several areas can be highlighted at the same time.

Moreover, in addition to or as an alternative to the variation of the brightness, the parameters stored to the memory 1112 may be varied in order to change the color of given areas of the artwork. For example, in some older artworks, certain colors may have faded strongly. Increasing their intensity or luminosity may help to counteract the fading effect, i.e. the color could be perceived as strongly as originally intended by the artist. For instance, the color yellow has faded strongly on paintings by van Gogh, the color red on other old paintings and textiles. Increasing the intensity of yellow or red on these paintings, by illuminating the painting with a yellowish or reddish color, thus permits the yellow or red areas to be perceived at their original intensity. However, when changing the color of a global illumination, the other areas would also be illuminated with the different light color, which could be perceived as “unnatural”. Conversely, various embodiments of the light module 118 described herein permit the increased yellow or red illumination to be concentrated only on given areas, while the remaining areas may be illuminated with white color (e.g. light having a color temperature along the Planck curve).

Generally, the previously described solution may also be applied when a plurality of light fixtures 110 illuminate the same artwork 140.

For example, in this case, the control system 130, or a data processing unit 113 of a light fixture acting as a master device, may determine the illumination data for each light fixture 110 as described in the foregoing (in particular with respect to the association of the light sources 117 with given areas of the artwork, and the power supply parameters for the light sources 117) and then regulate the intensity of light emitted by each light fixture 110 in order to obtain the requested light intensity.

Alternatively, each of the light fixtures 110 could be used to illuminate a given sub-area of the artwork 140. However, when several light fixtures are used to illuminate an object, the illumination either needs to be well aligned, so that no dark border exists between the illuminated areas on the object, or the light fixtures can produce illumination areas which overlap. In this case, the control system 130 or the data processing unit 113 may monitor the intensity of the light in the overlapping area and may reduce the power supply of the light sources 117 (of the light fixtures 110) associated with this overlapping area such that the overall intensity of the illumination is homogeneous, as sketched below. For example, also for this purpose, the control system 130 or the data processing unit 113 may receive a user input or monitor the illumination of the artwork 140, e.g. via a camera.
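
The following sketch illustrates one conceivable blending rule for such an overlap; the proportional dimming rule and the helper name are assumptions, not a prescribed algorithm of the disclosure:

    # Illustrative sketch: reducing the power of the light sources associated
    # with an overlapping area so that the summed illumination stays
    # homogeneous across the seam between two fixtures A and B.
    def blend_overlap(intensity_a, intensity_b, target):
        """intensity_a/b: per-area contributions of fixtures A and B (same grid);
        target: requested homogeneous level. Returns per-fixture scale factors."""
        scales_a, scales_b = [], []
        for ia, ib in zip(intensity_a, intensity_b):
            total = ia + ib
            s = target / total if total > 0 else 0.0   # dim both proportionally
            scales_a.append(min(1.0, s))
            scales_b.append(min(1.0, s))
        return scales_a, scales_b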

Moreover, the light fixtures 110 could be used to implement different tasks, e.g. one or more first light fixtures 110 could be used to provide the basic illumination of the artwork 140 and one or more second light fixtures could be used to implement dynamic effects (step 806), correct the color of the illumination or highlight given areas of the artwork 140.

Accordingly, various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a light fixture comprising a plurality of light sources, a driver circuit configured to provide an individually controllable power supply to each of the light sources as a function of one or more control signals, a data storage device having stored at least one preset configuration data item, and a data processing unit comprising a memory. Specifically, in various embodiments, the method comprises: reading a preset configuration data item from the data storage device and storing the preset configuration data item into the memory; and generating the one or more control signals as a function of the configuration data stored to the memory.

Possible embodiments of this solution are detailed at the following point "Example 8".

Second embodiment of light fixture

As shown in Figure 32, and as already described with respect to Figures 5 to 7 and 25, in various embodiments, the light fixture 110 comprises a driver circuit 116 configured to provide a regulated power supply, such as a regulated current i_out, to a light module 118 comprising one or more light sources 117, such as LEDs or laser diodes.

Specifically, for this purpose the light fixture 110 comprises a power supply circuit 900 configured to provide a DC voltage V_bus. For example, the power supply circuit 900 may comprise the previously described circuits 116e, 116f and 116g configured to generate the (preferably regulated) voltage V_bus based on an AC input voltage V_in,AC received via two input terminals 116a and 116b.

Moreover, in the embodiment considered, the driver circuit 116 comprises a regulated voltage or preferably current source 902 configured to provide, via output terminals 116c and 116d, a regulated voltage V_out or preferably a regulated current i_out to a light module 118 comprising one or more light sources 117, such as LEDs or laser diodes. Specifically, as described in the foregoing, the regulated voltage or current source 902 may comprise:
- a switching stage 116h and an optional output filter 116i;
- a feedback circuit 116k configured to provide a feedback signal FB_116; and
- a control circuit 116m configured to generate one or more drive signals DRV_116 for the switching stage 116h as a function of the feedback signal FB_116.

Specifically, in various embodiments, the feedback circuit 116k is configured to provide a feedback signal FB_116 indicative of (and preferably proportional to) the output quantity to be regulated, such as the instantaneous or average value of the output voltage V_out or output current i_out. For example, Figure 7 uses a voltage measurement circuit configured to monitor the output voltage V_out, while Figure 32 shows a current sensor configured to monitor the output current i_out. Generally, various solutions are known for monitoring the output voltage V_out or output current i_out in an electronic converter. Usually, these solutions use a voltage or current sensor connected to the output of the switching stage 116h or the output terminals 116c, 116d. However, other solutions are also known, wherein the value of the output voltage V_out or output current i_out is estimated based on one or more voltage or current sensors configured to measure signals at the input of the switching stage 116h or intermediate signals within the switching stage 116h, such as the current received at the input terminals of the switching stage 116h or the current flowing through a component of the switching stage, such as a current flowing through an inductive element L_116 of the switching stage, e.g. the primary or secondary winding of a transformer of the switching stage 116h.

Accordingly, the control circuit 116m may be configured to generate the one or more drive signals DRV_116 in order to regulate, e.g. via a regulator circuit, the switching activity of the switching stage 116h until the feedback signal FB_116 corresponds to a requested value, e.g. indicative of a requested current i_ref to be provided to the lighting module 118. In various embodiments, the regulator circuit comprises at least one of: a proportional (P), low-pass filtering (PT1), integral (I), or derivative (D) component. For example, such a regulator circuit may be implemented with an active controller, e.g. implemented with an operational amplifier and at least one dynamics-compensating feedback branch having proportional, low-pass filtering, integral, and/or derivative characteristics. At least part of the regulator circuit may also be implemented via software instructions executed by the data processing unit 113.
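
As an illustration of the software variant mentioned above, the following sketch shows a plain PI regulator; the derivative and PT1 terms are omitted, and the gains and sampling period are placeholder assumptions, not values taken from the disclosure:

    # Illustrative sketch: a software PI regulator driving the switching
    # stage so that the feedback signal approaches the requested value.
    class PIRegulator:
        def __init__(self, kp, ki, dt):
            self.kp, self.ki, self.dt = kp, ki, dt
            self.integral = 0.0

        def update(self, setpoint, feedback):
            error = setpoint - feedback                 # e.g. i_ref - FB_116
            self.integral += error * self.dt            # integral (I) component
            return self.kp * error + self.ki * self.integral  # drive command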

Thus, essentially, the circuit 902 is a closed-loop regulated voltage or current source. For example, in Figure 32, the circuit 902 implements a current generator.

In the embodiment considered, the data processing unit 113 may thus set the value of the reference signal i_ref indicative of (and preferably proportional to) the requested output current in order to control the output current i_out. Generally, the reference signal i_ref may also be a voltage signal, e.g. in case the current sensor 116k, such as a shunt resistor, provides a voltage signal proportional to the output current i_out.

For example, as described in the foregoing, the data processing unit 113 may set the reference signal i_ref as a function of a control command CMD received from a control system 130 and/or one or more signals received from one or more sensors 120, such as light sensors, temperature sensors, etc.

For example, Figure 33 shows a possible embodiment of a block diagram of the data processing unit 113.

In the embodiment considered, the driver circuit 116 again comprises a regulated current generator 902 configured to regulate a signal FB1, indicative of (and preferably proportional to) the output current i_out, to a requested value i_ref. For example, in Figure 33, the signal FB1 is generated by a current sensor 116k1, such as a shunt resistor Rs connected in series with the output terminals 116c and 116d.

Specifically, in the embodiment considered, the data processing unit 113 is configured to determine a signal indicative of a requested light flux Φ_ref. For example, in the embodiment considered, a module 906, such as an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130 (see also Figure 30), may receive via the communication interface 110 a command CMD from the control system 130, such as a central control system of a complex lighting system 100, a smartphone or even only a dimmer. For example, as described in the foregoing, the command CMD may be used to load a preset configuration data item from the data structure 800.

In the embodiment considered, the signal indicative of a requested light flux Φ_ref is provided to a second module 908, such as an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130. Specifically, in various embodiments, the module 908 is configured to generate the signal i_ref as a function of the signal indicative of a requested light flux Φ_ref. In general, as described in the foregoing, the requested light flux Φ_ref may refer to a requested brightness value of the light emitted by the light fixture 110 or to a plurality of requested brightness values for a plurality of light sources 117 or sets of light sources. For example, such requested brightness values for a plurality of light sources 117 or sets of light sources 117 may be determined as a function of a global requested brightness value and requested color information, such as a requested color temperature. Thus, as described in the foregoing, the data identifying the requested light flux Φ_ref may be used to set the reference value i_ref for supplying a given set of light sources 117 with a given current i_out, wherein a relative dimming of the light sources 117 within the set of light sources 117 may still be performed by controlling the current flow through the light sources (see, e.g., the description of Figure 25). For example, as described with respect to Figure 30, the digital processing unit 1130 may store the requested power supply data, which may be used to determine or may also directly include the reference signal i_ref, to the memory 1132.

For example, when using one or more LEDs or laser diodes as light sources 117, the light flux is a function (at least) of the current i_out flowing through the LEDs, the voltage V_out at the LEDs and the temperature of the LEDs.

In various embodiments, the light system 100, e.g. directly the light fixture 110, comprises a light sensor 120a configured to generate a signal indicative of a light flux Φ generated by the light sources 117. In various embodiments, a plurality of light sensors 120a may also be used and/or the light sensor 120a may also provide color information. Accordingly, in the embodiment considered, the module 908 may be configured to vary/regulate the reference signal i_ref, e.g. via a PID regulator circuit, such that the signal indicative of a light flux Φ corresponds to the signal indicative of a requested light flux Φ_ref, e.g. by adapting the data stored to the memory 1132.

Conversely, Figure 34 shows an embodiment wherein the light fixture 110 comprises a temperature sensor 120b configured to generate a signal indicative of the temperature T of the light sources 117. Moreover, the driver circuit 116 comprises a second feedback circuit 116k2 configured to provide a signal FB2 indicative of (and preferably proportional to) the output voltage V_out. Accordingly, in this case, the module 908 may have stored a look-up table or a mathematical function which determines the value of the signal i_ref as a function of the signal indicative of a requested light flux Φ_ref, the feedback signal FB2 and the signal indicative of the temperature T of the light sources 117.
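
A minimal sketch of such a look-up table is shown below, here reduced to a temperature correction with linear interpolation between calibration points; the table values are invented, and the linear scaling of current with flux is a simplification (real LED flux is not perfectly linear in current):

    # Illustrative sketch: a look-up table returning the reference current for
    # a requested flux, correcting for the measured junction temperature T,
    # with linear interpolation between calibration points (cf. step 922).
    from bisect import bisect_left

    # (temperature in degC) -> current in mA needed for 1000 lm (invented data).
    CAL = [(25.0, 350.0), (50.0, 380.0), (75.0, 415.0), (100.0, 455.0)]

    def i_ref_for(flux_ref, temp):
        temps = [t for t, _ in CAL]
        k = min(max(bisect_left(temps, temp), 1), len(CAL) - 1)
        (t0, i0), (t1, i1) = CAL[k - 1], CAL[k]
        i_per_klm = i0 + (i1 - i0) * (temp - t0) / (t1 - t0)  # interpolate in T
        return i_per_klm * (flux_ref / 1000.0)  # scale with flux (assumed linear)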

Figure 37 shows in this respect an embodiment of the operation of the data processing unit 113.

In the embodiment considered, after a start step 920, the light fixture 110 and the respective light sources 117 are calibrated at a step 922. For example, in various embodiments, the light flux Φ is measured for a plurality of operating settings, such as for different values of the reference signal i_ref and (if supported) by setting different relative dimming levels for the light sources 117. As mentioned before, the measured light flux Φ may refer not only to the brightness but also to the color. Thus, essentially, the calibration phase 922 associates a given reference signal i_ref with a respective light flux Φ.

As mentioned before, the light flux Φ indeed often depends also on the voltage V_out at the light sources 117 (signal FB2) and the temperature T of the light sources 117. Accordingly, these data may also be monitored at the step 922 in order to generate a model of the light sources 117. Generally, respective data may also be derived from the datasheet of the light source(s) 117. Thus, at the end of the calibration phase 922, a model, e.g. in the form of a mathematical function or a data structure, may be stored in the module 908, which associates with each combination of requested light flux Φ_ref, feedback signal FB2 and signal indicative of the temperature T of the light sources 117 a respective reference signal i_ref.

Thus, in the embodiments shown in Figures 33 and 34, the module 908 sets at a step 924 the reference signal i_ref as a function of data identifying a requested light flux Φ_ref. The reference signal i_ref is then used by the driver 116 as the current setpoint for the output current i_out.

As described in the foregoing, the module 908 may also adapt at a step 926 the reference signal i_ref for a given requested light flux Φ_ref. For example, LEDs or laser diodes usually have a higher efficiency, i.e. a higher lumen output at a certain current, when the junction is cold, while in normal operation, when the junction has become hot, the lumen output decreases for the same current i_out. For example, the module 908 may use the measured light flux Φ in Figure 33, and the output voltage V_out and the temperature T in Figure 34.

The inventors have observed that the desired illumination intensity might not be reached when a light source 117, e.g. an LED, is damaged or the intensity of the light source 117 is degraded due to ageing. Similarly, it should often also be assured that the intensity of the light sources 117 does not increase above a certain threshold value, e.g. due to a malfunction in the control system 130, the module 908 or a short circuit in the wiring. This can be the case, e.g., for laser-based light sources 117, where an excessively high laser intensity could damage the eyes of a user. Similarly, as will be described in greater detail with respect to possible embodiments of light sensors, artworks 140 also often have a given maximum irradiation threshold value.

For example, the embodiment shown in Figure 33 performs a control of the light flux Φ. Accordingly, a degradation of the light sources 117 may be compensated via the feedback of the light sensor 120a. Similarly, the light sensor 120a may provide indications of an incorrect operation of the light sources 117, e.g. an excessive or insufficient intensity of the light emitted by the light sources 117 for a given reference value i_ref. However, this is not possible in the arrangement shown in Figure 34, wherein the requested signal Φ_ref is used in a feed-forward configuration and the signals FB2 and T are only used to correct the reference signal i_ref. Generally, in this case, the module 908 may also consider degradation of the light sources 117 in the model of the light sources 117, but a malfunction may not be detected easily.

In various embodiments, the driver circuit 116 may thus implement an overcurrent and/or overvoltage protection function. For example, in this case, the driver circuit 116 may comprise an electronic switch 904 configured to selectively disable the power supply of the voltage or current source 902, such as an electronic switch connected between the power supply circuit 900 and the voltage or current source 902, or an electronic switch of the switching stage 116h. For example, in this case, the voltage or current source 902, e.g. the control circuit 116m, may be configured to disable the power supply when the voltage V_out and/or the current i_out exceeds a respective threshold value.

Accordingly, the threshold value may be determined as a function of the maximum light flux value.

For example, when using laser diodes, the threshold should be set to a current value that limits the light output to a given maximum value when the junction is cold, thereby avoiding damage to the eyes of users. However, this implies that, when the junction becomes hotter (normal operation), the light output will decrease well below such a limit, decreasing the performance of the light fixture 110. Conversely, when illuminating artworks, the start-up phase with high intensity is usually short (e.g. several seconds) and the threshold value should be set to a current value that limits the light output when the junction is hot. However, as will be described in the following, artworks may have different maximum illumination thresholds.

Accordingly, a fixed maximum overcurrent threshold may be suitable from an electrical safety point of view, but usually is not sufficient to ensure a maximum light flux. Moreover, an overcurrent protection may not always ensure that malfunctions in the driver circuit 116, in particular the control circuit 116m, are handled correctly.

As shown in Figure 35, in various embodiments, a redundant solution is thus used. Specifically, in various embodiments, the module 908 again generates at a step 926 a reference signal i_ref, e.g. as a function of a requested light flux Φ_ref, and optionally a measured light flux Φ, or the output voltage V_out and/or the temperature T. Accordingly, in the embodiment considered, the module 908 provides the setpoint i_ref for the constant current regulator 902. The regulated current generator 902 measures the current i_out and regulates it to the desired value i_ref. This should assure a correct illumination during a normal operation of the light fixture 110.

In the embodiment considered, the signal i_ref is also provided to a module 910, which also receives a signal indicative of (and preferably proportional to) the output current i_out. In various embodiments, the module 910 may be connected to the current sensor 116k1 or to an additional current sensor. Preferably, the signal fed to the module 910 is indicative of (and preferably proportional to) the average value of the output current i_out. For example, in case the signal FB1 is indicative of the instantaneous value, the module 910 may receive a low-pass filtered version of the signal FB1. Generally, the module 910 may be an analog and/or digital hardware circuit or a software module executed by the digital processing unit 1130.

Specifically, in various embodiments, the module 910 is configured to compare at a step 930 the measured value, e.g. the signal FB1, and the requested value i_ref, independently of the constant current regulator 902.

Specifically, in various embodiments, the module 910 is configured to determine at a step 928 a lower threshold and an upper threshold as a function of the requested value i_ref. For example, in various embodiments, the module 910 is configured to calculate the upper threshold and the lower threshold by adding a given percentage of the value i_ref to the value i_ref and subtracting a given percentage of the value i_ref from the value i_ref, respectively. For example, the given percentage may be between 5% and 20%, e.g. 5, 10 or 20%.

In case the module 910 detects that the measured value, e.g. the signal FB1, is smaller than the lower threshold or greater than the upper threshold (output “Y” of the verification step 930), the module 910 generates at a step 932 an error signal indicating a malfunction of the light fixture 110 and the procedure terminates at a stop step 934.

For example, as shown in Figure 35, this error signal may be used to disable the power supply of the current regulator 902, e.g. via the electronic switch 904. Additionally or alternatively, the module 910 may send a warning signal to the control system 130, optionally including also data identifying the measured value, e.g. the signal FB1, and/or the reference signal i_ref. Generally, also in the absence of a malfunction, the module 910 may be configured to provide the measured value and/or the reference signal i_ref to the control system 130, e.g. periodically or in response to a given command CMD received from the control system 130.

In various embodiments, in case the module 910 detects that the measured value is between the lower threshold and the upper threshold (output “N” of the verification step 930), the module 910 does not signal an abnormal behavior and returns to the step 924, as sketched below.
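
A minimal sketch of this redundant check (steps 928-934 of Figure 35) follows; the 10% tolerance and the I/O helpers (read_fb1, disable_supply, report_error) are hypothetical stand-ins for the hardware described above:

    # Illustrative sketch of the supervision performed by the module 910.
    TOLERANCE = 0.10  # e.g. 10 % of i_ref

    def supervise(i_ref, read_fb1, disable_supply, report_error):
        lower = i_ref * (1.0 - TOLERANCE)   # step 928: lower threshold
        upper = i_ref * (1.0 + TOLERANCE)   # step 928: upper threshold
        fb1 = read_fb1()                    # averaged measure of i_out
        if fb1 < lower or fb1 > upper:      # step 930: out of range?
            disable_supply()                # e.g. open the electronic switch 904
            report_error(i_ref, fb1)        # step 932: signal the malfunction
            return False                    # step 934: stop
        return True                         # step 930 "N": back to step 924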

Conversely, Figure 36 shows an embodiment wherein the reference signal generated by the module 908 is not fed directly to the current generator 902. Specifically, in the embodiment considered, the module 908 again generates a reference signal i_ref. However, the reference signal i_ref is provided to the module 910, which generates the reference signal i_ref' for the current generator 902. Specifically, in this case, the module 910 is configured to set the signal i_ref' to the value of the signal i_ref. Next, the module 910 determines at the step 928 the upper and lower thresholds as described in the foregoing. Moreover, in case the module 910 detects that the measured value, e.g. the signal FB1, is smaller than the lower threshold or greater than the upper threshold, the module 910 may again generate at the step 932 an error signal indicating an abnormal behavior of the light fixture 110.

However, in this case, when the module 910 detects that the measured value is between the lower threshold and the upper threshold (output “N” of the verification step 930), the module 910 may adjust at a step 936 the value of the signal i_ref' in order to regulate the measured value to the requested value i_ref.

Thus, in this case, the module 910 implements a further closed control loop in addition to the control loop of the current regulator 902. However, this additional control loop may also be exposed to malfunctions. Accordingly, as shown in Figure 36, in various embodiments, the data processing unit 113 may comprise a “watchdog” circuit 912. For example, such a watchdog circuit 912 may be reset periodically via the module 910, e.g. at the step 936. If the watchdog detects a fault of the data processing unit, e.g. the watchdog circuit 912 is not reset within a given period of time, the watchdog circuit 912 may set the error signal indicative of an abnormal operation, which, e.g., disables (e.g. via the electronic switch 904) the power supply of the current regulator 902. Thus, in various embodiments, the data processing unit 113 implements three functions: the light output control function 906, the light flux regulation function 908 and the additional (e.g. software) current loop function 910.
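
The watchdog behaviour can be sketched as follows; the timeout value and the class/callback names are assumptions made for the example:

    # Illustrative sketch of the watchdog circuit 912: the module 910 must
    # reset it periodically (step 936); if it does not, the fault callback
    # disables the power supply of the current regulator 902.
    import time

    class Watchdog:
        def __init__(self, timeout_s, on_fault):
            self.timeout_s = timeout_s
            self.on_fault = on_fault              # e.g. opens the switch 904
            self.last_reset = time.monotonic()

        def reset(self):                          # called by module 910 at step 936
            self.last_reset = time.monotonic()

        def poll(self):                           # called by a periodic timer
            if time.monotonic() - self.last_reset > self.timeout_s:
                self.on_fault()                   # error signal: abnormal operation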

In various embodiments, the light output control function 906 provides data identifying a requested light flux Φ_ref of the light sources 117. For example, the requested light flux Φ_ref may be determined as a function of a requested and/or maximum illumination of an artwork.

In various embodiments, the light flux regulation function 908 translates the requested light flux Φ_ref into a current setting i_ref, possibly taking into account further data provided by the sensors 116k2, 120a and/or 120b. For example, in Figure 33, the module 908 may adapt the reference signal i_ref in order to regulate the light flux Φ. Alternatively, the module may adapt the reference signal i_ref as a function of the junction temperature T (see Figure 34) or according to a given time profile implicitly indicating the heating curve of the junction temperature, thereby compensating the different light flux during the heating phase of the light sources 117.

In various embodiments, the current loop function 910 provides the current setting i_ref to the driver 116 as a setpoint, redundantly compares the target i_ref or i_ref' with the real value, e.g. FB1, of the current i_out, and decides which measures to take in case of discrepancy.

As described in the foregoing, in various embodiments, the verification of the output current i_out may be performed via a digital processing unit 1130, e.g. via respective software instructions executed by a microprocessor. Accordingly, a digital sample of the output current i_out, e.g. the feedback signal FB1, has to be obtained (and, if required, also of the other signals provided by the sensors 116k2, 120a and 120b). Accordingly, the data processing unit 113 may also comprise an analog-to-digital converter 914 configured to provide a digital sample indicative of (and preferably proportional to) the output current i_out. Similarly, in case the regulated current generator 902 is configured to receive an analog reference signal, e.g. because the control circuit 116m is an analog control circuit, the data processing unit 113 may comprise a digital-to-analog converter 916 configured to receive a digital reference signal i_ref and provide an analog reference signal i_ref.

Generally, due to the fact that the module 910 only verifies the correct operation of the regulated current generator 902, such an analog-to-digital converter 914 may also be low-speed, operated, e.g., with a sampling frequency smaller than 1 kHz, preferably between 1 and 100 Hz, e.g. between 1 and 20 Hz.

Accordingly, various embodiments of the present disclosure relate to a method of operating a light fixture comprising a light module comprising one or more light sources, a power supply circuit configured to provide a DC voltage, a regulated current generator configured to provide an output current to the one or more light sources as a function of a reference signal, a current sensor configured to provide a first measurement signal indicative of the output current, and a data processing unit operatively connected to the regulated current generator and the current sensor. Specifically, in various embodiments, the method comprises executing the following steps via the data processing unit:
- setting the reference signal as a function of data identifying a requested illumination to be generated by the one or more light sources;
- determining an upper and a lower current threshold as a function of the reference signal;
- obtaining the first measurement signal;
- verifying whether the first measurement signal is between the upper and the lower current threshold; and
- in case the verification indicates that the first measurement signal is not between the upper and the lower current threshold, generating an error signal.

Possible embodiments of this solution are detailed at the following point "Example 9".

Optical elements

As described in the foregoing, in various embodiments, the light fixture 110 may include optics 115 comprising one or more optical elements. For example, the optics 115 may include one or more of the following optical elements: a lens and/or reflector, means for reducing glare, a diffuser or diffusive layer, optical filters for color changing, and/or a framer or shutter.

Figures 40A, 40B and 40C show possible embodiments of the optics 115. Generally, the light module 118 (comprising one or more light sources 117, such as LEDs) emits light. As described in the foregoing, in various embodiments, the brightness and/or the spectral characteristics of the light emitted by the light module 118 may be controllable. Specifically, in various embodiments, the brightness and/or the spectral characteristics of the light emitted by the light module 118 may be varied globally or locally. Accordingly, in general, the light emitted by the light module 118 has a given beam pattern.

Usually, the light emitted by the light fixture 110 should have a given beam angle or even a requested beam pattern. The requested beam angle/beam pattern may depend on the application needs. For example, a spotlight should have a small beam angle. Accordingly, usually a light fixture 110 comprises one or more optical elements 115 for varying the beam pattern of the light emitted by the light module, or even by each light source 117, in order to obtain a requested beam pattern, e.g. by focusing or expanding the light emitted by the light module 118 in order to obtain a requested beam angle.

For example, Figure 40A shows a first embodiment. In the embodiment considered, the light fixture 110 comprises a first optical element 115a configured to focus the light generated by the light module 118. For example, in various embodiments, the first optical element 115a may be configured to generate substantially parallel light rays. For example, the first optical element 115a may comprise one or more of:
- a reflector for the light emitted by the light module 118;
- a micro-reflector structure, wherein a micro-reflector is arranged in correspondence with each light source 117 or a set of light sources 117;
- a collimator lens or lens structure arranged in front of the light module 118; and
- a micro-lens structure, wherein one or more micro-lenses are arranged in correspondence with each light source 117.

In various embodiments, the light fixture 110 also comprises a second optical element 115b mounted at a given distance D from the first optical element 115a. For example, in case the first optical element 115a generates substantially parallel light rays, the second optical element 115b may comprise a second reflector and/or a second lens structure configured to focus or expand the light provided by the first optical element 115a in order to obtain the requested beam angle. For example, the second optical element 115b may comprise a convex lens or lens structure. As described in the foregoing, the distance D may also be variable (e.g. via an actuator 114 configured to move the second optical element 115b), thereby selectively varying the focal point (and thus the beam angle) of the light generated by the light fixture 110. Similarly, the distance between the light module 118 and the first optical element 115a could also be variable and, e.g., controlled by a further actuator 114.

As described in the foregoing, when illuminating an artwork 140, such as a painting having a rectangular form or a statue, the light beam should have a given form. Accordingly, in this case, one or more framers, shutters or gobos may be arranged in the light path between the light module 118 and the artwork 140. For example, Figure 40B shows an embodiment wherein the optics 115 comprise an optical element 115c, wherein the optical element 115c comprises a framer, shutter and/or gobo. Generally, the optical element 115c may be arranged along the light path between the light module 118 and the artwork 140, e.g.:
- in the absence of the second optical element 115b, between the optical element 115a and the artwork 140; or
- in the presence of the second optical element 115b, between the first optical element 115a and the second optical element 115b, or between the second optical element 115b and the artwork 140.

In various embodiments, one or more elements of the optical element 115c, such as the vertical and/or horizontal aperture of a framer, may be variable and, e.g., controlled by a further actuator 114.

Figure 40C shows an embodiment wherein the optics 115 of the light fixture 110 also comprise an optical element 115d, wherein the optical element 115d comprises a neutral density filter, diffuser, diffusive layer, and/or optical filters for color changing. Generally, the optical element 115d may also be arranged along the light path between the light module 118 and the artwork 140, e.g.:

- between the optical element 115a and the optical element 115b;

- between the optical element 115a (or 115b when used) and the artwork 140; or
- between the optical element 115a (or 115b when used) and the optical element 115c (when used).

For example, Figure 41A shows a typical illumination scenario, wherein a light fixture 110, e.g. fixed to the ceiling 161 of the exposition area 160, illuminates an artwork 140, e.g. fixed to a wall 163 of the exposition area 160. Accordingly, in such an illumination scenario, the artwork 140 is illuminated under a given angle. Moreover, often a light fixture 110 does not emit uniform light. Accordingly, as shown in Figures 41B and 41C, when measuring the local illumination values Φ of the artwork 140 via a light sensor 120, such as a camera, it may be observed that the illumination Φ of the artwork 140 often is not uniform, e.g. because:
- the illumination emitted by the light fixture 110 is not uniform; and/or

- the upper portion of the artwork 140 is closer to the light fixture than the lower portion of the artwork 140.

For example, line 1100 in Figure 41C shows an example of the measured light intensity Φ in the vertical direction y of the artwork 140.

Accordingly, in such a scenario, a diffuser 115d may be arranged in the light path between the light module 118 and the artwork 140, in order to generate a more uniform illumination emitted by the light fixture 110. For example, line 1102 in Figure 41C shows an example of a more uniform light intensity in the vertical direction of the artwork 140. However, this does not necessarily compensate the different distances of areas of the artwork 140 from the light fixture 110.

Moreover, in case a plurality of light fixtures illuminate the same artwork 140, there may be an overlap of the light generated by different light sources.

Finally, as also described in the foregoing, in various embodiments, a uniform illumination of an artwork 140 may not even be preferable; rather, specific areas of an artwork 140 should be illuminated more or less strongly.

Thus, the need is often felt to provide optics 115 which are able to transform the beam pattern of the light emitted by the light module 118 (possibly already transformed via one or more optical elements of the light fixture 110) into a requested beam pattern, which provides the desired illumination of an artwork 140. As described in the foregoing, such a beam pattern may be defined by a bi-dimensional matrix of intensity values in a plane perpendicular to the optical axis of the light fixture 110.

Method of producing a translucent optical element

Specifically, in various embodiments, a translucent optical element, such as a diffuser or neutral density filter, is used to perform such a conversion between the beam pattern of the light emitted by the light module 118 and the requested beam pattern.

Figure 42 shows an embodiment of the optics 115 of a light fixture 110 in line with the previous description. Specifically, in the embodiment considered, the optics 115 comprise:
- a translucent optical element 115_2;
- a first set of optical elements 115_1 arranged between the light module 118 and the diffuser 115_2; and
- optionally a second set of optical elements 115_3 arranged between the diffuser 115_2 and the artwork 140.

For example, as shown in Figure 40C, in various embodiments, the first set of optical elements comprises at least one lens and/or reflector 115a/115b, and the optional second set of optical elements comprises a framer, shutter or gobo 115c. This embodiment has the advantage that the optical transfer function of the second set of optical elements 115_3 does not further deform the beam pattern, but only sets given values of the beam pattern to zero. For example, in this case, the translucent optical element 115_2 may also be mounted in the aperture of the framer, shutter or gobo 115c, wherein the distance d2 is substantially zero.

However, in general, the translucent optical element 115_2 may be arranged at any position within the optics 115, preferably perpendicular to the optical axis of the light provided by the first set of optical elements, i.e. the second set of optical elements 115_3 may also comprise other optical elements, such as a lens, color filter, etc.

Figure 43 shows an embodiment of a tool and a respective method for implementing the translucent optical element 115_2. For example, such a tool may be implemented via software instructions, such as an application executed on a processing device, such as a smartphone or tablet, or a web application.

Generally, in the embodiment considered, the translucent optical element 115_2 should be mounted in a given plane 1106 at a distance d1 from the first set of optical elements 115_1. Optionally, a second set of optical elements 115_3 may be arranged at a distance d2 from the translucent optical element 115_2. In various embodiments, the plane 1106 is perpendicular to the optical axis 502 of the light provided by the light module 118 once having passed the first set of optical elements 115_1, i.e. the optical axis of the light provided by the first set of optical elements 115_1.

After a start step 1110, the tool determines at a step 1114 the “original” beam pattern of the light emitted by the light module 118 and having passed the first set of optical elements 115_1 in the plane 1106 (without the translucent optical element 115_2).

For this purpose, the tool may receive the beam pattern of the light provided by the light fixture 110 and optionally the optical transfer function of the second set of optical elements 115_3. Generally, methods for determining the beam pattern of light (also between optical elements) are per se well known in the art.

For example, in a first embodiment, the tool may determine at the step 1114 the original beam pattern in the plane 1106 by receiving at the step 1114 directly the beam pattern in the plane 1106, which, e.g., may be measured by removing the optional second set of optical elements 115_3. In a second embodiment, when the second set of optical elements 115_3 is not used or has been removed temporarily, the tool may determine the original beam pattern in the plane 1106 by receiving at a step 1112 a beam pattern measured in a plane perpendicular to the optical axis 502 at a distance greater than the distance d1, and by calculating at the step 1114 the beam pattern in the plane 1106 via geometrical projection of the measured beam pattern.

In a third embodiment, when the second set of optical elements 115_3 is used, the tool may determine the original beam pattern in the plane 1106 by receiving at the step 1112 a beam pattern measured in a plane perpendicular to the optical axis 502 at a distance greater than the distance d1 + d2, and by calculating at the step 1114 the beam pattern in the plane 1106 via geometrical projection of the measured beam pattern and as a function of the optical transfer function of the second set of optical elements 115_3 (which may be rather simple in case the second set of optical elements 115_3 comprises only a framer, shutter or gobo).

In a fourth embodiment, the tool may determine the original beam pattern in the plane 1106 by receiving at the step 1112 a beam pattern measured in a plane arranged at a given angle with respect to the optical axis 502, e.g. the beam pattern of the illumination of the portion of the wall where the artwork should be positioned (see also Figure 41A). Accordingly, in this case the tool may first calculate at the step 1114 the beam pattern in a plane perpendicular to the optical axis 502 as a function of the measured beam pattern and the position of the (measurement) plane with respect to the light fixture 110, and then proceed as mentioned for the second or third embodiment, i.e.:

- when the second set of optical elements 115_3 is not used or has been removed temporarily, calculating the beam pattern in the plane 1106 via geometrical projection of the beam pattern in the plane perpendicular to the optical axis 502; or

- when the second set of optical elements 115_3 is used, calculating the beam pattern in the plane 1106 via geometrical projection of the beam pattern in the plane perpendicular to the optical axis 502 and as a function of the optical transfer function of the second set of optical elements 115_3.

For example, such beam patterns may be measured by illuminating a reference surface 1104, preferably a Lambertian surface, with the light fixture 110 and measuring the illumination/luminance of the reference surface, e.g. via a camera 120. For example, the camera of the processing device executing the tool may be used for this purpose. Accordingly, in the fourth embodiment, the reference surface may be a wall of the exposition area 160 where the artwork 140 should be fixed.
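
Under simplifying assumptions (a point-like source and measurement grids normalised to the beam footprint at each distance), the geometrical projection back into the plane 1106 reduces to an inverse-square intensity scaling, as sketched below; a real implementation would also handle the lateral remapping for non-normalised grids:

    # Illustrative sketch: projecting a beam pattern measured at a distance
    # d_m back into the plane 1106 at distance d1 < d_m. With the stated
    # assumptions the pattern keeps its shape while the flux density grows
    # with the inverse square of the distance to the source.
    def project_back(measured, d_m, d1):
        gain = (d_m / d1) ** 2   # inverse-square intensity scaling
        return [[v * gain for v in row] for row in measured]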

Similarly, the tool determines at a step 1118 the “requested” beam pattern of the light emitted by the light module 118 and having passed the first set of optical elements 115_1 in the plane 1106 (with the translucent optical element 115_2). For example, as described in the foregoing, a given artwork 140 should usually be illuminated with a given requested illumination pattern, which may be either uniform or custom, e.g. in order to highlight given areas of the artwork and/or to compensate overlapping illuminations of a plurality of light fixtures 110. Thus, in various embodiments, the tool may determine the modified/requested beam pattern in the plane 1106 by receiving at a step 1116 requested illumination values, e.g. represented via a bi-dimensional matrix in the plane of a painting (or other flat artworks), or via values mapped on the surface of a three-dimensional model of a statue (or other 3D artworks). Accordingly, in this case the tool may first calculate at the step 1118 the requested beam pattern in a plane perpendicular to the optical axis 502 as a function of the requested illumination values and the position of the artwork 140 with respect to the light fixture 110, and then proceed as mentioned before, i.e.:

- when the second set of optical elements 115_3 is not used, calculating the requested beam pattern in the plane 1106 via geometrical projection of the requested beam pattern; or

- when the second set of optical elements 115_3 is used, calculating the requested beam pattern in the plane 1106 via geometrical projection of the requested beam pattern and as a function of the optical transfer function of the second set of optical elements 115_3.

Thus, essentially, the steps 1114 and 1118 provide the original beam pattern, which should be converted into a requested/modified beam pattern via the translucent optical element 115_2. Accordingly, at a step 1120 the properties of the translucent optical element 115_2 may be determined which permit the original beam pattern to be transformed into the modified beam pattern. Next, the tool may provide at a step 1122 the technical specification of the translucent optical element 115_2, which may be used to produce the translucent optical element 115_2. Finally, the translucent optical element 115_2 may be mounted in the above-described position in the optics 115, i.e. at a distance d1 from the first set of optical elements 115_1, and the procedure terminates at a stop step 1126.

In the following, possible embodiments of the steps 1120 and 1122 will be described. Specifically, as described in the foregoing, in various embodiments, the translucent optical element 115_2 should be configured to convert an original beam pattern, identified via a first matrix of light intensity values, into a modified/requested beam pattern, identified via a second matrix of light intensity values.

As shown in Figure 44, in various embodiments, the translucent optical element 115_2 is implemented with a translucent material. For example, such a translucent material may be implemented:
- e.g. in case of a neutral density filter, with a light-absorbing base material 1150; and/or
- e.g. in case of a diffuser, with opaque and/or scattering particles 1152 dispersed in the base material 1150.

Thus, the transmittance of a given area of the translucent optical element 115_2 may be modified, e.g., by:

- varying the density of the scattering particles 1152; and/or

- varying the thickness L of the translucent optical element 115_2.

Accordingly, in general, the translucent optical element 115 2 is implemented with a translucent material 1150, 1152 comprising a first surface 1154 for receiving a light intensity F 1 and an opposite second surface 1156 for providing an attenuated second light intensity F' . Specifically, given a light beam along an axis z being perpendicular to the plane 1006 of the translucent optical element 150 2 , the light beam passes through the thickness L of the translucent optical element 150 2 and is attenuated. Specifically, after dividing the material thickness L into small slices, perpendicular to the light beam direction, each having a sufficient small thickness dz , the light flux F that emerges from a given slice is reduced by a given amount άF(z) with respect to the light flux F entering the same slice, which can be approximated as follows: άF(z ) = — m(z) F(z) dz where m(z) is the attenuation coefficient of the translucent optical element and F(z) is the light flux entering a given slice. Generally, as mentioned before, the attenuation m(z) may result from absorption within the base material 1150 and/or absorption/scattering at the particles 1152. From a mathematical point of view, this leads to a first-order ordinary differential equation: which may be solved as: where L is the thickness of the translucent optical element, F 1 is the light flux entering the translucent optical element and F 1 is the light flux exiting the translucent optical element. Thus, the absorption depends on the thickness L of the translucent optical element. Specifically, in various embodiments, the translucent optical element has an (approximately) uniform attenuation factor m m(z) = m

Accordingly, in this case, the previous equation may be simplified:

Φ' = Φ₁ · e^(−μ·L)

Thus, the following light transmission ratio T of the translucent optical element may be defined:

T = Φ'/Φ₁ = e^(−μ·L)

Thus, knowing a requested transmission ratio T, the tool may calculate the requested thickness L of the translucent optical element as:

L = −ln(T)/μ

For instance, for a requested transmission ratio T = 50% and an attenuation coefficient μ = 50.27 m⁻¹ (see below), the requested thickness would be L = −ln(0.5)/(50.27 m⁻¹) ≈ 13.8 mm.

Accordingly, in various embodiments, the tool may be configured to calculate at the step 1120 a matrix of transmission ratios T(x,y) as a function of the original beam pattern, essentially comprising a matrix of light flux values Φ₁(x,y), and the modified/requested beam pattern, essentially comprising a matrix of light flux values Φ'(x,y).

Specifically, in various embodiments, the tool may obtain a first matrix of first light intensity values Φ₁, wherein each first light intensity value Φ₁ is associated with a respective area of the surface 1154 and identifies the intensity of light expected to enter the respective area of said first surface 1154. Similarly, the tool may obtain a second matrix of second light intensity values Φ' having the same dimension as the first matrix, wherein each second light intensity value Φ' is associated with a respective area of the surface 1156 and identifies the intensity of light requested to exit the respective area of the surface 1156 when the expected intensity of light enters said first surface.

Based on these data, the tool may thus calculate a matrix T(x,y) of light transmission ratios having the same dimension as the first matrix and the second matrix. Specifically, each light transmission ratio T may be calculated as a function of a respective first light intensity value Φ₁ and a respective second light intensity value Φ', e.g. in case of absolute intensity values, by calculating the ratio between the respective second light intensity value Φ' and the respective first light intensity value Φ₁.

Generally, the light transmission ratio T may be between 0 and 100%, i.e. the translucent optical element may only reduce the light flux entering the translucent optical element. Accordingly, in various embodiments, in case one or more of the values of the requested beam pattern are greater than the respective values of the original beam pattern, the intensity of the light emitted by the light module 118 may be increased and a new original beam pattern may be obtained (either by estimating the new beam pattern or by performing new measurements). Similarly, in various embodiments, in case all values of the requested beam pattern are significantly smaller than the respective values of the original beam pattern, the intensity of the light emitted by the light module 118 may be reduced and a new original beam pattern may be obtained (either by estimating the new beam pattern or by performing new measurements). For example, in typical applications, the intensity of the light emitted by the light module 118 should be varied such that the values of the matrix of transmission ratios T(x,y) are between 10% and 90%, preferably between 20% and 80%.

Once having calculated the matrix of transmission ratios T(x,y), the respective thickness L(x,y) of the translucent optical element may be calculated as a function of the respective transmission ratio T(x,y) and the attenuation coefficient μ of the material of the translucent optical element. Specifically, the tool may calculate a matrix L(x,y) of thickness values having the same dimension as the matrix T(x,y) of light transmission ratios, wherein each thickness value L is calculated as a function of a respective light transmission ratio T and the attenuation coefficient μ of said translucent material. Accordingly, the matrix L(x,y) of thickness values identifies the requested thickness of the translucent material between the surface 1154 and the surface 1156 in order to obtain the intensity of light requested to exit the surface 1156 when the expected intensity of light enters the surface 1154.
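This computation of the step 1120 fits in a few lines; the following is a minimal sketch (not the disclosed tool itself), assuming absolute intensity values, a uniform attenuation coefficient μ and NumPy; the function name and the warning threshold values are illustrative only:

    import numpy as np

    def thickness_matrix(phi_in, phi_out, mu, t_min=0.10, t_max=0.90):
        """Compute the thickness matrix L(x,y) of a translucent optical element.

        phi_in  -- matrix of expected entering intensities (original beam pattern)
        phi_out -- matrix of requested exiting intensities (requested beam pattern)
        mu      -- uniform attenuation coefficient of the material, in 1/m
        """
        phi_in = np.asarray(phi_in, dtype=float)
        phi_out = np.asarray(phi_out, dtype=float)

        # Per-area transmission ratio T = phi_out / phi_in (absolute intensities).
        T = phi_out / phi_in
        if np.any(T > 1.0):
            raise ValueError("Requested pattern exceeds original pattern; "
                             "increase the light module intensity and re-measure.")
        if np.any(T < t_min) or np.any(T > t_max):
            print("Warning: some ratios fall outside the preferred range; "
                  "consider adjusting the light module intensity.")

        # Beer-Lambert: T = exp(-mu * L)  =>  L = -ln(T) / mu
        return -np.log(T) / mu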

Generally, the translucent optical element thus has a conversion portion having a dimension corresponding to the dimension of the beam pattern in the plane 1106, or at least to the area of the requested beam pattern in the plane 1106 having non-zero values, which thus (in use) receives light from the first set of optical elements 115₁ and modifies the beam pattern. In various embodiments, the translucent optical element may also comprise a peripheral portion, which e.g. may be used to mount the diffuser in the light fixture 110.

For example, as described in the foregoing, the second set of optical elements 115₃ may only comprise a framer, shutter or gobo. Such elements have a diameter between 10 mm and 100 mm. Accordingly, the translucent optical element should also have a similar dimension for the conversion portion, plus a peripheral portion for mounting the translucent optical element 115₂ in the light fixture 110. Moreover, as mentioned before, the translucent optical element 115₂ may also be mounted in the aperture of the framer, shutter or gobo 115₃. Accordingly, in this case, the conversion area (with variable thickness) should have a profile complementary to the aperture of the framer, shutter or gobo.

Figure 45A shows a first embodiment of a translucent optical element 115₂. For example, assuming that the light pattern covers an area of 100 mm x 100 mm, the translucent optical element 115₂ should also comprise a conversion area having a height and width of 100 mm. For example, based on the requested and original beam pattern, it has been determined that the transmission ratio T should be uniform in the horizontal direction (x), and increase linearly in the vertical direction from T(y = 0 mm) = 30% to T(y = 100 mm) = 80%. For example, the transmission ratio T(y) is shown in Figure 46A, which essentially has a linear profile:

T(y) = a·y + b

with, in the example, a = 0.5%/mm and b = 30%. Based on this equation the thickness L of the translucent optical element may be calculated as follows:

L(y) = −ln(a·y + b)/μ

In order to calculate the above thickness values, the tool may thus discretize the surface of the translucent optical element in the vertical direction (y) into a given number N of stripes (each one having a width of 100 mm and a height of 100 mm/N), as shown in Figure 45A.

For example, Figure 46B shows an example of the thickness L of the translucent optical element 115₂ for a uniform material having a constant attenuation coefficient μ = 50.27 m⁻¹, which is a rather high attenuation coefficient obtained e.g. via PLEXIGLAS Satinice DF22 8N from the Evonik company.
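As a concrete illustration of this stripe discretization, the following sketch computes one thickness value per stripe, assuming the linear profile above and the cited attenuation coefficient; the number of stripes N is chosen arbitrarily here:

    import numpy as np

    MU = 50.27        # attenuation coefficient in 1/m (cf. Figure 46B)
    N = 50            # number of horizontal stripes (illustrative choice)
    HEIGHT = 0.100    # height of the conversion area in m

    # Stripe centers along y, and the linear profile T(y) = a*y + b
    y = (np.arange(N) + 0.5) * (HEIGHT / N)
    T = 0.30 + (0.80 - 0.30) * y / HEIGHT

    # Per-stripe thickness from Beer-Lambert: L = -ln(T)/mu
    L = -np.log(T) / MU
    # Roughly 23.6 mm at the lowest stripe (T near 30%) down to
    # roughly 4.6 mm at the highest stripe (T near 80%).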

Generally, as also shown in Figure 45B, with an arbitrary transmission ratio matrix T(x,y) the same discretization may also be performed in the horizontal direction. In this way, a respective thickness value L(x,y) of the translucent optical element may be calculated for each pixel/value T(x,y).

In various embodiments, e.g. in case only low-resolution beam pattern matrixes are available, the tool may also calculate a higher-resolution matrix L'(x,y), e.g. via interpolation of the low-resolution matrix L(x,y). For example, in various embodiments, the resolution of the thickness matrix used for producing the translucent optical element is greater than 25 dpi (dots per inch), preferably greater than 100 dpi, e.g. between 200 and 1200 dpi.
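One possible way to perform such an upsampling is standard image interpolation; the following sketch is illustrative only, assuming SciPy is available and an assumed target resolution of 300 dpi:

    import numpy as np
    from scipy.ndimage import zoom

    def upsample_thickness(L, size_mm, target_dpi=300):
        """Interpolate a low-resolution thickness matrix L to target_dpi.

        size_mm -- physical (height, width) of the conversion area in mm
        """
        target_px = tuple(int(round(s / 25.4 * target_dpi)) for s in size_mm)
        factors = tuple(t / s for t, s in zip(target_px, L.shape))
        return zoom(L, factors, order=3)   # cubic spline interpolation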

Generally, any suitable production method may be used for producing the translucent optical element at the step 1122 as a function of the dimensional data, in particular the thickness matrix L(x,y).

For example, the translucent optical element 115₂ may be produced via an injection molding process, e.g. for mass production. Conversely, for lower production numbers a material removal process may be used, in which the thickness of a block of material is reduced in order to correspond to the requested thickness values L(x,y). In various embodiments, due to the fact that the translucent optical element 115₂ may also be custom for a specific illumination scenario, the translucent optical element 115₂ is produced via additive manufacturing, i.e. a 3D printing process. Finally, in various embodiments, e.g. in case of a neutral density filter, the thickness matrix L(x,y) may refer to the thickness of a coating to be applied to a flat and uniform substrate, such as a glass substrate, e.g. having a thickness between 1 mm and 2 mm. For example, such a coating may be a metallic coating, which may be applied, e.g., via sputter deposition, vacuum evaporation, etc.

For example, the base material 1150 of the translucent optical element may be made from a plastic material, such as a thermoplastic material, e.g. polycarbonate (PC) or acrylic/polymethyl methacrylate (PMMA), a silicone or a glass material. For example, in this case, the absorption/scattering within the material 1150 may be obtained via absorbing and/or scattering particles 1152 distributed in the base material 1150.

For example, the particles 1152 may be Al₂O₃, SiO₂, TiO₂, etc. Generally, also various different particles 1152 may be mixed into the base material 1150. For example, a high refractive index silicone (HRI silicone), e.g. with a quantity of 0.5 - 5.0 wt%, may be mixed into a low refractive index silicone (LRI silicone), which permits obtaining a turbid mixture which, after curing, has diffuse properties.

Accordingly, various embodiments of the present disclosure relate to a method of producing a translucent optical element for a light fixture, wherein the translucent optical element is implemented with a translucent material comprising a first surface for receiving a light radiation and an opposite second surface for providing an attenuated second light radiation, wherein the second surface is arranged at a given variable thickness from the first surface. Specifically, in various embodiments, the method comprises the steps of: obtaining a first matrix of first light intensity values, wherein each first light intensity value is associated with a respective area of the first surface and identifies the intensity of light expected to enter the respective area of the first surface; obtaining a second matrix of second light intensity values having the same dimension as the first matrix, wherein each second light intensity value is associated with a respective area of the second surface and identifies the intensity of light requested to exit the respective area of the second surface when the expected intensity of light enters the first surface; calculating a matrix of light transmission ratios having the same dimension as the first matrix and the second matrix, wherein each light transmission ratio is calculated as a function of a respective first light intensity value and a respective second light intensity value; obtaining an attenuation factor of the translucent material; calculating a matrix of thickness values having the same dimension as the matrix of light transmission ratios, wherein each thickness value is calculated as a function of a respective light transmission ratio and the attenuation factor of the translucent material, and wherein the matrix of thickness values identifies the requested thickness of the translucent material between the first surface and the second surface in order to obtain the intensity of light requested to exit the second surface when the expected intensity of light enters the first surface; and producing the translucent optical element by shaping the translucent material as a function of the matrix of thickness values.

Possible embodiments of this solution are detailed at the following point "Example 11".

Determining configuration of light sensors

The previous solutions are useful in order to determine a suitable set of light fixtures 110 and/or the respective configuration for illuminating one or more artworks 140 in an exposition area 160. However, as described in the foregoing, the artwork(s) 140 may also be exposed in exposition areas 160 where background light levels are not always negligible.

Background light might be due to natural (sun) light, i.e. daylight as a natural light source, or to artificial light sources. For example, background illumination (i.e. illumination in addition to the light generated by one or more light fixtures 110 in order to illuminate a given artwork 140) may be provided, e.g., by: natural or artificial light sources located outside the exposition area 160, e.g. outdoors or within another room/exposition area 160, wherein light enters the exposition area 160 through an opening of the exposition area 160, e.g. a window 164 or a door 165; and/or other light fixtures 110 installed in the same exposition area 160 and intended to illuminate other artworks or the exposition area 160 itself, e.g. the floor 162.

For example, in Figure 12, natural/sun light may enter through the window 164, artificial light from another room may enter through the door 165, and the light fixture 110₃ intended to illuminate the artwork 140₃ may in part also illuminate the artwork 140₄.

Accordingly, background light may vary during the exposition of the object/artwork 140, both in spectrum and in intensity distribution. Usually, natural background light does not have an exactly cyclic behavior, e.g. due to a variable daylight intensity during different days of a year or due to variable weather conditions. Moreover, also artificial background light may have variable characteristics, e.g. because lamps degrade and their brightness varies over the long term, e.g. over months or years.

Accordingly, in general, the resulting light (having a given spectrum and intensity) which illuminates a specific object 140 may vary during an exhibition, thereby resulting in a different illumination of the object 140 with respect to the target/requested illumination, e.g. as defined during a light project by a light designer. For example, such a different/variable illumination of an artwork 140 may create aesthetic issues, because the appearance of the artwork 140 changes, e.g. over a single day, a season, or a year. Moreover, insofar as background light essentially increases the illumination of an artwork 140, using only the characteristics of the light fixtures intended to illuminate the artwork 140 may result in an unrealistic estimation of light dosimetry. For example, light dosimetry is often used to define the exposure limit in terms of illuminance level and duration of the exposure of an artwork. In various embodiments, the characteristics of the (natural and/or artificial) background light should thus be measured, and the characteristics of the light emitted by the light fixtures 110 should possibly be adapted as a function of the measured background light characteristics.

One of the simplest solutions consists in executing manual measurements of the illuminance level and the light spectrum at each object 140 whose illumination should be controlled. This approach is, of course, limited in temporal accuracy, as the measurements are only performed for a short moment, and in the reproducibility of the measurement by the operator.

Accordingly, automated luminance measurement solutions are preferable. Specifically, in an automated solution, the brightness and/or light spectrum of the light emitted by one or more light fixtures 110 is automatically controlled and optionally varied as a function of the measured luminance characteristics, e.g. in order to regulate the light characteristics to requested brightness and/or color values.

In various embodiments, the sensors 120 may thus include luminance/light sensors installed in the exposition area 160. For example, as described in the foregoing, such a light sensor 120 may be a 2D color sensor configured to acquire pixel data, wherein (at least) a subset of the pixels comprises the object 140 to be illuminated, e.g. a painting, such as a camera configured to acquire a picture/image of the object 140. In various embodiments, the sensor 120 and/or the control system 130 may be configured to process the pixel data, e.g. in order to: extract the color data of the subset of pixels associated with the object 140; calculate a mean color value for the subset of pixels; and/or calculate color values for sub-segments of the subset of pixels.

Accordingly, in this case, the pixel data and/or the calculated color values are indicative of the brightness and the color reflected by the object 140. In general, also only the brightness data may be used, e.g. by using mere luminosity sensors, such as a monochrome (e.g. grayscale) camera, or by converting the color data into monochrome (e.g. grayscale) pixel data.
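For instance, such pixel processing might look as follows; this is a minimal sketch (not the disclosed implementation), assuming an RGB image as a NumPy array and a boolean mask marking the pixels belonging to the object 140:

    import numpy as np

    def object_color_stats(image, mask):
        """Mean color and mean grayscale brightness of the object pixels.

        image -- HxWx3 RGB array; mask -- HxW boolean array (True = object 140)
        """
        pixels = image[mask].astype(float)       # color data of the object subset
        mean_color = pixels.mean(axis=0)         # mean R, G, B over the subset
        # Luma conversion to monochrome brightness (ITU-R BT.709 weights)
        brightness = pixels @ np.array([0.2126, 0.7152, 0.0722])
        return mean_color, brightness.mean()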

For example, the control system 130 may be configured to regulate the intensity of light emitted by the light fixtures 110 as a function of the brightness data. For this purpose, the target/requested values for the illumination of the object 140 are obtained. For example, as mentioned before, these target data may be stored in an artist illumination database 208. However, these data may also include maximum illumination values, which may be stored, e.g., in the artwork database 206.

Generally, the requested and/or maximum illumination values may also be determined by (manually or automatically) classifying the artwork 140, wherein requested and/or maximum illumination values are associated with each class of artwork.

Accordingly, in various embodiments, the control system 130 is configured to compare the measured brightness values with the target and/or maximum values, and to regulate the intensity of light emitted by the light fixture(s) 110. Substantially, this involves activating different subsets of light fixtures and/or different subsets of light sources within the light fixtures, and/or performing a dimming operation of the light emitted by the light fixtures 110. Similarly, in case the lighting system comprises light sources with different spectral/color characteristics, the control system 130 may also be configured to regulate the spectral characteristics of the light emitted towards the object 140, essentially by varying the brightness of the light emitted by the various light sources in order to obtain a combined light with given spectral characteristics.
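As an illustration only (the disclosure does not prescribe a specific control law), such a brightness regulation could be a simple clamped proportional controller; the gain and the dimming range below are assumptions:

    def regulate_dimming(dim_level, measured_lux, target_lux, max_lux, gain=0.1):
        """Return an updated dimming level (0.0-1.0) for a light fixture.

        Reduces light immediately if the maximum value is exceeded; otherwise
        nudges the measured brightness towards the target value.
        """
        if measured_lux > max_lux:
            # Hard limit: scale down proportionally to the overshoot.
            dim_level *= max_lux / measured_lux
        else:
            # Proportional step towards the target brightness.
            error = (target_lux - measured_lux) / target_lux
            dim_level *= 1.0 + gain * error
        return min(max(dim_level, 0.0), 1.0)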

For example, in this context, the documents WO 2016/203423 A1, WO 2018/189007 A1 or US 2017/265267 A1 may be cited, which are incorporated herein by reference for this purpose.

As will be described in greater detail in the following, the characteristics of the illumination of an object 140 may be measured e.g. via: a light sensor positioned in proximity of the object 140, thereby measuring the light received at the object 140; a light sensor positioned in proximity of the light fixture 110, thereby measuring the light emitted by the light fixture 110, which permits calculating the light received at the object 140 as a function of geometrical data specifying the position of the object with respect to the light fixture(s) 110; a light sensor, such as a camera, configured to measure the characteristics of the light reflected by the object 140; and/or a light sensor, such as a camera, configured to measure the characteristics of the light reflected by a reference surface positioned in proximity of the object 140.

Generally, the selection of the sensor(s) to be used depends on the application needs, e.g. with respect to the costs involved and the characteristics of the exposition area 160 and/or the artwork 140. Usually, the selection of light sensors 120 is performed by a light designer who creates a light plan for a given exposition area 160. For example, the installation characteristics of the light sensors 120 also change based on the selected light sensors.

For example, the simplest solution for measuring the luminance of an object 140 is based on sensors installed in proximity of the object 140. However, this approach is often not welcome, because it requires installing an electronic device in the proximity of the object. Furthermore, this may have an aesthetic impact, because a sensor 120 has to be placed near or even on the object 140. Moreover, a possible malfunction of the sensor 120 may create a risk for the artwork 140, e.g. due to fire generated by the sensor 120. Finally, the installation of such sensors 120 is often complex, because a power supply has to be provided or a battery of the sensor 120 needs to be replaced regularly. Generally, also such installation/maintenance operations may damage the object 140.

Conversely, the solutions which measure the luminance of the object 140, i.e. the characteristics of the light reflected by the object 140, are often rather limited in performance, because reflected light has specular and diffusive components which are not easy to characterize. For example, for this reason, the reflected light changes, both in brightness and in spectral response, with the angle of incidence of the light generated by the light fixture(s) 110 and with the angle of observation. Moreover, the reflectivity of an object may also vary over time, e.g. due to a variable temperature.

As mentioned before, the selection of the sensors 120 is usually performed manually by a light designer. Conversely, in various embodiments, a method is proposed which (at least in part) automatically assesses and evaluates the exposition situation and provides a recommendation as to which kind of light sensor 120 (sensor type, sensor combination, operating parameters, placement of the sensors relative to the art object) should be used.

For example, such a method may be implemented with a software program to be executed by one or more computers. In various embodiments, such a computer comprises a data processing unit, a data storage and a display with a graphical user interface (GUI). For example, the method may be implemented with a web application executed by a web server and/or an app to be executed by a mobile device, such as a smartphone or tablet. Various embodiments thus relate to a tool (a device with implemented software programs and user interface) that comprises and uses the above-mentioned components and methods.

In various embodiments, this software tool for selecting the sensors 120 (sensor selection tool) may also be combined with the software tool used to determine the configuration of the light fixtures 110 (light fixture selection tool). Specifically, as described in the foregoing, the light fixture selection tool permits selecting the light fixtures and optionally the respective configuration to be used to illuminate a given artwork 140. Preferably, the light fixture selection tool already uses a 3D model of the exposition area 160, which permits generating a simulation of the illumination of the exposition area 160. Specifically, in various embodiments, each light fixture 110 has associated data identifying the position of the light fixture, its intensity and its radiation pattern. Alternatively, similar data may also be entered manually by a light designer.

Accordingly, once the 3D model of the exposition area 160, including the artworks 140 and the light fixtures 110, has been obtained, the sensor selection tool may similarly perform simulations of the illumination of the exposition area 160. Generally, when the sensor selection tool and the light fixture selection tool are combined, these simulations may also correspond to the simulations performed for selecting the position of the light fixtures 110. Generally, instead of using a 3D model, a simplified simulation may also be performed by using only a two-dimensional model of the exposition area 160 (more or less as shown in Figure 12).

Specifically, as mentioned before, the illumination of an artwork 140 may include both the light generated by the light fixture(s) 110 dedicated to the specific object 140 and (natural and/or artificial) background light. Accordingly, the light sensors 120 have to be placed in positions which permit an efficient monitoring of the illumination of the artworks.

In general, the selection of the light sensors 120 may be performed manually or automatically. For example, based on the characteristics of the artwork, the tool may propose a suitable sensor for each artwork 140. The tool may also propose a plurality of suitable sensors, and one of the sensors may be selected manually, e.g. by a light designer. Generally, the tool may not propose sensors 120 for all artworks 140, but only for those artworks 140 which are indeed exposed to a variable background light.

Figure 13 shows a possible embodiment of the operation of the sensor selection tool. In the embodiment considered, after a start step 400, the sensor selection tool determines the artworks exposed to variable background light by: simulating the illumination of the exposition area 160 when the light fixtures 110 are switched off; and/or simulating the illumination of the exposition area 160 by varying (e.g. between a minimum value and a maximum value) the light intensity of the light sources associated with background light, e.g. the (virtual) light sources used to simulate light entering through the doors 165 and/or windows 164 of the exposition area 160 (e.g. between the minimum and maximum value of sunlight).

Similarly, as mentioned before, an artwork 140 may also be illuminated indirectly by light fixtures 110 used to illuminate other artworks 140. Accordingly, in case these light fixtures generate a variable illumination, the combined illumination may also be variable.

For example, for this purpose, the sensor selection tool may select at a step 402 a given configuration for the natural and artificial light sources in the 3D model and simulate the configuration at a step 404. These steps are then repeated, e.g. for a given number of possible configurations. For example, this is schematically shown via a verification step 406, which verifies whether all (background) light scenarios have been simulated. In case other light scenarios have to be simulated (output "N" of the verification step 406), the tool returns to the step 402 for selecting another light scenario. Conversely, in case all light scenarios have been simulated (output "Y" of the verification step 406), the tool proceeds to a step 408.

In various embodiments, the tool may determine at the steps 408 and 410 the artworks to be monitored via light sensors. For example, in various embodiments, the tool determines at the step 408, for each artwork 140, a minimum and a maximum value of the background light (generated by other natural and/or artificial light sources) as a function of the illumination simulations. In various embodiments, the tool may use these values in order to determine at the step 410 a set of light sensors to be used. For example, in case the difference between these values (representing the variability of the background light) exceeds a given threshold, the tool may determine at the step 410 that a light sensor 120 should be used to monitor the respective artwork 140.

Additionally or alternatively, the tool may determine at the step 408 for each artwork 140 other illumination characteristics, e.g. the maximum brightness and/or the spectral characteristics. As mentioned before, these illumination characteristics should correspond to given target values and/or be below a maximum threshold value. Accordingly, in this case, the tool may determine at the step 410 whether the illumination characteristics (determined at the step 408) correspond to given target values and/or are below a maximum threshold value. For example, in case the illumination characteristics (determined at the step 408) do not correspond to given target values and/or are not below the maximum threshold value, the tool may determine at the step 410 that a light sensor 120 should be used to monitor the respective artwork 140.
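The scenario loop of the steps 402-410 may be sketched as follows; this is a minimal illustration only, assuming a simulate_illumination function provided by the (hypothetical) 3D simulation engine and an arbitrary variability threshold:

    def select_monitored_artworks(model, scenarios, simulate_illumination,
                                  variability_threshold_lux=20.0):
        """Steps 402-410: sweep background-light scenarios, then flag artworks
        whose background illumination varies more than a given threshold.

        simulate_illumination(model, scenario) is assumed to return a dict
        mapping each artwork id to its simulated illuminance (lux).
        """
        per_artwork = {}
        for scenario in scenarios:                      # steps 402/404/406
            result = simulate_illumination(model, scenario)
            for artwork_id, lux in result.items():
                per_artwork.setdefault(artwork_id, []).append(lux)

        monitored = []
        for artwork_id, values in per_artwork.items():  # steps 408/410
            if max(values) - min(values) > variability_threshold_lux:
                monitored.append(artwork_id)
        return monitored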

For example, as shown in Figure 14, the sensor selection tool may recommend the use of light sensors 120₁, 120₂ and 120₄ for monitoring the artworks 140₁, 140₂ and 140₄, e.g. because:

- the artworks 140₁ and 140₂ are illuminated by variable natural light entering through the window 164; and

- the artwork 140₄ is illuminated by natural light entering through the door 165 and in part by light generated by the light fixture 110₃.

Conversely, in the example considered, the sensor selection tool determines that the illumination of the artwork 140₃ does not vary significantly during the simulation and thus determines that no light sensor 120 is required to monitor the artwork 140₃, i.e. a light sensor 120 may be omitted for the artwork 140₃.

Generally, the sensor selection tool may not only determine the artworks 140 to be monitored via a respective light sensor 120, but may also determine at the step 410 which light sensor 120 should be used to monitor a given artwork; alternatively, a user may also select preferred light sensors.

In various embodiments, the tool may also acquire data identifying the position of already installed light sensors 120 within the exposition area 160. For example, also in this case, data already stored in a database, e.g. in the exposition area database 204, may be used, or the data may be received via a manual input or at least in part automatically, e.g. by acquiring image data of the exposition area 160.

For example, the (new) light fixtures 110₁ and 110₂ may already include light sensors configured to measure the light emitted by the respective light fixture 110. In this case, the tool may also provide the data, in particular the geometric data (distance and relative position of the artwork with respect to the light fixture), needed to calculate the light received at the object 140 as a function of these geometrical data. Conversely, insofar as the light fixtures 110₃ and 110₄ are already installed, the tool proposes the installation of a new light sensor. For example, considering that a 3D object 140₄ has to be illuminated, which renders it rather difficult to install a light sensor 120 or a reference surface in the proximity of the object 140₄, the sensor selection tool may propose the installation of a light sensor 120₄, such as a camera, configured to measure the characteristics of the light reflected by the object 140₄. Thus, in general, the tool may be configured to select a light sensor type as a function of the type and/or characteristics of the object 140, and/or the characteristics of the exposition area 160.

In various embodiments, the sensor selection tool may permit the user to confirm or change these recommendations, e.g. with respect to the number, type and/or position of the light sensors 120.

Accordingly, the light fixture selection tool may be used to determine (at least in part) automatically a light plan specifying the illumination of an exposition area 160 based on the available light fixtures 110 and/or new light fixtures 110 to be installed, possibly also taking into account other natural or artificial light sources, and/or the reflectivity of the objects 140 and/or the exposition area 160. For example, due to the fact that each object 140 may be illuminated by different natural or artificial light sources, the light fixture selection tool may vary the light plan in order to optimize the overall illumination taking into account the local illumination required at the objects 140.

Conversely, the sensor selection tool may be used to include in such an (automatically generated or manually inserted) light plan also a sensor system used to monitor and optionally adapt the illumination of the artworks 140 in response to variable illumination, in particular with respect to possible background light. Specifically, in various embodiments, the sensor selection tool may provide proposals as to which artworks 140 should be monitored via light sensors 120, and/or a light designer can define or change the artworks 140 to be monitored. For this purpose, the sensor selection tool may determine which light fixtures 110 illuminate each object 140 and, as a consequence, have to be controlled to regulate and/or verify the illumination, e.g. in order to assure that the illumination of an object 140 is below the chosen threshold.

In various embodiments, the sensor selection tool is also configured to determine which light sensor should be used to monitor a given artwork, e.g. as a function of the characteristics of the artwork 140 and/or the exposition area 160 and/or the respective light fixture used to illuminate the artwork 140 (e.g. because the light fixture already includes a light sensor 120). In various embodiments, the sensor selection tool may also determine the geometrical positioning of the light sensors 120.

As mentioned before, the light sensors 120 may be used to regulate the illumination of an artwork 140 to requested target values (e.g. in terms of color and/or brightness) and/or to verify whether the illumination is below a given maximum value. Accordingly, the sensors 120 may be connected to the control system 130 in order to control and optionally regulate the luminosity of the respective artwork 140 by sending control commands to the light fixtures.

In general, each of the light sensors 120 provides only data indicative of the illumination of the respective artwork 140. For example, a light sensor 120 positioned in proximity of an artwork 140 is usually not illuminated with the same (maximum) illumination as the artwork 140, because the sensor 120 is usually positioned outside the center of the beam angle. Accordingly, in general, each light sensor has a given sensitivity function, which permits calculating/estimating the illumination at the artwork as a function of measured (1D or 2D) data, such as brightness and/or color pixel data. For example, the sensitivity of a given light sensor 120 may be a function of the color, the angle with respect to the light fixture, the distance with respect to the artwork 140, etc.

In various embodiments, the sensor selection tool may thus also perform at a step 412 one or more illumination simulations of the 3D model with the positioned light sensors 120. Specifically, for this purpose, the tool may acquire the characteristic data of the light sensors 120, which may be stored in a sensor database 218, which e.g. may form part of the local and/or remote database 200, and/or may be determined as a function of the position of the light sensor 120 with respect to the respective light fixture 110 and optionally the respective artwork 140. Accordingly, in various embodiments, the sensor selection tool may determine (for one or more different illumination situations) the expected measured value(s) for one or more of the light sensors 120 and the respective expected illumination of the associated artwork 140. These data may then be used to determine respective control information for the control system 130, to be used to control and optionally regulate the light emitted by the light fixtures 110. For example, the sensor selection tool may determine a respective threshold value for the measured value as a function of the expected illumination at the artwork and the maximum value for the illumination of the artwork.
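For instance, under the simplifying assumption (not stated in this form in the disclosure) that the sensor reading scales linearly with the illumination at the artwork, such a sensor-side threshold could be derived as follows:

    def sensor_threshold(expected_sensor_value, expected_artwork_lux, max_artwork_lux):
        """Map the maximum admissible artwork illuminance to a sensor-side threshold.

        Assumes the sensor reading is proportional to the illuminance at the
        artwork, with the proportionality factor taken from the simulation
        (expected_sensor_value vs. expected_artwork_lux).
        """
        sensitivity = expected_sensor_value / expected_artwork_lux
        return sensitivity * max_artwork_lux

    # Example: simulated reading 85 at 120 lx, maximum 150 lx -> threshold 106.25
    threshold = sensor_threshold(85.0, 120.0, 150.0)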

Accordingly, once the sensor selection tool has been used to determine a given set of light sensors, the procedure terminates at a stop step 414. Next, the lighting system 100 may be installed and the respective control information (determined by the light fixture selection tool and/or the sensor selection tool) may be provided to the control system 130. Accordingly, in operation, the control system 130 is able to receive the data from the light sensors 120 and verify these data, e.g. verify whether the measured values are above the previously mentioned threshold, and/or control the operation of the light fixtures 110, e.g. reduce the light intensity of a light fixture 110 when the respective measured value exceeds the threshold and/or in order to regulate the measured values to given reference values (e.g. to requested brightness and/or color values). Accordingly, various embodiments of the present disclosure relate to a method of selecting at least one light sensor for a lighting system used to illuminate at least one artwork in an exposition area via one or more light fixtures configured to emit light with variable characteristics as a function of a control command. Specifically, in various embodiments, the method comprises the steps of: obtaining a digital model of the exposition area, the digital model including:

o exposition area data comprising data identifying the dimension of the exposition area;

o artwork data comprising data identifying the position of the at least one artwork within the exposition area;

o light fixture data comprising data identifying the position, orientation and illumination characteristics of the one or more light fixtures; and

o background illumination data comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within the exposition area 160;

executing a plurality of illumination simulations of the digital model of the exposition area by varying the illumination characteristics of the one or more light fixtures and/or the illumination characteristics of the other natural and/or artificial light sources, and determining for each illumination simulation data identifying a respective expected illumination of each of the at least one artwork; and determining a set of light sensors for monitoring the illumination of the at least one artwork as a function of the data identifying the expected illumination of the at least one artwork.

Possible embodiments of this solution are detailed at the following point "Example 4".

Light sensors

As mentioned before, the light sensors 120 and the control system 130 may be used to monitor the illumination of an object 140. Such a system can be used, for example, in lighting systems for artworks 140, such as paintings, graphics, photographs or lithographs, textile compositions etc., to protect them against damage caused by excessive illumination or, in general, by irradiation with electromagnetic waves. Such objects 140 are typically illuminated by light fixtures 110, such as spotlights. As described in the foregoing, such light fixtures 110 illuminate the object 140 with light having a given frequency spectrum and a given light intensity, which may be selected based on the characteristics of the object 140, such as the respective light sensitivity of the surface of the object.

The above-mentioned damage can have various physical or chemical causes. On the one hand, a process of photochemical decomposition can be initiated by absorption of high-energy light quanta of the incident radiation in the molecules of the corresponding object surface. Since specific activation energies have to be exceeded for a corresponding direct cleavage of the molecules, usually short-wave light, especially UV light, can lead to greater damage.

On the other hand, chemical reactions can also be caused by light absorption of impurities or foreign substances, the so-called sensitizers, which transfer the absorbed energy to the actual reaction partner. The often long-chain polymer molecules can then be destroyed by radical formation (oxidation) or catalysis. Temperature and humidity can also play a role. In any case, even long-wave light, especially in the visible wavelength range, can have a damaging effect on the surface of the object.

Light in the infrared wavelength range may also cause damage, for example through thermal expansion of the surface (mechanical tension), through dry damage resulting in cracking, or through phase transitions in plastics or glass materials that are not initially visible, with obvious consequences for the surface quality.

As a result, the material and fabric properties of the irradiated object surface play an important role in the light resistance of the object 140. This is particularly relevant due to the numerous pigments used in artists' paints. For this reason, these - like binders and carrier materials such as paper or textiles - are usually divided into light sensitivity categories or light fastness classes, which then allow exhibitors to take individual restrictive measures with regard to the exposure of objects to light.

However, for reasons of general comprehensibility and also to facilitate verifiability, museums regularly apply simplified rules that set common upper limits for maximum illuminance, e.g. 50 lx for paper and textiles or 150 lx for oil paintings, without differentiating between pigments, etc. This unified approach often prevents good illumination of low-contrast, delicate objects in cases where light-stable pigments or materials are used, which makes observation difficult, especially for older people. Nor does this approach take into account spectrally dependent light sensitivity. In addition, the total radiation exposure accumulated over time is generally decisive, which can be determined, for example, within the framework of monitoring by so-called blue scales or LightCheck strips. Furthermore, measurements can also be made using so-called light data loggers, wherein the light irradiation is recorded periodically via a light sensor 120. For example, for this purpose, the measured values can be temporarily stored in a data memory of the sensor 120 and/or the control system 130, and used to calculate the light exposure through temporal integration. For example, based on these data, the light sensor 120 and/or the control system 130 may determine/estimate a resulting color deviation and/or a remaining useful life. A special example of such a system is described, for example, in the publication WO 2013/017287 A1. However, the objects 140 are often not uniformly illuminated. For example, as mentioned before, the maximum intensity of the illumination is usually in the center of the object 140, but not at the edge, where the light sensor 120 may be installed. Therefore, the light sensor 120 will often not measure the actual light irradiation of the object 140, but only a representative value.
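Such a temporal integration of the light exposure is straightforward; the following sketch (illustrative only, assuming periodic lux samples and an exposure budget expressed in lux-hours) accumulates the dose recorded by a light data logger:

    def accumulate_exposure(samples_lux, interval_s, budget_lux_hours):
        """Integrate periodic illuminance samples into a light dose (lux-hours)
        and report the fraction of the exposure budget already consumed.
        """
        dose_lux_hours = sum(samples_lux) * interval_s / 3600.0
        return dose_lux_hours, dose_lux_hours / budget_lux_hours

    # Example: one sample per minute at ~50 lx over an 8-hour day (400 lux-hours)
    dose, used = accumulate_exposure([50.0] * 480, 60.0, budget_lux_hours=120000.0)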

Accordingly, as mentioned before, the light sensor 120 to be used to monitor a given artwork 140 should be selected taking into account the characteristics of the artwork 140, the light fixture 110 and/or the exposition area 160. For example, in some cases, low-cost light sensors 120 may be sufficient, while for other artworks 140 more precise solutions may be required, e.g. in order to monitor the illumination also of sub-areas of the object 140 and/or in order to take into account the various materials of the artwork 140.

In various embodiments, the measured data provided by the light sensor 120 and/or the illumination calculated as a function of the measured data (or the variations thereof) may be stored in the data storage unit to provide a track record for the illumination of the object 140.

First embodiment of light sensor

Figure 15 shows a first embodiment of a lighting system 100 configured to monitor the illumination/irradiation of an object 140 with light 500 generated by a light fixture 110. In the embodiment considered, the object 140 is an artwork, such as an oil painting with a layer of paint 144 containing pigments on a carrier material 141 such as canvas. The object 140 may also comprise a frame 143 and a surface 142 formed by the color layer 144 and illuminated by the light fixture 110 with the light 500.

In the embodiment considered, the light fixture 110 is a spotlight, but it can also be a floodlight, downlight or another type of light fixture. In the embodiment considered, the light fixture comprises one or more light sources 117, such as LEDs, of which only one light source 117 is schematically shown in Figure 15. The light emitted by the light sources 117 is focused or expanded by optics 115. For example, as mentioned before, the optics 115 may comprise a lens, a lens system comprising a plurality of lenses, and/or a reflector. In various embodiments, the distance between the light sources/LEDs 117 and the optics 115 or parts thereof (e.g. objective lenses) may be adjustable, for example, to achieve a desired beam widening or focusing, so that, for example, an optimum and as homogeneous as possible illumination of the surface 142 of the object 140 is achieved.

In the embodiment considered, a light sensor 120₁ is integrated in (the housing of) the light fixture 110. Specifically, in the embodiment considered, the light sensor 120₁ is placed in the area of the light 500 emitted by the light sources/LEDs 117, e.g. between the light sources/LEDs 117 and the optics 115, within the optics 115 (e.g. between two lenses or within a reflector), or after the optics 115.

In the embodiment considered, the light sensor 120₁ is configured to generate a measurement signal indicative of the brightness of the light 500 passing through its active surface. Generally, the light sensor 120₁ may also provide a plurality of brightness values for respective colors/wavelengths.

In the embodiment considered, the light sensor 120₁ is connected to a control device. This control device may be a data processing unit 123 of the light sensor 120₁, a data processing unit 113 of the light fixture 110 and/or a data processing unit 133 of the control system 130 (see also the description of Figures 2 to 4).

For example, in the embodiment considered, a data processing unit 113 of the light fixture 110 is configured to receive the measurement signal provided by the light sensor 120₁ and to transmit the measured data indicative of the luminance to a data processing unit 133 of the control system 130. For example, in various embodiments, the control system 130 may comprise or consist of a smartphone or tablet, and the data processing unit 113 may transmit the measured data via wireless communication interfaces 111 (within the light fixture) and 131 (within the control system 130) to the control system 130. For example, the communication may be via a WLAN and/or ZigBee connection. Intermediate stations (not shown) like bridges or even routers (Internet) can be provided, i.e. the communication between the light fixture 110 and the control system 130 may be at least in part wireless.

Thus, in the embodiment considered, the measurement data are transmitted to the data processing unit 133 of the control system 130, i.e. the data processing unit 133 receives (directly or indirectly) the measurement data from the light sensor 120₁.

In various embodiments, the data processing unit 133 obtains the measurement data periodically, preferably at short time intervals, such as every second. For example, the data processing unit 113 may periodically obtain the measurement data from the light sensor 120₁ and transmit these data to the data processing unit 133, or the data processing unit 133 may send a control command to the data processing unit 113 requesting a new measurement. In various embodiments, the duration of the time interval may thus be controllable, e.g. via the data processing unit 113 or the data processing unit 133.

In general, the measured luminance depends on the placement of the light sensor 120₁, e.g. on the distance from the light sources/LEDs 117, on the angular deviation from an optical axis 502 (see also Figure 17) of the light fixture 110 (if, for example, the luminous intensity has a maximum value along this optical axis 502 and decreases with increasing angular deviation from the optical axis 502, i.e. the light fixture does not represent a Lambertian emitter), and on the orientation of the light sensor 120₁ with respect to the light radiation 500. In any case, in various embodiments, the light sensor 120₁ is positioned at a fixed position, where the measured luminance is indicative of (and preferably proportional to) the total light 500 emitted onto the surface 142 of the object 140.

As an alternative or in addition to the light sensor 120₁, the light fixture 110 may comprise a current and/or voltage measuring device. For example, as described with respect to Figures 5 to 7, the light sources 117 (LEDs) may be supplied with a power supply generated by a driver/power supply 116, such as an electronic converter. In such an electronic converter, the current fed to the light sources 117 (LEDs) may be measured, e.g. via the feedback circuit 116k. The current intensity generally correlates with the illumination intensity. Accordingly, also in this case, the measured current value may be transmitted to the data processing unit 133 of the control system 130.

In various embodiments, the data processing unit 133 may thus process the transmitted data in order to determine the luminance of the light sources 117. Alternatively, the data processing unit 113 may already process (at least in part) the measured data and transmit processed data.

For this purpose, in various embodiments, the control system 130 may comprise a data storage device 132 (or, similarly, the light fixture 110 may comprise a data storage device 112). As mentioned before, the control system 130 may also be implemented in a distributed mode (see also the description of Figure 8). Accordingly, such a data storage device 132 may be local and/or remote. Specifically, in various embodiments, the data storage device 132 (or 112) stores a mathematical function or a (look-up) table providing a relationship between a measured luminance value and/or a measured current value and a respective actual luminance of the light sources 117 (such as a luminous flux, luminous intensity, etc.). For example, these characteristics of the light sensor 120₁ and/or the current sensor 116k may be stored in a sensor database 218. For example, the values for such a table may be determined in advance by experiments and measurements.

In various embodiments, this mathematical function or table may also take into account the variation of the relationship between the current measurement and the actual luminance of the light sources 117 due to ageing effects or degradation of the light sources 117, according to which the light output converted from the current usually decreases over time, i.e. the function or table may also take into account the total operating time of the light sources 117.

Accordingly, in various embodiments, the processing operation is based on the total operating time of the light sources 117. For example, this information may be monitored within the light fixture 110 and transmitted to the control system 130. For example, the control system 130 may store these data in the light fixture database 202. For example, Figure 15 schematically shows a timer module 504 of the data processing unit 113, which is configured to monitor the operating time of the light sources 117. In various embodiments, the data processing unit 133 of the control system 130, e.g. the smartphone, may thus be configured to determine the luminance or an analogous quantity (luminous flux, luminous intensity, etc.) of the light sources 117 as a function of the measured current value and the total operating time.
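Such a look-up could be implemented, for example, as a bilinear interpolation over a calibration table; the sketch below is illustrative only, and the table values are placeholders that would have to be determined in advance by experiments and measurements:

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Placeholder calibration data: luminous flux (lm) measured in advance
    # as a function of LED current (mA) and total operating time (hours).
    currents_mA = np.array([100.0, 350.0, 700.0])
    hours = np.array([0.0, 10000.0, 50000.0])
    flux_lm = np.array([[120.0, 110.0, 95.0],     # 100 mA row
                        [400.0, 370.0, 320.0],    # 350 mA row
                        [760.0, 700.0, 600.0]])   # 700 mA row

    lookup = RegularGridInterpolator((currents_mA, hours), flux_lm)

    # Estimated luminous flux at 500 mA after 20000 operating hours
    flux = lookup([[500.0, 20000.0]])[0]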

As mentioned before, the light sensor 120₁ may also provide a plurality of luminance values as a function of the wavelength or for different wavelength ranges, such as the IR range, the visible range and/or the UV range. For example, for this purpose, several light sensors may be included in the sensor 120₁, each covering a different wavelength range.

Conversely, when measuring the current, only the current intensity can be measured, which does not provide an indication of the wavelength spectrum. However, the data storage device 112 or 132 may also comprise data identifying the distribution of the illuminance over the wavelength ranges for the types of light sources 117 used in the light fixture 110. For example, these data may be stored in the light fixture database 202. Also this information may have been determined and entered in advance by experiments and measurements. In various embodiments, a shift in the spectrum of the emitted light caused by ageing of the light sources 117 may also be taken into account in the function or table. This means that the distribution of the luminance over the wavelength ranges is taken into account depending on the current total operating time of the light sources 117.

In various embodiments, instead of approximating the ageing of the light sources 117 only via the total operating time, more complex degradation models may also be used, which e.g. take into account other parameters as well, such as the current fed to the light sources 117, the operating temperature of the light sources, etc.

By taking the wavelength range into account, the monitoring of the irradiation load of the surface 142 of the object 140 is further improved, since the shorter the relevant wavelengths are, the more harmful the illumination usually is for the object 140. This is expressed, for example, in a fading of red colors or color pigments that absorb the more energetic blue radiation, while blue colors or color pigments reflect the blue radiation more strongly.

The radiation emitted by the light fixture 110 hits the surface 142 of the object 140, on which an intensity distribution can be determined or calculated by the system 100. For example, Figure 16 shows an example of the iso-intensity lines of the light fixture, wherein two maxima I1, I2 are highlighted, one (I1) in the middle of the surface and a smaller one (I2 < I1) displaced with respect to the first one. For example, the second one may be an artifact resulting from the radiation characteristic of the light fixture. Specifically, in the example, this second maximum occurs in the direction of an upper side 142a of the surface 142, which is positioned closer to the light fixture 110 than the opposite lower side 142b of the object 140. Although the intensity maximum I2 is smaller than I1, this intensity can be decisive for the adjustment of the target and/or maximum intensity of the light fixture as a whole, since it may illuminate e.g. an area of the object 140 that is particularly light-sensitive, while the higher intensity maximum I1 may be located on a more light-insensitive part of the object 140. Figure 16 also shows that the intensity profile may be enlarged slightly towards the lower side 142b. The geometrical arrangement causing this type of illumination is shown in Figure 17. As described in the foregoing, the object 140 may be fixed or suspended on or near a wall 163 of a room/exposition area 160, and due to the fact that the light fixture may be mounted at a given height, such as at the top of a room ceiling 161, the object 140 is obliquely illuminated by the light fixture 110.

The cone of emitted light 500 expanded/focused by the optics 115 of the light fixture 110, or its optical axis 502, falls on the object 140 at an angle α deviating from the surface normal of the surface 142 (i.e. the angle to the plane of the surface is 90° − α).

According to various embodiments, the data processing unit (e.g. 113 or 133) is configured to determine the illumination of the object 140 also as a function of the angle α, i.e. the data processing unit takes into account the oblique incidence. In various embodiments, the angle α may be stored, e.g. in the light fixture database 202 or the exposition area database 204. Additionally or alternatively, an inclination sensor 120₂ is fixed to or forms part of the light fixture 110, i.e. the inclination sensor 120₂ is configured to measure the inclination of the light fixture and transmit the result, e.g., to the data processing unit 113 or directly to the control system 130.

In various embodiments, the data processing unit (e.g. 113 or 133) is configured to determine the illumination of the object 140 also as a function of the distance from the wall 163 or the artwork 140. In various embodiments, this distance may be stored, e.g. in the light fixture database 202 or the exposition area database 204. Additionally or alternatively, a distance measurement sensor 120₃ may be fixed to or form part of the light fixture 110, i.e. the distance measurement sensor 120₃ is configured to measure the distance of the light fixture 110 to the wall 163 or the artwork 140 and transmit the result, e.g., to the data processing unit 113 or directly to the control system 130. For example, in various embodiments, the distance measurement sensor 120₃ may be an ultrasonic sensor configured to measure a distance d between the light fixture 110 and the wall 163 to which the object 140 is attached.

For example, in various embodiments, the control system 130 is configured to receive the data from the sensors 120₂ and 120₃ and store these data in the exposition area database 204. Generally, the control system 130 may also store other data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of the object 140. As mentioned before, the data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 (e.g., the distance d and/or the angle of inclination α) may also be entered manually and stored, e.g. in the light fixture database 202 and/or the exposition area database 204.

In various embodiments, also further spatial positioning data, such as a lateral inclination angle and/or a lateral offset (perpendicular to the distance d) of the light fixture 110 with respect to the center of the surface 142, may be measured or entered manually, and stored, e.g. in the light fixture database 202 and/or the exposition area database 204 (see e.g. the light fixtures 110₃ and 110₄ in Figure 12). The information on the spatial positioning of light fixture 110 with respect to the surface 142 of the object 140 is thus able to identify a position and orientation of the object surface 142 with respect to the light fixture 110, thereby permitting the calculation, via a mathematical projection or interpolation, of the beam path.

As mentioned before, in various embodiments, also the radiation characteristic of the light fixture 110 may be stored, e.g. in the light fixture database 202. This radiation pattern may be measured once, for example after production of the light fixture 110 or once for all luminaires of this type. The radiation characteristic describes a direction-dependent output of light or luminous intensity of a luminaire for a given light intensity emitted by the light fixture. For example, in various embodiments, such a radiation pattern may be identified via a two-dimensional intensity distribution in a reference plane perpendicular to the optical axis 502 at a certain distance from the light fixture 110. Generally, also a plurality of radiation patterns may be stored, e.g. when the light fixture 110 supports a plurality of operating conditions, e.g. when various sub-sets of light sources 117 in the light fixture 110 may be controlled independently and/or the optics 115 are controllable.

Accordingly, the radiation pattern of the light fixture 110 may be projected onto the surface 142 as a function of the spatial positioning data. For example, from the distance of the reference plane from the light fixture and the distance of a given point in the reference plane from the point of intersection of the reference plane with the optical axis 502, the data processing unit 133 may determine an angle. This angle can then be used to project the radiation characteristic at the given point onto a point of the surface 142 of the object 140.

Moreover, the data processing unit 133 may calculate a respective distance of the point of the surface 142 from the light fixture 110 (as a function of the spatial positioning data) and determine an expected intensity by taking into account that the intensity decreases with increasing distance due to the beam widening.

In various embodiments, the radiation characteristic may also include several intensity distributions in multiple planes at different distances from the light fixture 110, so that the data processing unit 133 may calculate the expected intensity distribution on the surface 142 of object 140 via interpolation or extrapolation between the planes. For this purpose, the surface 142 can be divided into a grid or a matrix, and the local intensity is calculated for each grid point depending on the position in the relevant plane of the radiation characteristic and the distance from the light fixture 110; two planes of the radiation characteristic (one at a distance "in front of" the considered position on object 140, another at a distance "behind" it) can also be used, with interpolation between the two. The grid points may be arranged at a distance of a few millimeters up to a few centimeters.
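By way of illustration, the following minimal sketch (in Python) shows how such a grid-based projection might be implemented; the function and variable names, the assumption that the optical axis 502 points along −z, and the simple linear interpolation between two reference planes are illustrative choices, not the patented implementation:

```python
import numpy as np

def local_intensity(pattern_near, pattern_far, d_near, d_far,
                    fixture_pos, surface_points, measured_scale):
    """Estimate a local intensity for each grid point on the surface 142.

    pattern_near/pattern_far: callables mapping an off-axis angle (rad)
        to a relative intensity in reference planes at distances
        d_near/d_far from the light fixture 110.
    fixture_pos: 3D position of the light fixture (the optical axis 502
        is assumed to point along -z for simplicity).
    surface_points: (N, 3) array of grid points on the surface 142.
    measured_scale: factor derived from the measured emission level
        (sensor 120_1 or 116_k).
    """
    v = surface_points - fixture_pos            # beam vectors to the grid
    dist = np.linalg.norm(v, axis=1)            # distance fixture -> point
    axis = np.array([0.0, 0.0, -1.0])           # assumed optical axis 502
    cos_a = (v @ axis) / dist                   # off-axis angle per point
    angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
    # linear interpolation of the relative pattern between the two planes
    w = np.clip((dist - d_near) / (d_far - d_near), 0.0, 1.0)
    rel = (1 - w) * pattern_near(angle) + w * pattern_far(angle)
    # beam widening: intensity decreases with the square of the distance
    return measured_scale * rel / dist**2

# toy usage: a Gaussian-like beam and a 2 x 2 grid on a wall 3 m away
pattern = lambda a: np.exp(-(a / 0.3)**2)
grid = np.array([[x, y, -3.0] for x in (0.0, 0.5) for y in (0.0, 0.5)])
print(local_intensity(pattern, pattern, 1.0, 5.0,
                      np.zeros(3), grid, measured_scale=100.0))
```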

Accordingly, once having obtained the expected illumination pattern at the object 140, the data processing unit 133 may determine the actual illumination pattern as a function of the light intensity value actually measured/calculated as a function of the data provided by the sensor 120₁ or 116ₖ. For example, the distance of the sensor 120₁ from the light fixture 110 and the optical axis 502 may be used to determine a reference point in the radiation pattern, and the value of the reference point and the measured/calculated light intensity value may be used to determine a multiplication factor for the intensity values determined for the surface 142. Generally, the multiplication factor may also be stored, e.g. in the light fixture database 202 and/or sensor database 218, and the value may be read by the data processing unit.

Accordingly, in various embodiments, the system shown in Figure 15 takes into account the information about the spatial positioning between the light fixture 110 and the surface 142 of the object 140, in particular how the optical axis 502, i.e. the main beam direction, of the light fixture 110 is aligned with respect to the surface 142, since the radiation characteristic itself is related to that optical axis 502 or main beam direction. As described in the foregoing, preferably, the main beam direction of the light fixtures is centered with respect to the center of the surface 142 of the object 140.

Thus, on the basis of the information stored, e.g. in the databases 202, 204 and 218, and the measured light intensity value(s), the data processing unit 133 (possibly in collaboration with the data processing unit 113) is able to calculate a local intensity for any position on the surface 142 of the object 140, e.g. in order to obtain a distribution as shown in Figure 15. Similarly, a plurality of distributions for different wavelengths or ranges of wavelengths may also be determined.

As mentioned before, these distributions of intensity values may be used by the control system 130 in order to verify whether the global and/or local intensity/illumination values exceed one or more maximum values and/or in order to regulate the intensity values to requested values. For example, as described in the foregoing, the local intensity at the surface 142 should not exceed a certain maximum limit in order to prevent damage, e.g., to the paint application. Specifically, as described in the foregoing, a given surface may not have a single global maximum irradiation value (or plural maximum values for different wavelengths), but each of a plurality of individual surface areas (e.g. the locally different pigments or materials of the object) may have a respective maximum value (or plural maximum values for different wavelengths). For example, inorganic pigments such as zinc white or ultramarine are generally more light-resistant than organic dyes.

In various embodiments, the artwork database 206 may thus have stored for a given artwork 140 data identifying these maximum values, such as sensitivity values associated with individual positions on the surface 142 of the object 140 which may be determined e.g. as a function of paint application, pigments, binders and carriers etc.

In various embodiments, the data processing unit 133 thus calculates the local intensity (or local intensities for respective wavelengths) for at least one of the numerous positions as described above and then compares this with a respective maximum value (or maximum values for respective wavelengths), e.g. specified via the sensitivity information for this position. In various embodiments, the data processing unit 133 may generate a signal, such as a warning signal, as a function of the comparison. As mentioned before, the sensitivity information includes an assignment of the positions on the surface 142 of the object 140 to maximum values which can be defined, for example, by the blue scale (ISO 1-8) or a light sensitivity category classification according to Colby, Karen M.: "A Suggested Exhibition/Exposure Policy for Works of Art on Paper", in: The Lighting Resource - Montreal Museum of Fine Arts (accessed on 22.1.2019) at http://www.lightresource.com/research-papers/A-Suggested-Exhibition-Exposure-Policy-for-Works-of-Art-on-Paper.pdf.

For example, in various embodiments, sensitivity data based on the Colby light sensitivity categories are used, wherein the maximum values may refer both to instantaneous limit values (short term) and to cumulative/average limit values (long term); example category assignments and limit values are given below.

For example, category 1 (Colby) may include: most organic dyes, magenta, verdigris (copper acetate), chrome yellow, chrome red, smalt, pastel, clay papers, older color photographs, Polaroids, felt-tip pen, most natural textile colors, feathers, colored printing inks, turmeric yellow, etc. Category 2 (Colby) may include: manganese blue, Prussian blue, zinc yellow, cadmium yellow, cinnabar, carmine red, groundwood paper and board, new photo prints, Kodachrome slides, Indian yellow, etc.

Category 3 (Colby) may include: ivory black, titanium white (rutile), zinc white, cobalt violet, ultramarine, cobalt blue, chrome green, malachite, earth tones, Naples yellow, lead-tin yellow, orpiment, good rag paper, carbon printing inks, black-and-white photos on gelatin, indigo on wool, plastics (PE), etc.

The above compilation of categories according to Colby (1991) and Thomson, Gary: "The Museum Environment", Butterworth, London 1994, is based on a website by C. Waller: http://www.cwaller.de/deutsch.htm?lichtschaeden.htm~information (retrieved on 23.1.2019). As mentioned before, the above categories may have associated short-term limit values which should not be exceeded at any given location. For watercolors or oil paintings (category 1), for example, the value of 50 lux should not be exceeded. For extremely light-resistant colors such as zinc white (category 3), on the other hand, 200 lux can be set as a limit value. Furthermore, long-term limit values - considered e.g. over the year - may also be given in spatial resolution. For sensitive objects based on textiles or water paints (category 1), for example, 15,000 lux hours per year can be specified as a limit value. For less sensitive objects based, for example, on oil paints, a much higher limit value can be specified, e.g. 150,000 lux hours per year.

For the comparison with the long-term values, the data processing unit 133 may thus, e.g., integrate the local intensity values recorded since the beginning of the year, in order to have the same standard of comparison. Alternatively, the data processing unit 133 may determine the opening hours of the exposition area, which e.g. may be stored in the exposition area database 204, and multiply the calculated intensity (or an average value thereof) by this time. In various embodiments, additionally or alternatively, a long-term limit may also refer to the limit of a "first bleaching effect".
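As a concrete illustration of this long-term accounting, the following sketch integrates hourly lux samples and compares the result with an annual lux-hour limit; the sampling interval, the helper names and the assignment of the quoted example limits to category numbers are assumptions made only for this example:

```python
# category -> lux*h per year; 15,000 and 150,000 are the example values
# quoted above (the mapping to category numbers is illustrative)
ANNUAL_LIMITS_LUX_H = {1: 15_000, 2: 150_000}

def exceeds_long_term(hourly_lux_samples, category):
    """Integrate the local intensity values recorded since the beginning
    of the year (one sample per opening hour) and compare with the limit."""
    accumulated_lux_h = sum(hourly_lux_samples)   # lux * 1 h per sample
    return accumulated_lux_h > ANNUAL_LIMITS_LUX_H[category]

def exceeds_long_term_estimate(avg_lux, opening_hours, category):
    """Alternative estimate: average intensity times opening hours."""
    return avg_lux * opening_hours > ANNUAL_LIMITS_LUX_H[category]

print(exceeds_long_term([50.0] * 400, category=1))          # True  (20,000 lux*h)
print(exceeds_long_term_estimate(50.0, 200.0, category=1))  # False (10,000 lux*h)
```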

If the data processing unit 133 determines that a given limit value is exceeded, or that there is a risk that it will be exceeded in the future (for this purpose, the data processing unit 133 may estimate the future (global or local) intensity values as a function of the previous (global or local) intensity values), the data processing unit 133 may send one or more control commands to the lighting fixture(s) 110 in order to reduce the power provided to the light sources 117. In extreme cases, the data processing unit 133 may send one or more control commands to the lighting fixture(s) 110 in order to switch off one or more light sources 117/light fixtures 110. Specifically, in various embodiments, the control system 130 may perform these operations already when a single limit value associated with a very sensitive partial area of the surface 142 of object 140 is exceeded.

As mentioned before, the maximum/sensitivity values may also be determined automatically. For example, as schematically shown in Figure 15, such maximum/sensitivity values may be determined via a further sensor 120₄, e.g. in the form of a camera, which may also be part of the light fixture 110. For example, the camera may have an associated data processing unit 123, which processes the image data of the surface 142 of the object 140 and assigns individual positions on the surface 142 to a respective category, or directly to a respective limit value, via image data processing.

The respective data may then be provided to the control system 130, e.g. by storing respective data to the artwork database 206. Specifically, as described in the foregoing, the artwork database 206 may have stored characteristics data, e.g. the maximum/sensitivity values, for a plurality of artworks 140. For example, in various embodiments, such an artwork database 206 may be stored in a cloud 506, but the database 206 may also be stored, e.g. , in the control system 130. For example, the use of a remote artwork database 206 is particularly useful when an artwork 140 may be moved from one museum to another.

Thus, by accessing the artwork database 206, the control system 130 may determine the previously mentioned maximum values. As already described in the foregoing, the control system 130 may obtain the data for a given artwork 140, e.g. via an image recognition operation or by using a unique code. For example, in various embodiments, the control system may comprise its own camera 508. This can be used to perform the image recognition operation or to read an identifier 510 attached to the object 140, such as a QR code. Alternatively or additionally, the control system 130 may comprise other reader devices, such as an NFC reader with which it reads an NFC tag attached to the object 140, which contains a corresponding unique identification.

In various embodiments, in line with the previous description, the control system 130 may also take into account the intensity of background illumination. Specifically, when using light sensors 120 configured to determine the light emitted by the light fixture 110 (in particular near the light fixture 110), such a light sensor 120 is unable to measure the background illumination, which also illuminates the object 140. Accordingly, when taken alone, such a light sensor 120 may only be used to monitor artworks 140 not exposed to significant and/or variable background illumination. Alternatively, the control system 130 may also be connected to a light sensor configured to monitor the background illumination generated by other artificial or natural light sources. For example, such an additional light sensor may be implemented with a light sensor positioned near the artwork 140, or with other light sensors 120 configured to measure the background illumination in the exposition area 160, preferably near the artwork 140. For example, such a light sensor 120 may be a camera, e.g. the camera 508, or a light sensor/camera integrated in or positioned near the light fixture (see also Figure 14).

Thus, the data processing unit 133 may be configured to calculate actual (global and/or local) illumination values by summing the calculated intensity at the artwork 140 due to light generated by the light fixture (as e.g. measured by the sensor 120₁) and the intensity of the background light. For example, if the ambient light alone would cause the limit values to be exceeded, the data processing unit 133 may generate a warning signal. This may apply in particular to UV radiation, as this causes particularly severe damage to inks and materials; accordingly, the background light sensor may also provide plural intensity values for different wavelengths or wavelength ranges.

Figure 18 summarizes the operation of various embodiments of the data processing unit 133 shown in Figure 15.

In a step 520 the light sensor 120₁ or the current sensor 116ₖ provides data indicative of the intensity of the light emitted by the light sources 117 (or respective intensity values for a plurality of wavelengths/wavelength ranges) to the data processing unit 133, i.e. the data processing unit 133 receives the data indicative of the intensity of the light emitted by the light sources 117.

In a step 522, the data processing unit 133 obtains, e.g. via the light fixture database 202 and/or the exposition area database 204, the data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of object 140.

In a step 524, the data processing unit 133 obtains, e.g. via the light fixture database 202, data identifying the spatial radiation characteristics of the light fixture 110.

In a step 526, the data processing unit 133 determines a global and/or a plurality of local intensity values (or respective intensity values for a plurality of wavelengths/wavelength ranges). As mentioned before, the local intensity values may be calculated for a plurality of positions on the surface 142 of the object 140 as a function of:

- the data identifying the spatial radiation characteristic;

- the data identifying the spatial positioning; and

- the measured intensity value (or measured intensity values for a plurality of wavelengths/wavelength ranges).

Optionally, the step 526 may also determine the global and/or local intensity value as a function of data identifying an intensity of background illumination.

In a step 528, the data processing unit 133 obtains, e.g. via the artwork database 206, data identifying the global and/or local sensitivity of the surface 142 of the object 140 to be irradiated (or global and/or local sensitivity values for a plurality of wavelengths/wavelength ranges). These data may include or may be used to calculate global and/or local maximum values (or global and/or local maximum values for a plurality of wavelengths/wavelength ranges), which may relate to short-term/instantaneous values and/or long-term values.

In a step 530, the data processing unit 133 compares the calculated global and/or local intensity values with the respective global and/or local maximum values. In various embodiments, the data processing unit 133 may generate one or more control commands for the light fixture(s) 110 in response to this comparison, for example in order to adjust the power supply, as sketched below.
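A minimal sketch of the comparison of step 530 follows; the rule that the most restrictive position dictates a single dimming factor for the whole fixture is an illustrative assumption (the text above only requires a comparison and a resulting control command or warning):

```python
def dimming_command(local_intensities, local_maxima):
    """Compare calculated local intensities (step 526) with the local
    maximum values (step 528) and return a relative power factor <= 1.0
    such that no position exceeds its maximum; 1.0 keeps the setting."""
    ratios = [m / i for i, m in zip(local_intensities, local_maxima) if i > 0]
    return min(min(ratios, default=1.0), 1.0)

# a single very sensitive spot (limit 50 lux, currently 80 lux) dominates
print(dimming_command([80.0, 120.0], [50.0, 200.0]))   # -> 0.625
```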

As mentioned before, the operation of the data processing unit 133 may also be implemented in a data processing unit 123 of a sensor 120, e.g. the sensor 120₁, or a data processing unit 113 of the light fixture 110, or in a distributed manner wherein the above operations are executed by at least two of such data processing units 113, 123 and 133 (see also the description of Figures 2, 3 and 4).

Accordingly, the lighting system 100 described with respect to Figure 15 is able to monitor the irradiation of an object 140 with light from a light fixture 110.

The system comprises a light fixture 110 (or a plurality of light fixtures) with one or more light sources 117, which together emit light with a given spatial radiation characteristic. For example, the light fixture 110 may be a spotlight comprising one or more LEDs (light-emitting diodes) or similar. The light fixture 110 is preferably a physical unit having a housing in which the light sources 117 and other functional units are accommodated, in particular an optical system 115 for expanding or focusing the light, e.g. lenses and/or reflectors arranged in a corresponding manner. The system also comprises a data processing unit operatively connected via a suitable communication interface to the light fixture(s) 110, or possibly integrated in the light fixture. For example, in various embodiments, the light fixture 110 comprises a data processing unit 113 configured to exchange data with a data processing unit 133 of a control system 130. The connection can be a physical data line (including cable) or a wireless connection, or a combination of both. For example, the light fixture 110 and the data processing unit may be connected via the Internet, for example via an on-site router, switch and access point. A bridge can also be used to connect a building management system, such as DALI, KNX or ZigBee, to the light fixture 110. In various embodiments, the data processing unit may also form part of a smartphone, which may correspond to or form part of the control system 130. Instead of a smartphone, another mobile (hand-held) control and display unit may be used, which enables a user of the system to monitor and control the system by showing respective data on a display.

In various embodiments, the data processing unit has an associated first memory, e.g. an exposition area database 204, connected to the data processing unit, in which information about the spatial positioning of the light fixture 110 with respect to the object 140 is stored.

In various embodiments, this information on the spatial positioning may include data concerning a (horizontal) distance between the light source(s) and the surface 142 of the object 140 or another reference point, and an angle of inclination at which the light fixture 110 is positioned with respect to a surface normal or to a plane of the surface 142. Alternatively, the information may merely include coordinates of the light fixture 110 and the surface 142 of the object 140 in a reference coordinate system. In any case, the information contains data that allow the data processing unit to make a geometrical calculation of the radiation from the light fixture 110 to positions on the surface 142 of the object 140. For this purpose, the system, e.g. the light fixture 110, may comprise at least one of: a distance sensor, preferably an ultrasonic sensor, configured to measure a distance between the light fixture 110 and the surface 142 or a reference point thereto and to transmit the measurement result to the data processing unit; and an inclination angle sensor, preferably provided in the light fixture 110 or on the surface 142 of the object 140, and configured to measure an inclination angle at which the light fixture 110 is positioned with respect to a surface normal or to a plane of the surface 142, and to transmit the measurement result to the data processing unit.

In various embodiments, the data processing unit has an associated second memory, e.g. a light fixture database 202, in which the above-mentioned spatial radiation characteristics of the light source(s) 117 or the light fixture 110 as a whole (including optics) are stored. The radiation characteristic refers to a directional output of the light of a light fixture 110 with respect to a value determined for a main direction along an optical axis, whereby the radiation characteristic can be influenced by apertures, lenses, louvres or reflectors of the light fixture 110. The beam pattern can be symmetrical (typically for spotlights or downlights) or asymmetrical (typically for floodlights).

In various embodiments, the radiation characteristics refer to an optical axis of the light fixture 110. For example, an intensity distribution within a plane perpendicular to this optical axis may be given, with the intersection of the axis with the plane representing the reference point. It should be noted that the orientation of the optical (or geometric) axis of the light fixture 110 with respect to the surface 142 of the object 140 may also be part of the data identifying the spatial positioning of the light fixture 110 with respect to the surface 142 of the object 140 and may be stored in the first memory. Alternatively, intensities can be stored as a function of the angle relative to the optical axis. For example, in various embodiments, the spatial radiation characteristic of the light source(s) includes data with a two-dimensional distribution of intensities on one surface, or on several surfaces at different distances from the light source(s), perpendicular to an optical axis of the light emitted by the light sources of the luminaire. In this case, the data processing unit may calculate the local intensity at a given position on the surface 142 of the object 140 by means of mathematical projection or interpolation or extrapolation from the one surface or between the several surfaces.

In various embodiments, the data processing unit is configured to calculate and output, for a plurality of positions on the surface 142 of the object 140, a local intensity of the light incident at the respective position, from the information on the luminance, from the spatial radiation characteristic of the light source(s) and from the information on the spatial positioning of the luminaire relative to the surface of the object. In various embodiments, such a calculation is essentially a geometric calculation, whereby the intensity determined at each position on the surface 142 takes into account the absolute distance from the light fixture 110, the inclination of the surface 142 with respect to the direction towards the light fixture 110 and the direction-dependent attenuation of the radiation relative to the aligned optical axis of the light fixture 110. A further factor is the light intensity transmitted by the light fixture.

Consequently, in various embodiments, the data processing unit is able to monitor the irradiation of the surface 142 of the object 140 with local resolution.

In various embodiments, the local irradiation determined in this way may be compared, for example, with local light sensitivities. For example, smaller, but particularly light-sensitive surface areas are thus taken into account much more appropriately in the monitoring process. In various embodiments, the data processing unit may modify the strength, orientation and, if necessary, also the wavelength range of the light source(s) in response to this comparison.

For this purpose, in various embodiments, the system comprises a (photographic) camera with which the surface 142 of the object 140 can be scanned in order to obtain color and/or brightness values for positions on the surface 142, wherein the data processing unit is adapted to receive the position-dependent color and/or brightness values from the camera and to calculate a limit value for each of the positions on the basis of a fixed predetermined relationship between the color and/or brightness values and a sensitivity. This allows a fast and efficient analysis of the object surface 142 with the aim of a locally resolved limit value determination for the above-mentioned sensitivity information. For example, this camera may be a camera of the control system 130, e.g. in the form of a mobile unit wirelessly connected to the light fixture 110.

Alternatively, the control system, such as a mobile unit wirelessly connected to the light fixture 110, may comprise a camera or a device for near-field communication, which can be used to read an identifier attached to the object 140, allowing access to the sensitivity information for the object 140 to be irradiated, stored in a fourth memory, e.g. an artwork database 206. The identifier can be, for example, a QR code or a correspondingly programmed NFC tag, which can be read with a smartphone, for example, and is individually assigned to the object 140. This fourth memory is preferably set up in a generally accessible cloud. If, for example, the object 140 is moved to another exhibition or museum, the new user can access the sensitivity information of the surface that has already been "mapped" earlier, without having to recreate it.

In various embodiments, the system may also comprise a timing device 504, e.g. a clock or timer, which is configured to output an operating time indicating how long the light source(s) 117 have been operated since their commissioning for irradiating the object 140, and a current and/or voltage measuring device 116ₖ, which is designed to measure a current and/or voltage with which the light source(s) 117 is/are operated. In this case, a function or table may be stored in a third memory, such as a sensor database 218, with which values of an illuminance are assigned to a combination of values of a current and/or a voltage and/or an operating time of the light sources. For example, in this way, the age-related decrease of the radiation power may be mapped. Such an indirect determination of the intensity of light emitted by the light fixture 110 is usually more cost-efficient than using a light sensor 120₁.
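The following sketch illustrates such a stored function or table; the table values, the bilinear interpolation over drive current and operating time, and the variable names are assumptions made only for this example:

```python
import numpy as np

CURRENTS_MA = np.array([100.0, 350.0, 700.0])
HOURS = np.array([0.0, 25_000.0, 50_000.0])
# illuminance (lux at the reference point) per current (rows) and age (cols);
# the decreasing columns map the age-related loss of radiant power
LUX_TABLE = np.array([[120.0, 110.0,  95.0],
                      [420.0, 380.0, 330.0],
                      [800.0, 720.0, 620.0]])

def estimate_lux(current_ma, operating_hours):
    """Bilinear interpolation in the (current, age) table, estimating the
    emitted illuminance without a light sensor 120_1."""
    per_age = [np.interp(current_ma, CURRENTS_MA, LUX_TABLE[:, j])
               for j in range(len(HOURS))]
    return float(np.interp(operating_hours, HOURS, per_age))

print(estimate_lux(350.0, 12_500.0))   # -> 400.0
```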

For example, when using as control system 130 a mobile unit wirelessly connected to the light fixture, such as a smartphone in which the data processing unit is installed, the first, second and/or third memory may be installed in the smartphone or in a cloud accessible by the smartphone. Due to the fact that the data processing unit may be remote, such as a remote server, it is possible to make the underlying calculation for monitoring the irradiation of an object 140 available to third parties via suitable interfaces (Software as a Service (SaaS) or Platform as a Service (PaaS)).

Generally, one or more of the following examples of information may also be used when performing the previously described illumination simulation:

- the spatial positioning data of the light fixture 110 with respect to the surface 142 of the object 140 and/or the spatial radiation characteristic of the one or more light sources 117 or the light fixture 110, which may be used during the simulation in order to calculate/estimate the illumination of an artwork 140, and

- the sensitivity data, e.g. the target and/or maximum illumination values, of the artworks 140.

Similarly, the information used to determine the intensity of light emitted by the light fixture 110 as a function of the measured light intensity (sensor 120₁), or of the power supply (sensor 116ₖ), may be used when simulating the operation of the sensors 120.

Finally, the described control operations for controlling the operation of the light fixture 110 as a function of the illumination of the artwork 140 may be used with other light sensors 120 and/or control systems 130 described herein. For example, the solution for obtaining the sensitivity data of an artwork 140, which e.g. are used to determine local and/or global target and/or maximum illumination values, may be used to provide these data, e.g. by storing these data into the artwork database 206, to any control system 130 using such data.

Accordingly, various embodiments of the present disclosure relate to a lighting system configured to monitor the irradiation of an object with light generated by a light fixture. In various embodiments, the lighting system comprises the light fixture comprising one or more light sources, which together are configured to emit light with a spatial radiation characteristic, a data processing unit connected to the light fixture and configured to obtain information on an intensity of the light emitted by the light sources, a first memory connected to the data processing unit, in which information about the spatial positioning of the light fixture with respect to a surface of the object is stored, and a second memory connected to the data processing unit, in which information about the spatial radiation characteristic of the one or more light sources or the light fixture is stored. In various embodiments, the data processing unit is configured to calculate and output a local intensity of the light incident at the respective position for a plurality of positions on the surface of the object as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture. Possible embodiments of this solution are detailed at the following point "Example 5".

Second embodiment of light sensor

As described in the foregoing, art lighting should assure that an artwork or object 140 is illuminated with a given target and/or maximum illumination, e.g. in order to avoid or reduce the risk that the artwork is damaged. In this respect, the light 500 emitted by a light fixture 110 often comprises wavelengths ranging from UV to IR. Damage to an artwork 140 may occur if the global and/or local light intensity, or the light intensity at a certain wavelength or wavelength range, is too high, thereby inducing, e.g., damage to the pigments or other materials. Thus, it is often not sufficient to measure, calculate or estimate the global light intensity of the light received at the artwork 140. Moreover, as described in the foregoing, it is usually not possible to directly measure the light received at the artwork 140, because in this case the light sensor 120 would have to be placed at the position of the artwork 140. Accordingly, such direct measurements may usually only be executed during an installation phase, but not in real time/continuously. Accordingly, indirect measurements are preferable, wherein the (global and/or local) illumination of an artwork 140 is calculated or estimated (possibly as a single intensity value or a plurality of intensity values for different wavelengths/wavelength ranges) based on other measurements.

In this respect, as described in the foregoing, one of the simplest solutions consists in installing a light sensor 120 next to the object 140. Such a sensor would thus not directly measure the illumination of the object 140, but would still permit calculating, or at least estimating, the illumination of the artwork 140 as a function of the characteristics of the light fixture 110 and the geometrical positions of the light fixture 110, the artwork 140 and the light sensor 120. However, such a light sensor 120 installed near the artwork 140 is often not a feasible solution, e.g. because a power supply has to be provided to the light sensor 120 and some kind of data connection is required in order to transmit the measured data to the control system 130.

In the previous first embodiment of a light sensor 120, the global and/or local illumination of an artwork is calculated as a function of:

- the intensity (possibly for plural wavelengths/wavelength ranges) of the light 500 emitted by the light fixture 110 as measured directly within the light fixture 110 (or at least in proximity thereof),

- the spatial radiation characteristics of the light fixture 110, and

- the geometrical position of the light fixture 110 and the artwork 140.

Conversely, as shown in Figure 19, in the second embodiment, the illumination of an artwork 140 is calculated/estimated as a function of the data provided by a light sensor 120 configured to measure the light 600 reflected by the artwork 140 itself. Generally, the light sensor 120 may provide global and/or local (i.e. space-resolved across the object) measurement values for a single or plural wavelengths/wavelength ranges. Accordingly, the light sensor 120 may be a photometric device, such as a photodiode (e.g. for an overall/global measurement) or a (2D) camera, such as a CCD or CMOS sensor, which measures local intensity values. For example, such a camera 120 may be:

- a monochromatic, e.g. grayscale, camera providing only intensity values for a single wavelength range;

- a full RGB camera (or other color patterns, such as CMYK), wherein each pixel has a plurality of sensor elements/light sensors associated with respective wavelengths/wavelength ranges (e.g. due to given color filters installed before each sensor element), thereby providing measured color values indicative of the intensity of a plurality of wavelengths/wavelength ranges for each pixel;

- a "reduced" RGB camera (or other color patterns, such as CMYK), wherein each pixel has a single sensor element/light sensor associated with a respective color/color range (e.g. due to a given color filter installed before the sensor element), thereby providing a single measured color value indicative of the intensity of a respective wavelength/wavelength range for each pixel; in this case, the sensor elements/light sensors are usually arranged according to a color pattern, such as a Bayer pattern, which still permits calculating/estimating, e.g. via interpolation, color values indicative of the intensity of a plurality of wavelengths/wavelength ranges for each pixel.

In various embodiments, the light sensor 120 may be installed within, fixed to or installed in the proximity of the light fixture 110 used to illuminate the artwork 140 to be monitored, because in this way the light sensor 120 may be supplied by the power supply of the light fixture 110, e.g. via the electronic converter 116. Moreover, in this way, the light sensor 120 may also use the communication interface of the light fixture 110 in order to exchange data with the control system 130.

Thus, when using a camera 120, i.e. a matrix of sensor elements/light sensors, these sensor elements/light sensors usually have associated respective color filters, e.g. arranged according to a Bayer pattern. In various embodiments, the spectral characteristics of the filters and the spectral characteristics of the light 500 emitted by the light fixture 110 are matched. For example, when using an RGB camera 120, i.e. a camera having a plurality of filters for a red wavelength range, a green wavelength range and a blue wavelength range, respectively, the light fixture 110 should emit light with peak values in these red, green and blue wavelength ranges or, vice versa, the (e.g. red, green and blue) wavelength ranges of the color filters of the camera 120 should be selected to correspond to the peak emission values of the light fixture 110. For example, for this purpose, the light fixture 110 may comprise light sources 117, such as LEDs, emitting light with peak values corresponding to the wavelength ranges of the camera 120, such as red, green and blue LEDs, preferably having peak emission values corresponding to the wavelength ranges of the red, green and blue filters. In this case, the measurement data provided by the camera 120 may thus be correlated directly to the light emitted by a given subset of light sources 117 within the light fixture 110. For example, when detecting that the measured red light exceeds a maximum threshold value, only the power supply of the light sources 117 emitting red light, such as red LEDs, may be adapted, thereby simplifying the control operation within the control system 130. Conversely, if one light sensor of the camera 120 measures the intensity of two or more light sources 117 (e.g. LEDs), i.e. the transmission window of the respective filter overlaps with the intensity peaks of two or more light sources 117, then the respective relative intensities of these light sources have to be taken into consideration when setting the intensity of the light fixture. A similar issue also occurs when using a monochromatic camera, or when using only a global light sensor (such as a diode providing a single brightness value for a given wavelength range, or a plurality of diodes with respective filters, each providing a global brightness value for a respective different wavelength range), because, also in this case, the spectral characteristics of the light sources 117 may or may not be matched to the spectral characteristics of the sensor elements.

Moreover, when the spectral characteristics of the light sensor 120 are not matched to the spectral characteristics of the light fixture 110, these spectral characteristics may be taken into account in order to calculate/estimate the actual illumination at the light sensor 120, e.g. by scaling the measured values as a function of the spectral characteristics of the light fixture 110 and the spectral characteristics of the light sensor 120.
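For example, such a scaling may be implemented as a ratio of spectral overlap integrals, as in the following sketch; the coarsely sampled example spectra and the simple rectangular integration are illustrative assumptions, not data from the databases 202/218:

```python
import numpy as np

wavelengths = np.arange(400, 701, 10)                   # nm, 10 nm bins
emission = np.exp(-((wavelengths - 630) / 20.0)**2)     # example red LED
sensor_resp = np.exp(-((wavelengths - 610) / 30.0)**2)  # example "red" filter

def actual_intensity(measured_value):
    """Rescale the sensor reading to the intensity actually emitted by
    compensating the spectral mismatch between filter and light source."""
    seen = float(np.sum(emission * sensor_resp)) * 10.0  # what the sensor sees
    total = float(np.sum(emission)) * 10.0               # what is emitted
    return measured_value * total / seen

print(round(actual_intensity(1.0), 3))   # scale factor applied to a reading
```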

Accordingly, in various embodiments, the spectral characteristics of the light fixture 110 and/or of subsets of light sources 117 of the light fixture 110 (such as red, green and blue LEDs) may be stored, e.g. in the light fixture database 202, and the spectral characteristics of the light sensor 120 may be stored, e.g. in the sensor database 218. Next, the control system 130 (or in general any data processing unit of the lighting system 100) may use these data in order to:

- estimate the actual illumination at the sensor 120; and/or

- generate the control commands for the light fixture 110 in order to regulate the spectral characteristics of the light emitted by the light fixture 110 as a function of the measured data provided by the light sensor 120.

Generally, this issue occurs each time a light sensor 120 is used to measure a single intensity value or a plurality of intensity values for different wavelengths/wavelength ranges. Accordingly, the data identifying the spectral characteristics of subsets of light sources 117 of the light fixture 110 and the data identifying the spectral characteristics of the light sensor 120 may also be used in other embodiments described herein wherein the control system 130 uses data provided by a light sensor 120, e.g. in order to verify and/or control the operation of the light fixtures 110.

Generally, as mentioned before, these spectral characteristics data are helpful when the spectral characteristics of the light sensor 120 and the light fixture 110 are not matched. However, these data are purely optional, because instead of storing these spectral characteristics explicitly, they may be identified implicitly via sensitivity data of the light sensor 120 for a given light fixture 110, e.g. in the form of a mathematical function or table, which associates given measured values (for a single or plural wavelengths/wavelength ranges) with respective intensity values for a plurality of wavelengths/wavelength ranges. For example, as mentioned before, when measuring the intensity of visible light (or a given wavelength range thereof), the spectral characteristics of the light fixture 110 may be used to also estimate/calculate the intensities in the UV and/or IR range.

Accordingly, when referring to intensity values for a plurality of wavelengths or wavelength ranges, these intensity values may refer to the actually measured values or to intensity values (for the originally measured or different wavelengths or wavelength ranges) calculated/determined as a function of the above-mentioned spectral characteristics. Generally, the respective calculation may be implemented already in the sensor 120, e.g. via the data processing unit 123, or the control system 130, e.g. via the data processing unit 133.

As mentioned before, in the second embodiment, the light sensor 120 is configured to measure characteristics of the light 600 reflected by the artwork 140. Usually, the intensity of the reflected light 600 does not provide direct information about the intensity of the light at the object 140. Specifically, the intensity of the reflected light 600 will, of course, depend on the intensity of the light 500 which illuminates the object 140, but it will also (and quite often significantly) depend on the reflectivity of the object 140, which in turn depends, e.g. , on the surface structure of the object 140 itself, e.g. if it is smooth, rough, has a high or low reflectivity, and which colors cover the object 140. Accordingly, the intensity of the reflected light 600 is just indicative of the actual intensity with which an object 140 is illuminated. For example, the reflectivity of an object 140 will result in a variation of the spectral characteristics of the light 600 measured by the light sensor 120.

In various embodiments, a calibration step is used to reference the (global and/or local) intensity of reflected light 600 as measured by the light sensor 120 (possibly for a plurality of wavelengths or wavelength ranges) to the actual (global and/or local) intensity of light (possibly for a plurality of wavelengths or wavelength ranges) at the object 140.

Figure 20 shows an embodiment of the calibration phase/step. In the embodiment considered, the artwork 140 is illuminated at a step 610 via the light 500 generated by at least one light fixture 110.

At a following step, the actual illumination at the object 140 is determined. In various embodiments, the global and/or a plurality of local light intensities for at least one wavelength or wavelength range are determined at a step 612 by directly measuring the illumination at the artwork 140 via a sensor positioned at the position of the artwork 140.

Alternatively, in various embodiments, the global and/or a plurality of local light intensities for at least one wavelength or wavelength range are determined at a step 614 by calculating/estimating these data as a function of other data, e.g. by:

- calculating the illumination at the artwork 140 as a function of the measured data provided by a light sensor positioned in proximity of the artwork; or

- calculating the illumination at the artwork 140 as a function of the measured data provided by a light sensor configured to measure the light 500 emitted by the light fixture 110 (as described with respect to the first embodiment of a light sensor).

Generally, the respective data may be inserted manually or automatically by connecting the respective light sensor to the control system 130.

In various embodiments, the global and/or a plurality of local light intensity values of the reflected light 600 are also measured, at a step 616, by the sensor 120 for at least one wavelength or wavelength range.

For example, when using a light sensor 120 in proximity of the light fixture 110, the measured intensity of the light fixture 110, the radiation characteristics of the light fixture 110 and the geometrical positioning (e.g. distance and/or angle) between the light fixture 110 and the object 140 may be used to calculate the global and/or local illumination of the object 140. Accordingly, by using a 2D radiation pattern, as measured e.g. via a camera, and using a camera 120 for measuring the light 600 reflected by the object 140, local intensity values of the measured reflected illumination may be associated with local intensity values of the calculated illumination of the object 140, possibly also for a plurality of wavelengths or wavelength ranges.

In various embodiments, a plurality of measurements are performed for different settings of the lighting fixture 110. For example, for this purpose, the setting of the light fixture(s) 110 may be modified at a step 618 and the steps 610-616 may be repeated.

This permits better mapping of the relationship between the measured reflected light 600 and the actual illumination of the artwork 140, e.g. for different brightness and/or color values. Moreover, as described in the foregoing, usually the spectral characteristics of the light fixture 110 and the light sensor 120 should be known in order to simplify the control of the light fixture when regulating the illumination of the artwork 140, in particular for individual wavelengths/wavelength ranges. However, also the reflectivity of the artwork 140 itself varies the measurement result. Accordingly, in various embodiments, a plurality of measurements with different settings for the light fixture 110 are performed (e.g. with respect to the emitted brightness and color), e.g. by sending suitable control commands from the control system 130 to the light fixture 110, and each time the following data are stored:

- data identifying the respective setting of the light fixture(s) 110;

- data identifying the measured or calculated illumination of the artwork 140 (e.g. global and/or local intensity values, possibly for a plurality of wavelengths/wavelength ranges); and

- data identifying the measured reflected light 600 (e.g. global and/or local intensity values, possibly for a plurality of wavelengths/wavelength ranges).

For example, in various embodiments, the setting used during the calibration phase include one or more of:

- the maximum allowable/possible intensity (overall, i.e. as a sum of intensities at certain wavelengths, or for a certain wavelength);

- the preferred setting, as determined e.g. during the illumination simulation; and

- various settings with higher and lower intensity, which may be used to inter- or extrapolate the intensity for other settings.

Generally, in case the control system 130 only performs a monitoring operation of the illumination of the artwork 140, the setting of the light fixture(s) 110 need not be stored, i.e. these data are purely optional.

Accordingly, in various embodiments, the data obtained during the calibration phase may be stored in a dataset, such as a table of a database, thereby creating a mapping (sketched in the example after this list) between:

- the combination of the reflected illumination 600 measured via the sensor 120 and the (measured or calculated) actual illumination of the artwork 140; and

- optionally, the combination of the configuration settings for the light fixture(s) 110 and the illumination of the artwork 140, which may be used to vary the settings, e.g. in order to obtain a target illumination of the artwork 140.
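A minimal sketch of collecting such a dataset during the calibration phase (steps 610-618) follows; set_fixture(), reference_illumination() and measure_reflected() are hypothetical helpers wrapping the control commands and the sensors described above, named here only for illustration:

```python
def run_calibration(settings, set_fixture, reference_illumination,
                    measure_reflected):
    """Collect one record per fixture setting: (setting, actual
    illumination at the artwork 140, reflected intensity at sensor 120)."""
    dataset = []
    for setting in settings:                  # step 618: vary the setting
        set_fixture(setting)                  # step 610: illuminate
        actual = reference_illumination()     # step 612/614: at the artwork
        reflected = measure_reflected()       # step 616: reflected light 600
        dataset.append({"setting": setting,
                        "actual": actual,
                        "reflected": reflected})
    return dataset
```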

In various embodiments, during a training phase, the above data are used to calculate a specular and/or diffusive reflectance function of the object 140 as a function of the reflected illumination 600 measured via the sensor 120 and the (measured or calculated) actual illumination of the artwork 140.

In general, another mathematical function configured to determine the illumination of the artwork 140 as a function of the reflected illumination 600 as measured by the sensor 120 may also be determined during the training phase. For example, in various embodiments a machine learning algorithm, such as a neural network, is used to generate such a mathematical function from the dataset. For example, such a machine learning algorithm may be useful when local intensity values of illumination (optionally for a plurality of wavelengths) have to be determined from local intensity values of reflected light (optionally for a plurality of wavelengths). Additionally or alternatively, the dataset, e.g. in the form of a table, may be stored directly in a memory accessible by the light sensor 120 and/or the control system 130, such as the sensor database 218.
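As a simple illustration of the training phase, the following sketch fits an affine mapping actual ≈ a · reflected + b per wavelength range by least squares; this is one possible model only, standing in for the reflectance function, neural network or interpolated table mentioned above:

```python
import numpy as np

def fit_reflectance_model(reflected, actual):
    """reflected, actual: 1D arrays of paired calibration values for one
    wavelength range. Returns a function estimating actual from reflected."""
    A = np.stack([reflected, np.ones_like(reflected)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, actual, rcond=None)
    return lambda r: a * r + b

# toy calibration data: actual illumination = 5 * reflected + 5
estimate = fit_reflectance_model(np.array([10.0, 20.0, 40.0]),
                                 np.array([55.0, 105.0, 205.0]))
print(estimate(30.0))   # -> ~155.0
```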

Accordingly, as shown in Figure 21, during a normal operation phase, a data processing unit of the light sensor 120 or the control system 130 may receive at a step 630 the measured global and/or local intensity values for at least one wavelength from the light sensor 120 and use the mathematical function or dataset to determine/estimate the illumination of the artwork 140 (e.g. global and/or local intensity values for a single or a plurality of wavelengths or wavelength ranges) as a function of the measured reflected illumination of the artwork 140. For example, when using the dataset/table, the illumination may be estimated via interpolation or extrapolation of the stored data.

Accordingly, once the calibration/training is completed, the light sensor 120 may measure the reflected light 600 (an absolute value and/or a relative value thereof), and this measurement may be executed continuously.

As described in the foregoing (see e.g. the description of the first embodiment of a light sensor), the estimated illumination of the artwork 140 (or directly the measured data) may be used to verify and/or regulate the illumination of the artwork 140, e.g. in order to regulate the (global or local) illumination to a target illumination (intensity and/or color) and/or to verify whether the (global and/or local) illumination is below a respective maximum value (or maximum values for a plurality of wavelengths/wavelength ranges).

For example, in various embodiments, the measured changes in the reflected light intensity are used to control the intensity of the light fixture 110. In various embodiments, the control system 130 may obtain at a step 632 a reference value and compare at a step 634 the current illumination of the artwork 140 (overall or at certain wavelengths) with the reference value. For example, the reference value may be a previous measurement, or may be determined as a function of a plurality of previous measurements, such as an average of a given number of the last measurements, e.g. the last five measurements. However, any other target/reference value may also be used. For example, the use of the last measurements as reference value may be suitable in case the background light at the artwork 140 varies.

In various embodiments, in case the illumination varies (output “Y” of the verification step 634), the control system 130 may thus vary at a step 636 the settings of the light fixture(s) 110, e.g. vary the brightness thereof. For example, the control system 130 may increase the light intensity of the light fixture 110 when the reflectance decreases compared to the reference value, or may decrease the light intensity of the light fixture 110 when the reflectance increases. Conversely, if no change occurs or if the change is smaller than a certain threshold (output “N” of the verification step 634), the control system 130 may maintain at a step 638 the previous setting of the light fixture(s) 110.

The new measurement value may then be stored at a step 640 as reference value (or may be used to calculate the new reference value).

In general, the control system 130 may send the control command to change the light intensity of the light fixture 110 immediately in response to the detection of a change in reflectance, or with a certain delay, or only if the change in reflectance is higher than a certain threshold, which e.g. may be determined by the user depending on the illuminated object.
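The regulation loop of steps 630-640 might look as follows; the moving-average reference over the last five measurements matches the example above, while the 5% threshold and the proportional brightness correction are illustrative assumptions:

```python
from collections import deque

class ReflectanceController:
    def __init__(self, threshold=0.05, window=5):
        self.history = deque(maxlen=window)   # last measurements (step 640)
        self.threshold = threshold            # relative change triggering action

    def step(self, measured, brightness):
        """Return the new brightness setting for the light fixture 110."""
        if self.history:
            reference = sum(self.history) / len(self.history)   # step 632
            change = (measured - reference) / reference         # step 634
            if abs(change) > self.threshold:                    # step 636
                brightness *= reference / measured   # dim if reflectance rose
        self.history.append(measured)            # step 640: update reference
        return brightness                        # step 638: else keep setting

ctrl = ReflectanceController()
for m in (100.0, 100.0, 120.0):   # sudden increase in reflected light 600
    b = ctrl.step(m, brightness=1.0)
print(round(b, 3))                # -> ~0.833
```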

Additionally or alternatively, the control system 130 may compare at the step 634 the reflected light intensity with a maximum value. If the light intensity exceeds this threshold, the control system may generate a warning signal and/or reduce the power supply of the light sources 117. As described in the foregoing, the threshold values (or other sensitivity data) may be stored in a data storage unit, e.g. the artwork database 206, and/or may be determined as a function of an image of the artwork 140.

In various embodiments (see also Figure 19), the light sensor 120 may have a variable position. Specifically, as mentioned before, the reflectivity of the artwork 140 may also be a function of the angle of observation. Accordingly, in various embodiments, both during the calibration and the normal operation phase, the position of the light sensor 120 may be varied according to a given profile in order to acquire a sequence of a plurality of (global and/or local) intensity values of reflected light for at least one wavelength or wavelength range for respective positions of the light sensor 120, and the sequence of measurements may be used to determine the mathematical function or may be stored in the dataset. For example, the light sensor 120 may have associated for this purpose an actuator (schematically shown via a line 602 in Figure 19), e.g. controlled by the light sensor 120 or the control system 130, configured to vary the position of the light sensor 120 with respect to the artwork 140, e.g. the distance and/or angle of the light sensor 120 with respect to said artwork 140.

Accordingly, various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the artwork for at least one wavelength or wavelength range. Specifically, in various embodiments, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork for at least one wavelength or wavelength range and measuring via the light sensor the global and/or local light intensity values of the light reflected by the artwork; during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork; and during a normal operation phase, measuring via the light sensor the global and/or the plurality of local light intensity values of the light reflected by the artwork, and estimating via the mathematical function or the dataset the global and/or the plurality of local light intensities at the artwork as a function of the global and/or the plurality of local measured light intensity values of the light reflected by the artwork.

Possible embodiments of this solution are detailed at the following point "Example 6".

Third embodiment of light sensor

As mentioned before, when measuring the light 600 reflected by an artwork 140, the characteristics of the light 600 often vary with the angle of incidence and observation, both in brightness and in spectral response. Specifically, reflected light 600 has specular and diffusive components, which often are not easy to characterize. For example, paintings 140 often have a very uneven surface. This implies that the specular as well as the diffusive reflectivity of an object 140 may vary significantly depending on the incident direction of the light. Finally, the reflectance of an object 140 may also vary over time.

Thus, according to various embodiments, instead of estimating the illumination of an artwork/object 140 as a function of the light 600 reflected by the artwork 140 itself, the light 702 reflected by a reference surface 700 is used. Preferably, such a reference luminance target is an object of known spectral reflectance, preferably with respect to both the specular and the diffusive component.

For example, as shown in Figures 22 (front view) and 23 (top view), in various embodiments, a reference luminance target (RLT) 700 is installed in the proximity of the object 140 to be monitored. For example, in various embodiments, the reference target 700 may be installed next to the border of the object 140, preferably laterally, at a distance smaller than 1 m, preferably smaller than 50 cm, preferably between 5 and 30 cm. For example, the RLT 700 may be a tag having printed thereon the name of the respective object 140 or additional information about it.

In general, the RLT 700 has a given (preferably known) reflectivity pattern depending on the angle of incidence θ_i and the angle of reflection θ_r with respect to the normal of the surface 142 of the object 140. The reflectivity pattern can be divided into two components: a specular reflectivity R_s, which is non-zero only when θ_i = θ_r, and a diffusive reflectivity R_d.

The specular and diffusive reflectivity form the overall reflectivity:

R(θ_i, θ_r) = R_s(θ_r) + R_d(θ_i, θ_r)

As mentioned before, the RLT 700 preferably has a known diffusive and preferably also a known specular reflectivity. In various embodiments, the RLT 700 has a surface with minimized specular reflection and maximized diffusive reflection. Such a surface 700 may be made of white paint, such as barium sulfate (BaSO4), or of polymers with diffusive particles, such as polycarbonate, PMMA or silicone with Al2O3 or TiO2. In this way, the RLT 700 is (approximately) a Lambertian emitter over the complete spectral range of interest. The apparent brightness of a Lambertian surface to an observer is (approximately) the same regardless of the observer's angle of view.
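The following sketch illustrates the reflectivity model above for an (idealized) Lambertian RLT; the function and parameter names are illustrative only:

```python
def reflectivity(theta_i, theta_r, r_s=0.0, r_d=1.0, tol=1e-9):
    """Overall reflectivity R(theta_i, theta_r) = R_s + R_d (sketch).
    The specular term contributes only when theta_i equals theta_r;
    an ideal Lambertian RLT has r_s = 0 and a direction-independent r_d."""
    specular = r_s if abs(theta_i - theta_r) < tol else 0.0
    return specular + r_d

# For a Lambertian RLT the value seen by the sensor is thus the same
# for any observation angle theta_r:
assert reflectivity(0.3, 0.7) == reflectivity(0.3, 0.1)
```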

In various embodiments, it is ensured that the RLT 700 and the object 140 are illuminated by the same light fixture(s) 110, or by light fixtures 110 or light sources 117 with identical settings, so that the values measured for the RLT are indicative of the illumination of the object 140. Furthermore, in various embodiments, shadowing of the RLT 700 and the sensor 120 is avoided, whether caused by the object 140 (or other items) or by visitors.

Accordingly, the light sensor 120 described with respect to the second embodiment of light sensor is now used to measure a global and/or a plurality of local light intensity values of the light 702 reflected by the reference luminance target 700 for at least one wavelength or wavelength range. For example, also in this case, the light sensor 120 may be a 2D camera or a photodiode.

Also in this case, a calibration, a training and a normal operation phase are thus used to determine the global and/or plurality of local light intensities at the artwork 140 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the reference luminance target 700.

In general, when the RLT 700 is a Lambertian emitter, the light sensor 120 may be installed in a fixed position, because the measured global and/or plurality of local light intensity values are the same regardless of the incident angles of the light 500 from the light fixture(s) 110 and of the observation angle (indicated as θ_r-c in Figure 23). However, as schematically shown in Figure 23, the light sensor 120 may also have a variable position, i.e. in various embodiments, both during the calibration and the normal operation phase, the position of the light sensor 120 may be varied according to a given profile in order to acquire a sequence of a plurality of (global and/or local) intensity values of reflected light for at least one wavelength or wavelength range for respective positions of the light sensor 120, and the sequence of measurements may be used to determine the mathematical function or may be stored in the dataset. For example, also in this case, the light sensor 120 may for this purpose have an associated actuator (schematically shown via a line 602 in Figure 23), e.g. controlled by the light sensor 120 or the control system 130, configured to vary the position of the light sensor 120 with respect to the artwork 140, e.g. the distance and/or angle θ_r-c of the light sensor 120 with respect to said artwork 140.

Specifically, in various embodiments, as shown in Figure 24, during the calibration phase, the artwork 140 and the RLT 700 are illuminated at a step 710 (essentially corresponding to the step 610) by at least one light fixture 110. Next, the global and/or plurality of local light intensities at the artwork 140 and/or the RLT 700 are obtained at a step 712 (essentially corresponding to the step 612 or 614), e.g. by directly measuring the illumination of the object 140/RLT 700 or by calculating/estimating the illumination of the object 140/RLT 700. For example, by replacing the artwork 140 with the RLT 700, the same methods as described with respect to the steps 612 and 614 may be used.

Next, the reflected light intensity is measured at a step 714 (essentially corresponding to the step 616) via the light sensor 120. As described in the foregoing, the light sensor 120 may provide a global and/or a plurality of local light intensity values for a single or a plurality of wavelengths or wavelength ranges.

In various embodiments, also in this case, the settings of the light fixture(s) 110 may be varied at a step 716 (essentially corresponding to the step 618) and the steps 710-714 may be repeated. Accordingly, the calibration phase is used to acquire a dataset, wherein each item of the dataset comprises: a global and/or a plurality of local light intensities at the artwork 140 and/or the RLT 700 (preferably for a plurality of wavelengths or wavelength ranges); and the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700 (preferably for a plurality of wavelengths or wavelength ranges). An illustrative layout of such a dataset item is sketched below.
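For illustration, one dataset item could be laid out as follows; the field names are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class CalibrationItem:
    """One item of the calibration dataset (illustrative layout)."""
    fixture_settings: dict       # settings applied at the steps 710/716
    intensities_at_target: dict  # wavelength range -> intensity at artwork 140 / RLT 700
    measured_reflected: dict     # wavelength range -> value measured by the sensor 120

item = CalibrationItem(
    fixture_settings={"dim_level": 0.8},
    intensities_at_target={"450-500nm": 120.0},
    measured_reflected={"450-500nm": 34.2})
```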

Accordingly, during a training phase 720, a mathematical function or dataset is determined, which makes it possible to:

- directly calculate/estimate (via the mathematical function or the dataset, e.g. via interpolation or extrapolation) the global and/or the plurality of local light intensities at the artwork 140 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700; or

- calculate/estimate (via the mathematical function or the dataset, e.g. via interpolation or extrapolation) the global and/or the plurality of local light intensities at the RLT 700 as a function of the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700, and then calculate (more or less in line with the first embodiment of a light sensor) the global and/or the plurality of local light intensities at the artwork 140 as a function of:
  o the calculated/estimated global and/or plurality of local light intensities at the RLT 700,
  o data identifying the geometrical position of the RLT 700, the artwork 140 and the light fixture 110, and
  o optionally the radiation pattern of the light fixture(s) 110.

A minimal sketch of the first option is given below.
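This sketch assumes the calibration data can be approximated by an affine model; the numeric values are purely illustrative:

```python
import numpy as np

# Measured values of the light 702 reflected by the RLT 700 for three
# fixture settings, and the corresponding intensities at the artwork 140
# obtained during the calibration phase (illustrative numbers).
measured = np.array([[34.2], [51.0], [68.1]])
at_artwork = np.array([120.0, 180.0, 240.0])

# Fit an affine model a*x + b by least squares (training phase 720).
A = np.hstack([measured, np.ones((len(measured), 1))])
coeffs, *_ = np.linalg.lstsq(A, at_artwork, rcond=None)

def estimate_artwork_intensity(reflected_value):
    """Estimate the intensity at the artwork 140 from a new RLT measurement."""
    return coeffs[0] * reflected_value + coeffs[1]
```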

Generally, due to the fact that the dataset may comprise both data identifying the illumination of the artwork 140 and data identifying the illumination of the RLT 700, the function used to calculate the illumination of the artwork 140 as a function of the (estimated) illumination of the RLT 700 may also be determined as a function of the dataset, e.g. by training a machine learning algorithm, such as an artificial neural network.

Accordingly, the calibration phase also takes into account the optical transfer function of the light sensor 120, such as the wavelength dependence of the camera (e.g. of the lens and sensor elements) and the angle dependence of the camera (e.g. of the lens, the sensor elements or the iris), for instance if the camera has a strong field distortion or field curvature.

In general, a camera 120 will likely measure both the light 702 reflected by the RLT 700 and the light 600 reflected by the artwork 140. For this reason, it is convenient to perform a plurality of measurements for different settings of the light fixture(s) 110 and/or to vary the position of the light sensor 120.

In various embodiments, during a normal operation phase 730 (essentially corresponding to steps 630-640), a data processing unit of the lighting system 100, e.g. the data processing unit 123 or 133, may thus receive the measured global and/or plurality of local light intensity values of the light 702 reflected by the RLT 700 and determine via the mathematical function or dataset the global and/or the plurality of local light intensities at the artwork 140. For example, the data processing unit may first estimate the illumination of the RLT 700 and then calculate the illumination of the artwork 140 as a function of the geometrical position of the RLT 700, the artwork 140 and the light fixture 110.
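A sketch of such a monitoring loop is given below; `read_rlt_reflected` and `estimate` stand for the sensor read-out and the trained mapping, and are assumptions of this sketch:

```python
import time

def normal_operation(read_rlt_reflected, estimate, max_intensity, period_s=300):
    """Normal operation phase 730 (sketch): periodically measure the light
    702 reflected by the RLT 700, estimate the intensity at the artwork 140
    via the trained mapping and compare it with a stored maximum. Runs
    until interrupted."""
    while True:
        estimated = estimate(read_rlt_reflected())
        if estimated > max_intensity:
            print("warning: estimated intensity at the artwork above maximum")
        time.sleep(period_s)  # e.g. every 5 minutes
```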

In various embodiments, the intensity of the reflected light may thus be measured continuously or periodically, e.g. every 5 minutes or every hour.

Also in this case, the measurement may be performed for the overall illumination of the artwork 140/RLT 700, i.e. the illumination provided by the light fixture(s) 110 and the background illumination, and/or the measurement may be performed only for the background illumination (i.e. with the light fixture(s) 110 switched off). For example, the background illumination may be measured when the exposition area 160 is closed or when no visitors are detected in the exposition area 160. For example, for this purpose the opening hours of the exposition area 160 may be stored, e.g. in the exposition area database 204. Alternatively, in various embodiments, an RLT 700 and a light sensor 120 are used to measure only the background light, e.g. by placing an RLT 700 in a position that is not illuminated by the light fixture(s) 110.

As described in the foregoing, during the normal operation phase 730, various operations may be performed as a function of the global and/or plurality of local light intensities at the artwork (possibly for a plurality of wavelengths or wavelength ranges). In general, reference can be made to the description of the steps 630-640. For example, the global and/or local intensities may be compared with one or more thresholds, and the control system 130 may adapt the brightness of the light fixture accordingly.

Similarly, the exposition area may comprise means (e.g. an actuator) configured to vary the background illumination, e.g. by using roller shutters to reduce the background light entering through a window 164. Accordingly, the control system 130 may also adapt the brightness of the ambient light as a function of the global and/or local intensities at the artwork 140. Generally, a similar operation may also be performed in any other lighting system 100 described herein.

In various embodiments, the lighting system 100 may also comprise a plurality of reference luminance targets 700 placed around the same object 140, making it possible to interpolate the light intensity at the object 140. For example, this may be useful when the radiation pattern of the light fixture(s) is unknown.

Additionally, instead of moving the light sensor 120, several light sensors 120 monitoring the same RLT 700 (or different RLTs associated with the same object 140) could be used.

Although the embodiments focused primarily on Lambertian diffusers, other RLTs could also be used, for instance:

- using the same material for the RLT 700 as for the object 140;

- using a non-null specular reflectivity component, i.e. a diffusive reflector with a controlled specular component in addition to the Lambertian one, to monitor the source from a very specific direction;

- using a purely reflective component to monitor the source from a very specific direction;

- using a diffusive component that is not Lambertian, for instance components which have a dominant diffusive reflectivity in a specific direction; this could be useful when objects 140 are lit by a light fixture 110 not from the front, but from a long distance and with a high angle of incidence, or when they receive light from a dominant direction, like a big window.

In various embodiments, the RLT 700 may also comprise surfaces of several materials with different features. The RLT could also have a non-flat spectral reflectivity. This could be used to enhance or reduce the reflectivity of a specific wavelength or in a specific wavelength range.

Accordingly, various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with a lighting system comprising one or more light fixtures configured to emit light with variable characteristics as a function of a control command, wherein a reference luminance target is installed in proximity of the artwork, whereby the reference luminance target is illuminated with the light emitted by the one or more light fixtures, and wherein a light sensor is installed in the exposition area in order to measure a global and/or a plurality of local light intensity values of the light reflected by the reference luminance target for at least one wavelength or wavelength range. Specifically, in various embodiments, the method comprises the steps of: during a calibration phase, obtaining a global and/or a plurality of local light intensities at the artwork and/or at the reference luminance target for at least one wavelength or wavelength range, and measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target; during a training phase, determining a mathematical function and/or a dataset adapted to estimate the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity values of the light reflected by the reference luminance target; and during a normal operation phase, measuring via the light sensor the global and/or plurality of local light intensity values of the light reflected by the reference luminance target and estimating the global and/or plurality of local light intensities at the artwork as a function of the measured global and/or plurality of local light intensity values of the light reflected by the reference luminance target.

Possible embodiments of this solution are detailed at the following point "Example 7".

Reducing ageing effects via machine learning

As described in the foregoing, in a gallery or art museum, paintings and other pieces of artwork are illuminated by natural and/or artificial light sources in order to present the artwork, e.g. according to a given requested illumination having a good color rendition for a viewer.

In general, an artwork 140 may be illuminated with various kinds of light sources at the same time, under different irradiating angles and beam diameters. A light fixture may encompass pre-set or adjustable light sources 117 (intensity, color, orientation of irradiation, etc.), a variety of optical elements 115 (e.g. lenses, diffusers, color filters), sensors 120 (such as sensors for temperature, humidity, light intensity and color, as well as sensors for people tracking, e.g. by using IR radiation (emission and sensing)), lighting control units (drivers and controllers for light sources and light/color sensors, as well as for actuators for the movement and orientation of light sources and optical elements), and the like.

As described in the foregoing, a light fixture 110 may comprise a control unit 113 and a driver 116 for a light source 117, such as one or more LEDs. The control unit 113 and the driver 116 may be configured to adjust the light spectrum of the light emitted by the light sources 117 to a certain value. For example, in various embodiments, the light fixture 110 may be configured (via a suitable selection of light sources 117, and configuration of the driver 116 and the data processing unit 113) to adjust the color temperature of the combined light emitted by the light sources 117, so that the color point lies preferably on or near the Planck Curve (CIE color diagram). For example, the light fixture 110 may be configured to adjust and keep the resulting color temperature on and along the Planck Curve, for example in the range between 1800 K and 6000 K, with a deviation from the Planck Curve of not more than 2 MacAdam ellipses. However, the light fixture may as well be configured to adjust color coordinates within a color space area that is not limited to the Planck Curve.

Thus, in various embodiments, the light fixtures 110 are configured to provide a tunable light output, in particular a tunable white light output. For example, as described in the foregoing, such a color regulation function may be implemented by using light sources 117 emitting light with different colors and regulating/dimming the relative brightness of the light sources 117. Such a dimming operation may be performed by using PWM or other dimming methods. Generally, the color regulation may be performed via a feedback/closed-loop control, e.g. by using a sensor providing a measure of the color temperature of the light emitted by the light fixture 110, or via a feed-forward/open-loop control, e.g. by using a look-up table having stored the dimming levels of the light sources 117 for a given requested color temperature.
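By way of example, such a feed-forward control with a look-up table might be sketched as follows; the table values and the two-channel (warm/cold) assumption are illustrative only:

```python
# Look-up table: requested color temperature [K] -> (warm duty, cold duty).
DIM_TABLE = {1800: (1.00, 0.00), 3000: (0.70, 0.30),
             4000: (0.45, 0.55), 6000: (0.00, 1.00)}

def duty_cycles(cct):
    """Interpolate the stored dimming levels (e.g. PWM duty cycles) of two
    light sources 117 for a requested color temperature between 1800 K and
    6000 K (open-loop/feed-forward sketch)."""
    points = sorted(DIM_TABLE)
    lo = max(p for p in points if p <= cct)
    hi = min(p for p in points if p >= cct)
    if lo == hi:
        return DIM_TABLE[lo]
    t = (cct - lo) / (hi - lo)
    return tuple((1 - t) * a + t * b
                 for a, b in zip(DIM_TABLE[lo], DIM_TABLE[hi]))

print(duty_cycles(5000))  # -> approximately (0.225, 0.775)
```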

In general, paintings and other artworks 140 may also undergo ageing effects. For example, such ageing effects of the color pigments of artworks may include a darkening or a color shift, e.g. due to temperature changes and/or increased levels of humidity, and/or color bleaching and/or textile (canvas) damage, e.g. due to natural and artificial light with a high blue or even UV content.

In general, a pigment is a material that changes the color of reflected or transmitted light as the result of wavelength-selective absorption. Most materials selectively absorb certain wavelengths of light. Usually, pigments are selected to have a high tinting strength relative to the materials they color. Usually, once the pigments have been applied, they should be stable in solid form at ambient temperatures.

For industrial applications, as well as in the arts, permanence and stability are desirable properties. Pigments that are not permanent are called fugitive. Fugitive pigments fade over time, or with exposure to light, while some eventually blacken. Pigments are used for coloring paint, ink, plastic, fabric, cosmetics, food, and other materials. Most pigments used in manufacturing and the visual arts are dry colorants, usually ground into a fine powder. For use in paint, this powder is added to a binder (or vehicle), a relatively neutral or colorless material that suspends the pigment and gives the paint its adhesion. A distinction is usually made between a pigment, which is insoluble in its vehicle (resulting in a suspension), and a dye, which either is itself a liquid or is soluble in its vehicle (resulting in a solution). A colorant can act as either a pigment or a dye depending on the vehicle involved. In some cases, a pigment can be manufactured from a dye by precipitating a soluble dye with a metallic salt. The resulting pigment is called a lake pigment. The term biological pigment is used for all colored substances independent of their solubility.

Thus, in various embodiments, the lighting system 100 is also configured to determine and monitor possible ageing effects of the artwork(s) 140 and to adapt the settings of the light fixture(s) to provide optimized lighting conditions.

Figure 38 shows an embodiment of a lighting system 100 configured to monitor ageing effects of artworks 140.

As already explained, a piece of artwork 140 may be illuminated by natural and/or artificial light as provided by lighting fixtures 110. Such lighting fixtures 110 may employ a variety of light sources 117 and optical components 115 like lenses and filters. The combined light 500 emitted by the light fixture 110 may be characterized, e.g., by spectral distribution, intensity, beam spread, and orientation (incidence of light). The lighting fixtures 110 may differ from each other, and so may their incidence of lighting. This means that the lighting conditions may vary from one illuminated artwork 140 to another, making it difficult to compare the effect they have on the degradation of the illuminated painting. Furthermore, the total time of illumination, i.e. the integral amount of light per day or per period, also plays a role, adding further complexity to the matter.

In various embodiments, in order to determine the ageing of an artwork, the artwork 140 is illuminated with a given (reference) lighting condition and a graphical image is obtained via a suited camera 120₁ providing pixelized image data, where each (usually color-filtered) pixel or subpixel of the image provides information about the intensity and color coordinates of the optically related sensed area of the painting. This means that the artwork is measured pixel-wise in color and intensity. The pixels are optically related to small areas (cells) of the painting, so each pixel datum is related to a reflected cell spectrum. For example, in various embodiments, the reflected light 600 of each area (cell) of the painting 140 is measured with the camera 120₁, such as a CMOS or CCD camera, using (standardized) color filters in front of the sensor chips, for example RGB in a Bayer setting, or other color filter variants like RGEB (red, green, emerald, blue) or one that employs two kinds of green filters, and the like. For a more detailed description of how to monitor an artwork via a camera, reference can be made, e.g., to the description of Figure 19.
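For illustration, the pixel-to-cell relation might be computed as follows on a de-Bayered RGB image; the cell size is an arbitrary assumption of this sketch:

```python
import numpy as np

def cell_colors(image_rgb, cell=16):
    """Divide a (de-Bayered) camera image into small cells and return the
    mean RGB value of each cell, approximating the per-cell color and
    intensity measurement discussed above."""
    h, w, _ = image_rgb.shape
    h, w = h - h % cell, w - w % cell    # crop to a multiple of the cell size
    cells = image_rgb[:h, :w].reshape(h // cell, cell, w // cell, cell, 3)
    return cells.mean(axis=(1, 3))       # shape: (rows, cols, 3)
```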

As described in the foregoing, image taking (e.g. with a CCD/CMOS camera) may be done under various lighting situations and/or under various positions and angles with respect to the illuminated artwork 140. This means that the image measurement (image taking) is functionally related to the artwork illumination (ambient and artificial) and to the reflectivity features of the artwork 140. Such measurements will result in a large number of (digitized) camera pixel data for each painting 140.

Generally, as also described in the foregoing, the measurements also depend on the image measurement characteristics, i.e. the properties of the camera 120₁. For example, a CCD/CMOS camera needs to have filter segments placed in front of the sensor chips, for example RGB filters in a Bayer configuration/setting, in order to allow for color perception and the respective measurement. However, the filter segments, the CCD/CMOS chips and the signal processing will show some kind of variation that needs to be taken into account. In this respect, the optical characteristics of the camera may be expressed via an Image Transfer Function or Optical Transfer Function.

Accordingly, the measurements may be represented by a (possibly multi-dimensional) Digital Data Set taking all these conditions into account.

In various embodiments, the acquired image data set is stored in a database. Thus, by storing a sequence of image data sets over time, the evolution of given characteristics of the data set may be analyzed, e.g. via image analysis. Substantially, the stored data sets represent historical data of the evolution (ageing) of the color data of the artwork 140.

Specifically, in various embodiments, the historical data of a plurality of artworks 140 are stored in a remote database, e.g. in a cloud. Specifically, as shown in Figure 38, the lighting system may comprise: a remote control system 130R connected to a Wide Area Network WAN, e.g. the Internet, wherein the remote control system 130R comprises a remote data processing unit 133R and a remote data storage 132R; and a local control system 130L (i.e. located in the exposition area 160 or in the vicinity of the exposition area 160) configured to manage the operation of the light fixture(s) 110 in the exposition area 160.

Generally, the control system 130L is purely optional, because the light fixture(s) 110 may also be connected directly to the Wide Area Network WAN.

Accordingly, in various embodiments, the historical data of a plurality of artworks 140 may be stored in the remote, e.g. cloud-based, data storage 132R.

In various embodiments, the data processing unit 133R is configured to compare the historical data of a given artwork 140 with the historical data of other artworks 140. Specifically, in various embodiments, due to the fact that the artworks 140 are different, given features may be extracted from the historical data sets. Generally, as an alternative to or in addition to storing the image data sets, the image data may also be pre-processed (e.g. in order to extract the features) and the processed image data may be stored in the database.

Generally, a significant number of methods are known for determining artworks having similar properties. For example, in various embodiments, in order to compare different artworks 140, the pixel data may be grouped so that they represent color-coded information of a contiguous larger area of a painting, i.e. a larger cell area. For example, by selecting a plurality of reference colors, one or more sub-areas of the pixels in an image may be selected which have a similar color, e.g. which lie between a given upper and lower threshold with respect to the respective reference color. For example, this permits determining images having similar colors, irrespective of whether the positions and dimensions of the respective areas are similar. However, in various embodiments, the dimensions of the respective areas of pixels may also be stored, which e.g. permits selecting artworks having a similar overall color usage.
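A simple sketch of this grouping, assuming per-cell RGB values and a Euclidean color tolerance (both assumptions of this sketch, not of the disclosure):

```python
import numpy as np

def color_area_fractions(cell_rgb, reference_colors, tol=30.0):
    """For each reference color, compute the fraction of cells whose color
    lies within a tolerance of that reference, yielding a simple feature
    vector for comparing artworks."""
    flat = cell_rgb.reshape(-1, 3).astype(float)
    fractions = {}
    for name, ref in reference_colors.items():
        dist = np.linalg.norm(flat - np.asarray(ref, dtype=float), axis=1)
        fractions[name] = float((dist < tol).mean())
    return fractions

features = color_area_fractions(
    np.random.randint(0, 256, (40, 60, 3)),
    {"vermilion": (227, 66, 52), "ultramarine": (18, 10, 143)})
```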

Generally, other data may also be relevant in order to determine similar artworks. For example, one or more of the following characteristics may be stored for each artwork: the artist, the epoch, the type of artwork (old master, modern art, etc.), the type of pigments used (e.g. for each reference color) or even the ingredients used for the pigments. As mentioned before, such data may already be stored in the artwork database 204.

Similarly, the actual and historical conditions of the artwork 140 may also be taken into account, such as actual and/or historical data of the environment in which the artwork 140 has been exposed (which may also relate to a sequence of different exposition areas in case the artwork 140 is relocated), such as one or more of the following data, which may be provided by one or more sensors 120₂ installed in the exposition area 160:

- the temperature in the exposition area 160 and/or the temperature of the artwork 140;

- the humidity in the exposition area 160 and/or at the artwork 140;

- the illumination (possibly for a plurality of wavelengths or wavelength ranges) of the artwork 140, which may include instantaneous and/or cumulative values; possibly correlated with circadian lighting conditions, which may be measured or estimated, e.g. as a function of the position of the exposition area 160 (see also the respective description of light sensors).

Thus, in various embodiments, the following data are stored in the central database:

- data identifying the ageing of the pigments of the artworks 140 (as determined as a function of the images of the artwork 140);

- data identifying the characteristics of the artworks 140, which e.g. permits determining similar artworks 140; and

- data identifying the historical data of the exposition of the artworks 140, in particular with respect to the illumination of the artwork 140.

In addition, the camera data (e.g. type of sensors, color filters, optical components, etc.) may also be stored for each artwork 140 in the central database.

For example, in various embodiments, the remote processing system 130R may receive the following data, which may e.g. already be stored in one or more of the databases 200:

- data identifying the artwork 140, preferably including pigment identification data, wherein the pigment identification data preferably comprise a matrix containing pigment identification data for bi-dimensional positions of the artwork 140;

- data identifying a requested illumination for the artwork 140, such as respective light intensity values for a plurality of wavelengths or wavelength ranges, or a color temperature;

- optionally data identifying the characteristics of the camera 120₁;

- optionally data identifying the light fixtures 110.

In various embodiments, the remote processing system 130R may receive from the local processing systems 130 (or directly from the sensors 120₁ and 120₂) the following measured data:

- the image data or the processed image data provided by the camera 120₁, which are indicative of the ageing of the pigments of the artwork 140, such as the values of the reflectivity of given areas (associated with given pigments) of the artwork 140;

- the other measured data provided by the sensors 120₂, such as the light intensity values for a plurality of wavelengths or wavelength ranges, and optionally one or more of: temperature, humidity, oxygen level of the air, location of the artwork.

The above data thus permit defining the (integral) environment (lighting, temperature and so on) of the artworks 140. In various embodiments, the data stored in the central database are processed via a machine learning method, in particular a Deep Learning / Artificial Intelligence (AI) program. Such an analysis, based on supervised or unsupervised AI programs, permits determining functional relationships between a plurality of parameters that are derived from the big data set (BDS). Moreover, the data may be processed in order to determine the best lighting conditions for each piece of artwork. Best lighting conditions may mean less damaging lighting conditions, or less damaging lighting conditions with a still acceptable or user-defined tolerance range of the color rendition of an artwork.

Figure 39 shows in this respect an embodiment of the operation of the remote processing system 130R.

Generally, in the embodiment considered, the database 132R has stored one or more datasets for each of a plurality of already monitored artworks 140. For example, for this purpose, the processing system 130R may receive one or more datasets for each of a plurality of artworks 140. Specifically, in various embodiments, each dataset comprises:

- data identifying a list of pigments of the respective artwork 140;

- data identifying the illumination of each pigment of the list of pigments during a given time period;

- data identifying the ageing of each pigment of the list of pigments during the given time period.

For example, as described in the foregoing, the data identifying a list of pigments of an artwork 140 may be a bi-dimensional matrix having stored data identifying the pigment in a given horizontal and vertical position of the artwork. Moreover, as described in the foregoing, the data may identify at least one of: a pigment color, a pigment type, a pigment material, etc. For example, as mentioned before, such a list of pigments of an artwork 140 may be inserted manually or determined by analyzing an image of the artwork 140. For example, the color of given areas of the artwork 140 may be determined via image processing, and the color of a given area may then be associated with a given pigment type. For example, the pigment type for a given color may be entered manually or determined automatically, e.g. based on a table having stored pigment types for given artwork types, such as Old Master, Modern Art, etc., or an artist name.

Generally, due to the fact that the pigment type data are per se static, these data may be stored once for each artwork 140, and the data identifying the list of pigments of the artwork 140 may simply identify the respective artwork 140, e.g. via a respective univocal artwork code.

In various embodiments, the data identifying the illumination of each pigment of the list of pigments during a given time period may be determined by determining the global illumination of the artwork or determining the local illumination of given areas of the artwork. Possible solutions for determining a global or local illumination of an artwork 140 have already been described for the previous light sensors, such as based on the power supply parameters of the light sources 117, the light emitted by the light sources 117, the light received at the artwork 140, the light reflected by the artwork 140 or the light reflected by a reference surface.

In various embodiments, the data identifying the ageing of the pigments may be determined by acquiring images of the artwork 140 and by processing the images via an image analysis, e.g. by determining the variation of the pixel data of two images taken at the beginning and at the end of the time period. However, other methods for monitoring the variation of given properties of the pigments may also be used to determine/estimate the ageing of the pigments.

After a start step 1000, the processing system 130R receives at a step 1002 data identifying the list of pigments of the artwork 140 to be illuminated. Generally, also in this case, the data may already be stored in a database, such as the artwork database 204 (which may also be stored in the remote database 132R), and the data may comprise a univocal artwork code identifying the artwork 140 to be illuminated.

Generally, the processing system 130R may also receive the data identifying the illumination of each pigment and the respective ageing for the artwork 140 to be illuminated and store these data in the database 132R. Accordingly, the data of the artwork to be illuminated may also be added to the historical data in the database 132R.

Next, the processing system 130R determines a maximum illumination threshold for the illumination of the artwork 140 as a function of the list of pigments of the artwork and the datasets stored in the database 132R.

For example, in various embodiments, the processing system 130R selects at a step 1006 a pigment of the list and uses the historical data of the artworks 140 stored in the data storage 132R in order to determine the best light spectrum that minimizes the ageing of the pigment, i.e. the maximum illumination for the respective pigment.

For example, in various embodiments, the method may perform the following operations at the step 1006:

- processing the datasets of the artworks 140 in order to determine the datasets having similar pigments; as mentioned before, for this purpose the pigments may be classified according to color, epoch, material, etc.;

- processing the data of the selected datasets in order to determine correlations between the selected pigment, the illumination of the pigments (e.g. the respective spectrum of the light illuminating it) and the changes occurring to the pigments over a certain amount of time.

For example, in various embodiments, at the second step, a feature selection is performed in order to select a set of most relevant features which are linked to the ageing of the selected pigment. Specifically, for the selected pigment type, a set of features is selected which includes at least data identifying the spectrum of the light used to illuminate the selected pigment. For example, these features may include light intensity values for a plurality of wavelengths or wavelength ranges. Generally, the features may also include one or more other data identifying the historical data of the exposition of the artworks 140, such as the temperature and humidity data.

Thus, by using per se known solutions, such as Principal Component Analysis, or by calculating a Probability Density Function (PDF), the most influencing features may be selected; in particular, a set of wavelengths or wavelength ranges is determined which is strongly correlated with the ageing of a given pigment. However, typically, ageing is also strongly influenced by temperature and humidity, i.e. the selected features may include the temperature and/or humidity in the exposition area or directly at the artwork 140. Generally, this step is purely optional, because the method may also use a fixed set of features, such as a pre-selected set of features, e.g. the set of wavelengths or wavelength ranges may include a set of predetermined wavelengths or wavelength ranges, possibly ranging from IR to UV.
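As an illustrative alternative to a full Principal Component Analysis, the features could for instance be ranked by their correlation with the observed ageing index; this sketch assumes the historical feature vectors and ageing indices are available as arrays:

```python
import numpy as np

def select_features(X, ageing, names, k=3):
    """Rank candidate features (e.g. intensity per wavelength range,
    temperature, humidity) by the absolute Pearson correlation of each
    column of X with the ageing index, and keep the k most influencing."""
    Xc = X - X.mean(axis=0)
    yc = ageing - ageing.mean()
    corr = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12)
    order = np.argsort(-np.abs(corr))
    return [names[i] for i in order[:k]]
```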

In various embodiments, once having determined the most influencing features, the data of the respective features may be processed in order to determine maximum threshold values for the illumination of the selected pigment with these wavelengths or wavelength ranges, which do not generate (a significant) ageing of the artwork 140.

For example, in various embodiments, the data are processed via a machine learning method which receives as input the data of the selected features and provides as output the ageing index. Thus, in various embodiments, the machine learning method generates an ageing model of the selected pigment based on the historical data stored in the central database. For example, the machine learning method may be an Artificial Neural Network (ANN) or a Support Vector Machine.
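A minimal sketch of such an ageing model, here with a small scikit-learn neural network trained on stand-in data (the data and network size are assumptions of this sketch, not of the disclosure):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 5))            # historical feature vectors (stand-in)
y = 0.5 * X[:, 0] + 0.3 * X[:, 1]   # stand-in for the observed ageing index

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0).fit(X, y)

def ageing_index(features):
    """Simulate the ageing of the selected pigment for candidate inputs."""
    return float(model.predict(np.asarray(features).reshape(1, -1))[0])
```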

Accordingly, by changing the data at the input, the method may simulate the ageing of the selected pigment and select a combination of input values that does not result in a significant ageing of the selected pigment. For example, due to the fact that humidity and temperature are usually not controllable, e.g. because the temperature is fixed to a constant value, such as 21 °C, and humidity is usually not controlled, the method may vary the input of the light intensity values for the selected set of wavelengths or wavelength ranges, and select a combination of values for which the ageing index at the output of the machine learning method is below a given maximum threshold value.

Accordingly, the method may also be used to adapt the illumination of the artwork to different ambient conditions, e.g. a different temperature or humidity in the exposition area 160, because the actually measured temperature and humidity values may be used as input for the ageing model.

More specifically, due to the fact that a pigment should be illuminated with a given requested illumination (i.e. spectrum), the relative intensity values are linked. Thus, it is often sufficient to vary (e.g. increase) a single intensity value, calculate the other intensity values, monitor the respective ageing value at the output of the machine learning method and select the set with the highest intensity values for which the ageing value is still below a given maximum value, as sketched below.
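A sketch of this search, reusing the hypothetical `ageing_index` model above and assuming the spectrum may simply be scaled while keeping the relative intensities fixed:

```python
import numpy as np

def max_allowed_spectrum(base_spectrum, fixed_features, ageing_index,
                         max_ageing, scales=np.linspace(0.1, 2.0, 40)):
    """Scale the requested spectrum (relative intensity values stay linked),
    query the ageing model for each candidate and keep the brightest
    spectrum whose predicted ageing index stays below the maximum value
    (assuming ageing grows monotonically with intensity)."""
    best = None
    for s in scales:                      # scales are tried in increasing order
        candidate = s * np.asarray(base_spectrum, dtype=float)
        if ageing_index(list(candidate) + list(fixed_features)) <= max_ageing:
            best = candidate              # last feasible = brightest feasible
    return best
```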

The same operation may then be performed for the other pigments of the artwork 140, which is schematically shown via a verification step 1008. Specifically, in case not all pigments/areas of pigments have been processed (output “N” of the verification step 1008), the method selects a next pigment/area of pigments and returns to the step 1006. Conversely, in case all pigments/areas of pigments have been processed (output “Y” of the verification step 1008), the method proceeds to a step 1010.

Specifically, at the step 1010, the processing system 130R determines the best light spectrum that minimizes the overall changes/ageing of all the pigments of an artwork 140. For example, for this purpose, the processing system 130R may select the minimum values of the (maximum) values for the various pigments as selected at the step 1006.

In various embodiments, before the method terminates at a stop step 1014, the method may transmit at a step 1012 the selected maximum values to the control system 130L or directly to the light fixture(s) 110 in order to adapt the light emitted by the light fixture(s) 110. For example, as described in the foregoing, based on the maximum illumination values for the various wavelengths or wavelength ranges, the lighting system 100, e.g. the control system 130L, may change the settings of the light fixture(s) 110 either in a feed-forward configuration or in a feedback configuration by monitoring the illumination of the artwork 140 via the light sensor 120₂.

As mentioned before, preferably, the above data are stored in a central, e.g. cloud-based, database 132R. For example, in this way, the data of artworks from a plurality of museums and art galleries may be acquired. Similarly, museums, art galleries and even private owners of artworks may access the database in order to optimize the lighting and other conditions of their artworks. In various embodiments, the control system 130 or directly the light fixtures 110 may be configured to communicate with the central database and the AI program in order to automatically receive the respective lighting settings.

Of course, the described methods and the application of AI programs and subsequent optimization of the local lighting are not just limited to paintings, but can be applied to all pieces of artwork. For example, in the case of statues or generally three-dimensional artworks, the pigment data, the (requested and historical) illumination data and the ageing data may indeed relate to areas of the surface of the artwork 140. For example, in this case, the respective data may be mapped on the surface of a respective three-dimensional model of the artwork 140. Generally, the 3D model of the surface of the artwork 140 may be obtained, e.g., via a 3D scanner, or may be calculated based on a depth map determined as a function of the images acquired by the camera 120₁. For example, as well known in the art, such depth maps may be determined based on images acquired via a stereo camera (which may also be implemented with the same camera 120₁ being placed in a plurality of different positions) and/or by illuminating the object with a light pattern.

Accordingly, various embodiments of the present disclosure relate to a method of illuminating an artwork in an exposition area with at least one light fixture. Specifically, in various embodiments, the method comprises:

- receiving one or more datasets for each of a plurality of artworks and storing each dataset in a database, each dataset comprising:
  o data identifying a list of pigments of the respective artwork;
  o data identifying the illumination of each pigment of the list of pigments during a given time period;
  o data identifying the ageing of each pigment of the list of pigments during the given time period;

- receiving data identifying a list of pigments of the artwork to be illuminated;

- determining a maximum illumination threshold for the illumination of the artwork to be illuminated as a function of the list of pigments of the artwork to be illuminated and the datasets stored in the database;

- controlling the illumination of the artwork to be illuminated in order to ensure that the illumination of the artwork corresponds to or is smaller than the maximum illumination threshold for the illumination of the artwork.

Possible embodiments of this solution are detailed at the following point "Example 10".

EXAMPLES OF EMBODIMENTS

Example 1

Example 1.1: An illumination system comprising a light fixture 110, an optional sensor 120 and a control system 130 to illuminate an object 140 or a person 150.

Example 1.2: The illumination system of Example 1.1 providing the illumination locally to highlight certain aspects of an object 140, the highlighted aspects being less than an entirety of the object 140.

Example 1.3: The illumination system of Example 1.1 or 1.2 containing a light fixture 110 which provides pixelized light, the pixelized light produced by at least one of an LED matrix; a liquid crystal; and a Digital Micromirror Device (DMD).

Example 1.4: The illumination system of any of the Examples 1.1 - 1.3 combining its illumination with ambient illumination to create a homogeneous illumination of an object.

Example 1.5: The illumination system of any of the Examples 1.1 - 1.4 comprising a user interface 134 which allows the illumination system 100 to be controlled, such as by gestures or third-party downloadable software applications.

Example 1.6: The illumination system of any of the Examples 1.1 - 1.5 wherein the sensor 120 is configured to detect degradation of an object 140 (e.g. of its color) and adjust the light quality (intensity, spectrum) to reduce damage to the object 140.

Example 1.7: The illumination system of any of the Examples 1.1 - 1.6 comprising a sensor system which measures the light parameters of the illumination either directly or indirectly.

Example 1.8: The illumination system of any of the Examples 1.1 - 1.7 comprising sensors 120 measuring environmental parameters such as one or more of a temperature, a humidity, and a chemical compound.

Example 1.9: The illumination system of any of the Examples 1.1 - 1.8 comprising a data processing unit 113, 123 and/or 133 which can process the data using artificial intelligence and/or machine learning.

Example 1.10: The illumination system of any of the Examples 1.1 - 1.9 comprising a data storage device 112 and/or 132 which stores different lightings for different settings.

Example 1.11: The illumination system of any of the Examples 1.1 - 1.10 comprising a data storage device 112 and/or 122 to collect data relating to an object over its lifetime.

Example 2

Example 2.1: A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, the method comprising the steps of:

- obtaining data identifying requested spectral characteristics (208);

- obtaining data identifying a viewer’s eye characteristics (210);

- generating one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures (110) as a function of the data identifying requested spectral characteristics (208) and the data identifying a viewer’s eye characteristics (210).

Example 2.2: The method according to Example 2.1, wherein the characteristics of the light emitted by the one or more light fixtures (110) comprise one or more of: light intensity, frequency/color, polarization, direction and/or beam spread.

Example 2.3: The method according to Example 2.1 or Example 2.2, wherein the requested spectral characteristics comprise a requested color and a requested brightness level.

Example 2.4: The method according to Example 2.3, wherein the requested color is specified via a color temperature or a color coordinate, such as in a CIE 1931 color space.

Example 2.5: The method according to Example 2.3 or Example 2.4, wherein the requested spectral characteristics comprise a selection, assortment or sequence of a plurality of requested colors and respective requested brightness levels.

Example 2.6: The method according to Example 2.5, wherein the sequence of a plurality of requested colors and respective requested brightness levels is stored in a light scenario matrix comprising data for a plurality of viewing locations.

Example 2.7: The method according to Example 2.6, comprising: obtaining data identifying a viewer’s position in the exposition area, and

- generating the one or more control commands in order to vary the spectral characteristics of the light emitted by the one or more light fixtures also as a function of the data identifying a viewer’s position.

Example 2.8: The method according to any of Examples 2.3 to 2.7, wherein the generating one or more control commands comprises:

- generating the one or more control commands in order to vary the color of the light emitted by the one or more light fixtures as a function of the data identifying requested spectral characteristics and the data identifying a viewer’s eye characteristics.

Example 2.9: The method according to any of Examples 2.3 to 2.8, wherein the data identifying a viewer’s eye characteristics identify the sensitivity of the viewer’s eyes for a plurality of colors.

Example 2.10: The method according to Example 2.9, wherein the generating one or more control commands comprises:

- generating data identifying modified spectral characteristics by modifying the data identifying requested spectral characteristics as a function of the data identifying the sensitivity of the viewer’s eyes for a plurality of colors; and

- generating the one or more control commands in order to vary the spectral characteristics of the light emitted by the one or more light fixtures as a function of the data identifying modified spectral characteristics.

Example 2.11: The method according to Example 2.10, wherein the generating data identifying modified spectral characteristics comprises: increasing the intensity of one or more colors as a function of the data identifying the sensitivity of the viewer’s eyes for a plurality of colors.

Example 2.12: The method according to Example 2.10 or Example 2.11, wherein the generating data identifying modified spectral characteristics comprises: changing or altering one or more colors as a function of the data identifying the sensitivity of the viewer’s eyes for a plurality of colors.

Example 2.13: The method according to Example 2.12, wherein the one or more colors are shifted along the Planck-curve.

Example 2.14: The method according to Example 2.12 or Example 2.13, wherein the one or more colors are shifted as a function of data identifying one or more MacAdam ellipses.

Example 2.15: The method according to any of Examples 2.11 to 2.14, wherein the one or more colors are selected from: a first color in the red color spectrum, i.e. between 625 and 740 nm, a second color in the blue color spectrum, i.e. between 435 and 500 nm, and a third color in the green color spectrum, i.e. between 520 and 565 nm.

Example 2.16: The method according to any of Examples 2.3 to 2.15, comprising: determining spectral characteristics of a natural and/or artificial ambient light in the exposition area (160), and

- generating the one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures (110) also as a function of the determined spectral characteristics of a natural and/or artificial ambient light in the exposition area (160).

Example 2.17: The method according to Example 2.16, wherein the spectral characteristics of a natural and/or artificial ambient light in the exposition area are determined via a light sensor (120) installed in the exposition area (160).

Example 2.18: The method according to Example 2.10 and Example 2.16, wherein the generating data identifying modified spectral characteristics comprises:

- modifying the data identifying requested spectral characteristics also as a function of the determined spectral characteristics of a natural and/or artificial ambient light in the exposition area (160).

Example 2.19: The method according to any of the previous Examples 2.1 to 2.18, wherein the obtaining data identifying requested spectral characteristics comprises:

- receiving data identifying spectral characteristics requested by an artist;

- obtaining data identifying the artist’s eye characteristics and/or the artist’s preferences;

- generating the data identifying requested spectral characteristics by modifying the data identifying spectral characteristics requested by the artist as a function of the data identifying the artist’s eye characteristics and/or the artist’s preferences.

Example 2.20: The method according to any of the previous Examples 2.1 to 2.19, wherein the data identifying a viewer’s eye characteristics are stored in a database (210) or on a portable memory support, such as a memory card or smartcard (220) or a smartphone (220).

Example 2.21. The method according to any of the previous Examples 2.1 to 2.20, wherein the data identifying a viewer’s eye characteristics are stored in a viewer’s eye database (210) comprising a plurality of profiles, wherein each profile comprises a univocal viewer code identifying a respective viewer and the data identifying the respective viewer’s eye characteristics and/or viewer’s preferences, and wherein the obtaining data identifying a viewer’s eye characteristics comprises:

- receiving a viewer code, accessing the viewer’s eye database (210) and obtaining the data identifying the viewer’s eye characteristics and/or viewer’s preferences associated with the viewer code.

Example 2.22: The method according to any of the previous Examples 2.1 to 2.21, wherein the obtaining data identifying a viewer’s eye characteristics comprises:

- receiving data identifying the viewer’s age, and determining the data identifying a viewer’s eye characteristic as a function of the viewer’s age.

Example 2.23: The method according to Example 2.22, wherein the data identifying a viewer’s eye characteristics identify the sensitivity of the viewer’s eyes for a plurality of colors, and wherein the determining the data identifying a viewer’s eye characteristic as a function of the viewer’s age comprises: specifying, via the data identifying a viewer’s eye characteristic, a higher intensity for red and/or green and/or blue light with an increasing viewer’s age.

Example 2.24: The method according to any of the previous Examples 2.1 to 2.23, wherein the data identifying a viewer’s eye characteristics identify preferred illumination settings of the viewer.

Example 2.25: The method according to Example 2.24, wherein the preferred illumination settings of the viewer are varied in real-time in order to vary the spectral characteristics of the light emitted by the one or more light fixtures.

Example 2.26: The method according to any of the previous Examples 2.1 to 2.25, wherein the data identifying a viewer’s eye characteristics (210) represent default eye characteristics (210), and wherein the method comprises: acquiring via a camera (230) an image (240) of the artwork illuminated with the light emitted by the one or more light fixtures (110); obtaining data identifying a remote viewer’s eye characteristics (210);

- generating a modified image by modifying the spectral characteristics of the acquired image as a function of the data identifying a remote viewer’s eye characteristics (210), and displaying the modified image on a remote display device (250).

Example 2.27: The method according to Example 2.26, wherein the image is modified by the remote display device (250) or before transmission to the remote display device (250).

Example 2.28: The method according to Example 2.26 or Example 2.27, wherein the generating a modified image comprises:

- receiving data identifying the sensitivity of the camera (230) for a plurality of colors and/or data identifying a location and orientation of the camera, and

- modifying the spectral characteristics of the acquired image as a function of the data identifying the sensitivity of the camera for a plurality of colors and/or the data identifying a location and orientation of the camera.

Example 2.29: The method according to any of Examples 2.26 to 2.28, wherein the generating a modified image comprises:

- receiving data (212) identifying the spectral characteristics of the remote display device (250), and

- modifying the spectral characteristics of the acquired image as a function of the data (212) identifying the spectral characteristics of the remote display device (250).

Example 2.30: The method according to Example 2.29, wherein the receiving data (212) identifying the spectral characteristics of the remote display device (250) comprises:

- receiving data identifying a display device type, accessing a display database (212) comprising data identifying the spectral characteristics for a plurality of display device types, in order to obtain the data identifying the spectral characteristics associated with the received display device type.

Example 2.31: The method according to Example 2.30, wherein the display device type is selected from the group of: a projector model, a display type, such as an LCD display, an AMOLED display, a display model, a virtual reality or augmented reality glass model.

Example 2.32: The method according to any of Examples 2.26 to 2.31, wherein the display device is in a viewer’s location, and wherein the generating a modified image comprises: determining spectral characteristics of a natural and/or artificial ambient light in the viewer’s location, and

- modifying the spectral characteristics of the acquired image as a function of the determined spectral characteristics of a natural and/or artificial ambient light in the viewer’s location.
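The image modification of Example 2.32 can be illustrated with a simple channel-wise compensation; the sketch below assumes an RGB working space and linear pixel values, which are simplifications of the spectral processing described above.

```python
import numpy as np

def compensate_ambient(image: np.ndarray, ambient_rgb: np.ndarray) -> np.ndarray:
    """Rescale an HxWx3 image in [0, 1] to compensate for ambient light.

    ambient_rgb is a length-3 estimate of the ambient light in the viewer's
    location; the normalization and clipping are illustrative choices.
    """
    gains = 1.0 / np.clip(ambient_rgb, 1e-3, None)  # avoid division by zero
    gains = gains / gains.max()                     # keep values within range
    return np.clip(image * gains, 0.0, 1.0)

img = np.random.rand(4, 4, 3)                       # stand-in for image (240)
out = compensate_ambient(img, np.array([1.0, 0.9, 0.7]))  # warm ambient light
```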

Example 2.33: The method according to Example 2.32, wherein the remote display device (250) is integrated in a computer device comprising a camera, and wherein the spectral characteristics of a natural and/or artificial ambient light in the viewer’s location are determined via the camera of the computer device.

Example 2.34: A control system (130) for a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics in order to illuminate an artwork (140) in an exposition area (160), wherein the control system (130) is configured to send control commands to the one or more light fixtures (110) in order to vary the characteristics of the light emitted by the one or more light fixtures (110), and wherein the control system (130) is configured to implement the method according to any of the previous Examples 2.1 to 2.33.

Example 2.35: The control system of Example 2.34, comprising a database (200) and/or a communication interface (131) in order to receive at least one of: data (202) identifying the characteristics of the one or more light fixtures (110); data (204) identifying the characteristics of the exposition area; data (206) identifying at least one artwork; data (210) identifying artist’s eye characteristics, such as an artist code; data (208) identifying requested spectral characteristics; data (210) identifying viewer’s eye characteristics, such as a viewer code, or a viewer’s age.

Example 2.36: The control system of Example 2.35, wherein the communication interface (131) comprises at least one of: a communication interface for connection to a local area network, a communication interface for connection to a wide area network, such as the Internet, a communication interface for short range wireless communication, such as a Bluetooth® communication interface.

Example 2.37: The control system of any of Examples 2.34 to 2.36, comprising a memory (200) having stored data (208) identifying requested spectral characteristics for a plurality of artworks.

Example 2.38: A lighting system (100) comprising: one or more light fixtures (110) configured to emit light with variable characteristics in order to illuminate an artwork (140) in an exposition area (160), and a control system (130) according to any of Examples 2.34 to 2.37.

Example 2.39: The lighting system of Example 2.38, comprising: at least one light sensor (120) configured to be installed in the exposition area (160).

Example 2.40: The lighting system of Example 2.38 or Example 2.39, comprising: at least one mobile device (250) comprising a display.

Example 2.41: The lighting system of any of Examples 2.38 to 2.40, wherein the mobile device (250) is a smartphone or a tablet.

Example 2.42: The lighting system of any of Examples 2.38 to 2.41, comprising: a plurality of portable memory supports, such as memory cards or smartcards (220), or smartphones (222), each portable memory support having stored data identifying at least one viewer’s eye characteristics.

Example 2.43: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 2.1 to 2.33.

Example 2.44: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 2.1 to 2.33.

Example 3

Example 3.1: A method of selecting at least one light fixture (110) comprising: obtaining data (206) identifying characteristics of an artwork (140), obtaining data (204) identifying characteristics of an exposition area (160), and determining a set of light fixtures (110) and/or operating settings for a set of light fixtures (110) as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160).

Example 3.2: The method according to Example 3.1, wherein at least one light fixture (110) of said set of light fixtures supports a plurality of operating settings having different characteristics, and wherein the method comprises: selecting one of said plurality of operating settings of said at least one light fixture (110) as a function of said data identifying characteristics of said artwork (140) and said data identifying characteristics of said exposition area.

Example 3.3: The method according to Example 3.2, comprising generating control information for said light fixtures (110) of said determined set of light fixtures as a function of said selected operating setting.

Example 3.4: The method according to any of the previous Examples 3.1 to 3.3, wherein said determining a set of light fixtures comprises: accessing a database (202) of light fixtures (110), said database (202) of light fixtures (110) comprising light fixtures installed in said exposition area (160) and/or installable in said exposition area (160); and selecting amongst said installed and/or installable light fixtures (110) a set of light fixtures as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160).

Example 3.5: The method according to Example 3.4, wherein said database (202) of light fixtures (110) comprises: a first spotlight with a first light intensity level, and a second spotlight with a second light intensity level, said second light intensity level being greater than said first light intensity level.

Example 3.6: The method according to Example 3.4 or Example 3.5, wherein said database (202) of light fixtures (110) comprises: a light fixture with a fixed beam angle, and a light fixture with a variable beam angle.

Example 3.7: The method according to any of the previous Examples 3.4 to 3.6, wherein said database (202) of light fixtures (110) comprises: a light fixture with a framer, and/or a light fixture with a gobo.

Example 3.8: The method according to any of the previous Examples 3.4 to 3.7, wherein said database (202) of light fixtures (110) comprises: a light fixture with variable spectral characteristics.

Example 3.9: The method according to any of the previous Examples 3.4 to 3.8, wherein said database (202) of light fixtures (110) comprises data identifying characteristics of the respective light fixture (110), said characteristics of said light fixtures comprising one or more of the following data:

- brightness data, such as data identifying a minimum brightness level and a maximum brightness level adapted to be emitted by the respective light fixture;

- spectral data identifying light colors adapted to be emitted by the respective light fixture;

- optics data identifying a light transfer function of one or more optical elements of the respective light fixture, such as a reflector, diffusor, lens, shutters and/or framers;

- data identifying at least one of a spectral distribution, a color location, a Color Rendering Index, a beam direction, a beam spread angle, and a light polarization of the light emitted by the light fixture (110).

Example 3.10: The method according to Example 3.9, wherein said spectral data are a color temperature or a color coordinate or a color rendering index.

Example 3.11: The method according to any of the previous Examples 3.4 to 3.10, wherein at least one light fixture (110) supports a plurality of configuration conditions having different characteristics, and said data identifying the characteristics of said light fixtures (110) comprise data for said plurality of configuration conditions.

Example 3.12: The method according to any of the previous Examples 3.1 to 3.11, wherein said exposition area (160) is a room, and wherein said data identifying characteristics of said exposition area comprise a room height, the distance between the floor and a truss used to mount light fixtures and/or a configuration of said room.

Example 3.13: The method according to any of the previous Examples 3.1 to 3.12, wherein said data identifying characteristics of said exposition area (160) comprise data identifying characteristics of natural and/or artificial light in said exposition area (160), such as:

- brightness data, such as data identifying a minimum brightness level and a maximum brightness level;

- data identifying at least one of a spectral distribution, a color location, a Color Rendering Index, a beam direction, a beam spread angle, and a light polarization of the light in said exposition area (160).

Example 3.14: The method according to Example 3.13, wherein said determining a set of light fixtures comprises: determining said set of light fixtures also as a function of said data identifying characteristics of a brightness level of natural and/or artificial light in said exposition area (160).

Example 3.15: The method according to any of the previous Examples 3.1 to 3.14, wherein said data (206) identifying characteristics of said artwork (140) indicate the type of said artwork, such as a drawing, a print-out, a photograph, a textile, an Old Master painting, a modern art painting, a statue or a 3D-object.

Example 3.16: The method according to any of the previous Examples 3.1 to 3.15, wherein said data (206) identifying characteristics of said artwork (140) indicate the type of said artwork (140), said type of said artwork (140) being selected as a function of at least one of: a canvas material, color pigments, and a frame material.

Example 3.17: The method according to any of the previous Examples 3.13 to 3.16, wherein said determining a set of light fixtures comprises: determining said set of light fixtures as a function of said type of said artwork and a respective room height, distance between the floor and a truss used to mount light fixtures and/or a room configuration.

Example 3.18: The method according to any of the previous Examples 3.13 to 3.15, wherein said determining a set of light fixtures comprises: accessing a data set, such as a table, said data set comprising a plurality of light fixture sets associated with a respective type of said artwork and a respective room height, and selecting one or more of said light fixture sets as a function of said type of said artwork and said room height.
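A minimal sketch of the data set described in Example 3.18 is given below, assuming fixture sets keyed by artwork type and a room-height band; the names, the 4 m threshold and the set contents are illustrative assumptions.

```python
# Hypothetical table of light fixture sets keyed by (artwork type, height band).
FIXTURE_SETS = {
    ("painting", "low"):  ["spot-low-intensity", "wallwasher"],
    ("painting", "high"): ["spot-high-intensity", "wallwasher"],
    ("statue",   "low"):  ["spot-low-intensity", "framer"],
    ("statue",   "high"): ["spot-high-intensity", "framer"],
}

def select_fixture_set(artwork_type: str, room_height_m: float) -> list:
    band = "high" if room_height_m > 4.0 else "low"  # assumed 4 m threshold
    return FIXTURE_SETS[(artwork_type, band)]

print(select_fixture_set("painting", 5.5))  # ['spot-high-intensity', 'wallwasher']
```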

Example 3.19: The method according to any of the previous Examples 3.1 to 3.18, wherein said obtaining data (206) identifying characteristics of said artwork (140) comprises: showing on a screen a request to insert data (206) identifying characteristics of an artwork (140), and

- receiving via a user interface said data (206) identifying characteristics of said artwork (140).

Example 3.20: The method according to any of the previous Examples 3.1 to 3.19, wherein said obtaining data identifying characteristics of said artwork comprises: accessing a database of artworks (206), said database of artworks (206) having stored characteristics of a plurality of artworks (140); selecting at least one of said plurality of artworks and obtaining the respective characteristics stored in said database of artworks.

Example 3.21: The method according to Example 3.20, wherein said database of artworks (206, 216) has stored for each of said plurality of artworks (140) a respective digital representation (216), and wherein said selecting one of said plurality of artworks comprises: obtaining a digital representation (240) of said artwork (140); selecting one of said plurality of artworks (140) by comparing said obtained digital representation (240) with the digital representations stored in said database of artworks (206, 216).

Example 3.22: The method according to Example 3.20 or Example 3.21, wherein said database of artworks (206) has stored for each of said plurality of artworks (140) a respective univocal artwork code, and wherein said selecting one of said plurality of artworks comprises: obtaining a univocal artwork code; selecting one of said plurality of artworks (140) by comparing said obtained univocal artwork code with the univocal artwork codes stored in said database of artworks (206).

Example 3.23: The method according to Example 3.22, wherein said obtaining a univocal artwork code comprises at least one of:

- scanning an alphanumeric string, a barcode, a bi-dimensional barcode, such as a QR code, or a magnetic support;

- communicating with a short-range wireless transmitter, such as an RFID or NFC transponder or a Bluetooth® transceiver; and

- retrieving the univocal code from a distributed blockchain ledger.

Example 3.24: The method according to any of the previous Examples 3.1 to 3.23, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises: showing on a screen a request to insert data identifying characteristics of an exposition area, and

- receiving via a user interface said data identifying characteristics of said exposition area.

Example 3.25: The method according to any of the previous Examples 3.1 to 3.24, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises: accessing a database of exposition areas (204), said database of exposition areas (204) having stored characteristics of a plurality of exposition areas (160), such as rooms of a museum; selecting one of said plurality of exposition areas (160) and obtaining the respective characteristics stored in said database of exposition areas (204).

Example 3.26: The method according to any of the previous Examples 3.1 to 3.25, wherein said characteristics of said artwork (140) comprise one or more of the following data:

- descriptive data, such as the name of the artwork, the name of the artist, the period or creation year of the artwork, and the type of the artwork,

- dimensional data of said artwork,

- global or local color data for said artwork, such as color analysis data, spectral data, reflectance and/or image pixel data,

- damage data, such as a local or global damage matrix,

- reflectivity data, such as local or global reflectivity data, such as a reflectivity matrix.

Example 3.27: The method according to Example 3.26, wherein said obtaining data identifying characteristics of said artwork comprises:

- taking a digital image (240) of said artwork (140), and

- extracting one or more of said data identifying characteristics of said artwork (140) from said digital image (240) of said artwork (140).

Example 3.28: The method according to any of the previous Examples 3.1 to 3.27, wherein said characteristics of said exposition area (160) comprise one or more of the following data:

- dimensional data of said exposition area, such as a room height, room width and room length,

- the color of the floor and/or ceiling and/or one or more walls of said exposition area,

- brightness data, such as a brightness level of natural and/or artificial light and/or brightness profile of natural and/or artificial light and/or color temperature of natural and/or artificial light during at least one 24-hour day,

- position data of said exposition area, such as GPS position data, which may be used to estimate brightness data as a function of a local time,

- temperature and/or humidity of the exposition area.

Example 3.29: The method according to Example 3.28, wherein said obtaining data (204) identifying characteristics of said exposition area (160) comprises:

- taking a digital image of said exposition area, and extracting one or more of said data identifying characteristics of said exposition area from said digital image of said exposition area.

Example 3.30: The method according to Example 3.28 or Example 3.29, wherein said brightness data are determined via at least one light sensor installed in said exposition area.

Example 3.31: The method according to any of the previous Examples 3.1 to 3.30, wherein said obtaining data (206) identifying characteristics of an artwork comprises obtaining a graphic representation of said artwork (140), a similar artwork or a generic artwork, and wherein the method includes:

- rendering an image of said artwork as a function of said graphic representation and data identifying characteristics of the light fixtures (110) of said determined set of light fixtures, and displaying said rendered image on a display device.

Example 3.32: The method according to Example 3.31, wherein said rendering an image of said artwork (140) comprises rendering said image of said artwork (140) as a function of said data (204) identifying characteristics of said exposition area (160).

Example 3.33: The method according to Example 3.31 or Example 3.32, wherein said rendering an image of said artwork comprises rendering said image of said artwork (140) as a function of data identifying characteristics of a default exposition area.

Example 3.34: The method according to any of the previous Examples 3.31 to 3.33, wherein said rendering said image of said artwork (140) comprises:

- rendering said image of said artwork (140) as a function of said characteristics of said at least one light fixture (110) for said selected operating condition.

Example 3.35: The method according to any of Examples 3.31 to 3.34, comprising:

- receiving via a user interface a different set of light fixtures (110) and/or a different operating setting, and

- rendering a new image of said artwork (140).

Example 3.36: The method according to any of Examples 3.31 to 3.35, comprising: obtaining data identifying characteristics of said display device.

Example 3.37: The method according to Example 3.36, wherein said characteristics of said display device comprise one or more of the following data:

- the type of said display device, and

- data identifying an optical transfer function of said display device.

Example 3.38: The method according to Example 3.36 or Example 3.37, wherein said rendering said image of said artwork (140) and/or said displaying said rendered image on said display device comprises:

- rendering said image of said artwork and/or modifying said rendered image as a function of said characteristics of said display device.

Example 3.39: The method according to any of the previous Examples 3.1 to 3.38, wherein said determining a set of light fixtures (110) comprises: determining a plurality of sets of light fixtures and/or operating settings as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160), displaying data identifying said plurality of sets on a display device, and

- receiving via a user interface data identifying one of said plurality of sets of light fixtures (110).

Example 3.40: The method according to any of the previous Examples 3.1 to 3.39, wherein said determining said set of light fixtures (110) and/or said operating settings comprises: acquiring a training database of a plurality of reference illumination conditions comprising data identifying characteristics of a respective artwork, data identifying characteristics of a respective exposition area, a respective selected set of light fixtures and/or respective operating settings,

- training a machine learning algorithm, such as an artificial neural network, by using said training database, and determining said set of light fixtures and/or said operating settings via said machine learning algorithm as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160).

Example 3.41: A device comprising a display device, a user interface and at least one processing unit configured to implement the method according to any of the previous Examples 3.1 to 3.40.

Example 3.42: A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as the Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 3.1 to 3.40.
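One way to realize the machine-learning selection of Example 3.40 is sketched below with a small scikit-learn classifier; the feature encoding (artwork-type id, room height) and the label space (fixture-set ids) are illustrative assumptions, and a real training database would be far larger.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier  # one possible ML algorithm

# Illustrative training database: [artwork-type id, room height in m]
# mapped to an id into a catalogue of fixture sets.
X_train = np.array([[0, 3.0], [0, 6.0], [1, 3.0], [1, 6.0]], dtype=float)
y_train = np.array([0, 1, 2, 3])

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print(model.predict([[0.0, 5.0]]))  # proposed fixture-set id for a new case
```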

Example 3.43: The device according to Example 3.41 or the system according to Example 3.42, wherein said display device and said user interface are implemented with a touchscreen.

Example 3.44: The device according to Example 3.41 or the system according to Example 3.42, wherein said device is a smartphone, a tablet or a personal computer.

Example 3.45: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 3.1 to 3.40.

Example 3.46: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 3.1 to 3.40.

Example 4

Example 4.1: A method of selecting at least one light sensor (120) for a lighting system (100) used to illuminate at least one artwork (140) in an exposition area (160) via one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, the method comprising the steps of: obtaining a digital model of said exposition area (160), said digital model including:

o exposition area data (204) comprising data identifying the dimension of said exposition area (160);

o artwork data (204, 206, 208) comprising data identifying the position of said at least one artwork (140) within said exposition area (160);

o light fixture data (202, 204) comprising data identifying the position, orientation and illumination characteristics of said one or more light fixtures (110); and

o background illumination data (204) comprising data identifying the position and illumination characteristics of other natural and/or artificial light sources emitting light within said exposition area (160);

executing a plurality of illumination simulations (402, 404, 406) of said digital model of said exposition area by varying the illumination characteristics of said one or more light fixtures (110) and/or the illumination characteristics of said other natural and/or artificial light sources (164, 165), and determining for each illumination simulation data identifying a respective expected illumination of each of said at least one artwork (140); and determining (408, 410) a set of light sensors (120) for monitoring the illumination of said at least one artwork (140) as a function of said data identifying the expected illumination of said at least one artwork (140).
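The simulation sweep of Example 4.1 can be pictured as the skeleton below; simulate() is a stand-in for a real illumination simulation of the digital model, and the weights and levels are illustrative assumptions.

```python
import itertools

def simulate(fixture_level: float, background_level: float) -> dict:
    """Toy stand-in for an illumination simulation (402, 404, 406)."""
    weights = {"artwork-1": (0.8, 0.3), "artwork-2": (0.5, 0.9)}
    return {a: wf * fixture_level + wb * background_level
            for a, (wf, wb) in weights.items()}

results = {"artwork-1": [], "artwork-2": []}
for f, b in itertools.product([0.0, 1.0], [0.1, 1.0]):  # min/max sweeps
    for artwork, lux in simulate(f, b).items():
        results[artwork].append(lux)

# per-artwork extremes of the expected illumination, used for sensor selection
extremes = {a: (min(v), max(v)) for a, v in results.items()}
print(extremes)
```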

Example 4.2: The method according to Example 4.1, wherein said digital model is a 2D model.

Example 4.3: The method according to Example 4.2, wherein said data identifying the dimension of said exposition area (160) comprise data identifying one or more widths and one or more lengths of said exposition area (160).

Example 4.4: The method according to Example 4.1, wherein said digital model is a 3D model.

Example 4.5: The method according to Example 4.4, wherein said data identifying the dimension of said exposition area (160) comprise data identifying one or more widths, one or more lengths and one or more heights of said exposition area (160).

Example 4.6: The method according to any of the previous Examples 4.1 to 4.5, wherein said exposition area data (204) further comprise data identifying at least one of: the reflectivity of one or more surfaces of said exposition area (160) and the position and dimension of obstacles within the exposition area (160), such as 3D objects and/or visitors at an expected/estimated position when observing a given artwork (140).

Example 4.7: The method according to any of the previous Examples 4.1 to 4.6, wherein said exposition area data (204) are stored in an exposition area database.

Example 4.8: The method according to any of the previous Examples 4.1 to 4.7, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said exposition area data (204) as a function of said images of said exposition area (160).

Example 4.9: The method according to any of the previous Examples 4.1 to 4.8, wherein said artwork data (204, 206, 208) further comprise data identifying at least one of: the dimension of said artwork (140), the reflectivity of said artwork (140) and a graphical representation of said artwork (140).

Example 4.10: The method according to any of the previous Examples 4.1 to 4.9, wherein said artwork data (204, 206, 208) further comprise data identifying at least one of: a requested target illumination of said artwork (140) and a maximum illumination for said artwork (140).

Example 4.11: The method according to Example 4.10, wherein said artwork data (204, 206, 208) further comprise data identifying a type of said artwork, and wherein the method comprises determining said requested target illumination of said artwork (140) and/or said maximum illumination for said artwork (140) as a function of said type of said artwork.

Example 4.12: The method according to Example 4.10 or Example 4.11, wherein said requested target illumination and/or said maximum illumination comprise a plurality of values for different colors.

Example 4.13: The method according to any of the previous Examples 4.1 to 4.12, wherein said artwork data (204, 206, 208) are stored in an artwork database.

Example 4.14: The method according to any of the previous Examples 4.1 to 4.13, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said artwork data (204, 206, 208) as a function of said images of said exposition area (160).

Example 4.15: The method according to any of the previous Examples 4.1 to 4.14, wherein said light fixture data (202, 204) further comprise data identifying a beam spread or range of beam spreads for the light adapted to be emitted by said one or more light fixtures (110).

Example 4.16: The method according to any of the previous Examples 4.1 to 4.15, wherein said illumination characteristics of said one or more light fixtures (110) identify a light intensity and/or a color, or a range of light intensities and/or colors adapted to be emitted by said one or more light fixtures (110).

Example 4.17: The method according to any of the previous Examples 4.1 to 4.16, wherein said light fixture data (202, 204) are stored in a light fixture database.

Example 4.18: The method according to any of the previous Examples 4.1 to 4.17, wherein said background illumination data (204) comprise data identifying the position and dimensions of apertures in said exposition area (160), such as windows (164) and/or doors (165).

Example 4.19: The method according to any of the previous Examples 4.1 to 4.18, wherein said illumination characteristics of said other natural and/or artificial light sources comprise at least one of: a light intensity and/or a color, or a range of light intensities and/or colors adapted to be emitted by said other natural and/or artificial light sources; and the direction or range of directions of the light emitted by said other natural and/or artificial light sources.

Example 4.20: The method according to any of the previous Examples 4.1 to 4.19, wherein said background illumination data (204) are stored in an exposition area database.

Example 4.21: The method according to any of the previous Examples 4.1 to 4.20, comprising: acquiring via a camera a plurality of images of said exposition area (160), and determining said background illumination data (204) as a function of said images of said exposition area (160).

Example 4.22: The method according to any of the previous Examples 4.1 to 4.21, wherein said executing a plurality of illumination simulations of said digital model of said exposition area by varying the illumination characteristics of said one or more light fixtures (110) and/or the illumination characteristics of said other natural and/or artificial light sources comprises at least one of: executing an illumination simulation of said digital model of said exposition area when said one or more light fixtures (110) are switched off; executing an illumination simulation of said digital model of said exposition area by varying the light intensity and/or color of said one or more light fixtures (110); and executing an illumination simulation of said digital model of said exposition area by varying the light intensity and/or color of said other natural and/or artificial light sources (164, 165).

Example 4.23: The method according to Example 4.22, wherein said executing a plurality of illumination simulations of said digital model of said exposition area by varying the illumination characteristics of said one or more light fixtures (110) and/or the illumination characteristics of said other natural and/or artificial light sources comprises at least one of: executing illumination simulations of said digital model of said exposition area for a minimum and a maximum value of the light intensity of said one or more light fixtures (110); and executing an illumination simulation of said digital model of said exposition area for a minimum and a maximum value of the light intensity of said other natural and/or artificial light sources (164, 165).

Example 4.24: The method according to any of the previous Examples 4.1 to 4.23, wherein said determining a set of light sensors (120) comprises for each artwork (140): analyzing the respective data identifying the expected illumination of said artwork (140) in order to determine a minimum and/or maximum light intensity value, and/or minimum and/or maximum light intensity values for a plurality of colors; and determining data identifying whether a light sensor (120) should be used or not used to monitor the artwork (140) as a function of said minimum and/or maximum light intensity value, and/or said minimum and/or maximum light intensity values for said plurality of colors.

Example 4.25: The method according to Example 4.24, wherein said determining a set of light sensors (120) comprises the following steps for each artwork (140): calculating the difference between said maximum and minimum light intensity value, and/or the differences between said maximum and/or minimum light intensity values for said plurality of colors, said difference or differences being indicative of the variability of the illumination of the respective artwork due to background illumination; comparing said difference or differences with at least one threshold; storing data identifying that a light sensor (120) should be used to monitor the artwork (140) when said difference or differences are greater than said at least one threshold; and optionally storing data identifying that a light sensor (120) should not be used to monitor the artwork (140) when said difference or differences are smaller than said at least one threshold.
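A minimal sketch of the decision rule of Example 4.25, assuming a single variability threshold; the 50 lx value is an illustrative assumption, not a value from this application.

```python
VARIABILITY_THRESHOLD_LX = 50.0  # assumed threshold

def needs_sensor(min_lux: float, max_lux: float,
                 threshold: float = VARIABILITY_THRESHOLD_LX) -> bool:
    """True when background light makes the illumination too variable."""
    return (max_lux - min_lux) > threshold

print(needs_sensor(20.0, 180.0))  # True: illumination varies strongly
print(needs_sensor(95.0, 105.0))  # False: illumination is stable
```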

Example 4.26: The method according to Example 4.24 or Example 4.25, wherein said determining a set of light sensors (120) comprises the following steps for each artwork (140): comparing said maximum light intensity value and/or said maximum light intensity values for said plurality of colors with at least one maximum threshold; storing data identifying that a light sensor (120) should be used to monitor the artwork (140) when said maximum light intensity value and/or said maximum light intensity values for said plurality of colors are greater than said at least one maximum threshold; and optionally storing data identifying that a light sensor (120) should not be used to monitor the artwork (140) when said maximum light intensity value and/or said maximum light intensity values for said plurality of colors are smaller than said at least one maximum threshold.

Example 4.27: The method according to any of the previous Examples 4.1 to 4.26, comprising determining (410) the type of each light sensor (120) in said set of light sensors (120).

Example 4.28: The method according to Example 4.27, wherein each light sensor (120) is selected from a list comprising at least two of the following light sensors (120): a light sensor positioned in proximity of the artwork (140) to be monitored, thereby measuring the light received at the artwork (140); a light sensor positioned in proximity of the light fixture (110) used to illuminate the artwork (140) to be monitored, thereby measuring the light emitted by the light fixture (110), which makes it possible to calculate the light received at the artwork (140) to be monitored as a function of geometrical data specifying the position of the artwork (140) with respect to the light fixture (110); a light sensor, such as a camera, configured to measure the characteristics of the light reflected by the artwork (140) to be monitored; and a light sensor, such as a camera, configured to measure the characteristics of the light reflected by a reference surface positioned in proximity of the artwork (140) to be monitored.

Example 4.29: The method according to Example 4.27 or Example 4.28, wherein said determining (410) the type of each light sensor (120) comprises acquiring data identifying the position of already installed light sensors (120) in said exposition area (160).

Example 4.30: The method according to Example 4.27 or Example 4.28, comprising determining (410) the type of each light sensor (120) as a function of the type and/or characteristics of the respective artwork (140), and/or the characteristics of the exposition area (160).

Example 4.31: The method according to any of the previous Examples 4.1 to 4.30, comprising:

- modifying said digital model of said exposition area (160) in order to include:

o light sensor data (204, 218) comprising data identifying the position and characteristics of at least one light sensor (120) in said set of light sensors (120); and

- executing at least one illumination simulation (412) of said digital model of said exposition area, and determining for each illumination simulation data identifying a respective expected illumination of each of said at least one artwork (140) and a respective expected illumination of said at least one light sensor (120) in said set of light sensors (120).

Example 4.32: The method according to Example 4.31, comprising: determining an expected measurement value provided by said at least one light sensor (120) in said set of light sensors (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120).

Example 4.33: The method according to Example 4.31 or Example 4.32, wherein said artwork data (204, 206, 208) further comprise data identifying a requested target illumination of said artwork (140) and/or a maximum illumination for said artwork (140), and wherein the method comprises: determining a target measurement value for said measurement value provided by said at least one light sensor (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120), the requested target illumination of said artwork (140) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120); and/or determining a maximum measurement value for said measurement value provided by said at least one light sensor (120) as a function of said expected illumination of said at least one light sensor (120) in said set of light sensors (120), the maximum illumination of said artwork (140) and the respective characteristics of said at least one light sensor (120) in said set of light sensors (120).
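Under a linearity assumption (a sensor reading proportional to the illumination it receives), the target measurement value of Example 4.33 can be sketched as a simple rescaling; the numbers are illustrative.

```python
def target_reading(expected_reading: float,
                   expected_artwork_lux: float,
                   target_artwork_lux: float) -> float:
    """Scale the simulated sensor reading by the requested/simulated ratio."""
    return expected_reading * (target_artwork_lux / expected_artwork_lux)

# simulation: the sensor reads 420 counts while the artwork receives 150 lx;
# the requested target illumination for the artwork is 50 lx
print(target_reading(420.0, 150.0, 50.0))  # 140.0 counts
```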

Example 4.34: The method according to Example 4.33, comprising:

- transmitting said target measurement value and/or said maximum measurement value for said at least one light sensor (120) in said set of light sensors (120) to a control system (130) of said lighting system (100), said control system (130) being configured to receive said measurement values from said light sensors (120) in said set of light sensors (120) and verify said measurement values and/or control the light fixtures (110) of said lighting system (100).

Example 4.35: The method according to any of the previous Examples 4.1 to 4.34, comprising: obtaining data (206) identifying characteristics of an artwork (140), obtaining data (204) identifying characteristics of an exposition area (160), and determining a set of light fixtures (110) and/or operating settings for a set of light fixtures (110) as a function of said data (206) identifying characteristics of said artwork (140) and said data (204) identifying characteristics of said exposition area (160), wherein said light fixture data (202, 204) comprise data identifying the position, orientation and illumination characteristics of the light fixtures (110) in said set of light fixtures (110). Example 4.35 may also comprise the features of any of the previous Examples 3.2 to 3.40.

Example 4.36: A device comprising a display device, a user interface and at least one processing unit configured to implement the method according to any of the previous Examples 4.1 to 4.35.

Example 4.37: A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as the Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 4.1 to 4.35.

Example 4.38: The device according to Example 4.36 or the system according to Example 4.37, wherein said display device and said user interface are implemented with a touchscreen.

Example 4.39: The device according to Example 4.36 or the system according to Example 4.37, wherein said device is a smartphone, a tablet or a personal computer.

Example 4.40: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 4.1 to 4.35.

Example 4.41: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 4.1 to 4.35.

Example 5

Example 5.1: A lighting system (100) configured to monitor the irradiation of an object (140) with light generated by a light fixture (110), comprising:

- the light fixture (110) comprising one or more light sources (117), which together are configured to emit light (500) with a spatial radiation characteristic,

- a data processing unit (113, 123, 133) connected to the light fixture (110) and configured to obtain information on an intensity of the light (500) emitted by the light sources (117),

- a first memory (202, 204) connected to the data processing unit (113, 123, 133), in which information about the spatial positioning of the light fixture (110) with respect to a surface (142) of the object (140) is stored, and

- a second memory (202) connected to the data processing unit (113, 123, 133), in which information about the spatial radiation characteristic of the one or more light sources (117) or the light fixture (110) is stored,

wherein the data processing unit (113, 123, 133) is configured to calculate and output, for a plurality of positions on the surface (142) of the object (140), a local intensity of the light incident at the respective position as a function of the information on the light intensity, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture (110).
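For a point-like source, the calculation of Example 5.1 reduces to an inverse-square law weighted by the cosine of the angle of incidence; the sketch below uses this idealized model, whereas the system described above would use the stored spatial radiation characteristic.

```python
import math

def local_intensity(source_intensity_cd: float,
                    distance_m: float,
                    incidence_angle_rad: float) -> float:
    """Idealized local illuminance from a point source (assumption)."""
    return source_intensity_cd * math.cos(incidence_angle_rad) / distance_m ** 2

# 1000 cd source, 2.5 m away, tilted 30 degrees to the surface normal
print(local_intensity(1000.0, 2.5, math.radians(30)))  # about 138.6 lx
```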

Example 5.2: The lighting system according to Example 5.1, further comprising a light sensor (120₁) arranged within or adjacent to the light fixture (110) and configured to measure the intensity of the light emitted from the one or more light sources (117) in the light fixture (110), wherein the data processing unit (113, 123, 133) is connected to the light sensor (120₁) to receive the information on the measured light intensity.

Example 5.3: The lighting system according to Example 5.1, further comprising a time measuring device (504) configured to determine and output an operating time of the light sources (117), i.e. the time for which the light sources (117) have been operated since they were put into operation for irradiating the object (140), a current and/or voltage measuring device (116k) configured to measure a current and/or voltage with which the one or more light sources (117) are operated, and a third memory (202), in which a function or table is stored with which values of a light intensity are respectively assigned to a combination of a current value and/or a voltage value and an operating time of the one or more light sources (117), wherein the data processing unit (113, 123, 133) is connected to the time measuring device (504), the current and/or voltage measuring device (116k) and the third memory (202), and is configured to receive the measured values for the current and/or voltage and the operating time respectively, and to calculate the information on the measured light intensity on the basis of the function or the table.

Example 5.4: The lighting system according to any of the previous Examples 5.1 to 5.3, wherein the data processing unit (113, 123, 133) is configured to obtain sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).

Example 5.5: The lighting system according to Example 5.4, wherein the data processing unit (113, 123, 133) is configured to compare the calculated local intensity for at least one of the plurality of positions with a limit value in the sensitivity information for that position and to output a signal determined as a function of the comparison.

Example 5.6: The lighting system according to Example 5.5, wherein the signal is a control signal transmitted to a data processing unit (113) configured to receive the signal, and to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110) as a function of the control signal.

Example 5.7: The lighting system according to any of the previous Examples 5.4 to 5.6, wherein the information obtained by the data processing unit (113, 123, 133) about an intensity of the light (500) emitted by the one or more light sources (117) includes respective information about light intensity for a plurality of different predetermined wavelength ranges, and wherein the sensitivity information obtained by the data processing unit (113, 123, 133) for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 5.8: The lighting system according to Example 5.7, wherein the data processing unit (113, 123, 133) is configured to calculate a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and to compare the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 5.9: The lighting system according to any one of the previous Examples 5.4 to 5.8, comprising a camera (508) configured to scan the surface (142) of the object (140) in order to obtain color and/or brightness values for positions on the surface (142), and wherein the data processing unit (113, 123, 133) is configured to receive the position-dependent color and/or brightness values from the camera (508) and to calculate a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.
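The fixed association of Example 5.9 can be pictured as a simple brightness-to-limit mapping; the bands and lux limits below are assumptions loosely inspired by the idea that darker, strongly absorbing surfaces are often more light-sensitive, not values from this application.

```python
def limit_for_brightness(brightness: float) -> float:
    """Map a camera brightness value in [0, 1] to an assumed limit in lux."""
    if brightness < 0.3:   # dark, strongly absorbing areas
        return 50.0
    if brightness < 0.7:
        return 150.0
    return 300.0           # bright, more robust areas

print([limit_for_brightness(b) for b in (0.1, 0.5, 0.9)])  # [50.0, 150.0, 300.0]
```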

Example 5.10: The lighting system according to any of the previous Examples 5.4 to 5.8, comprising a camera (508) or a reader device, such as a near-field communication device, configured to read an identifier (510) attached to the object (140), and wherein the data processing unit (113, 123, 133) is configured to receive the identifier and obtain the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.

Example 5.11: The lighting system according to Example 5.9 or Example 5.10, wherein the light fixture (110), a control system (130) operatively connected to the light fixture, or a mobile unit operatively connected to the light fixture (110) comprises the camera (508).

Example 5.12: The lighting system according to Example 5.11, wherein the mobile unit is wirelessly connected to the light fixture (110).

Example 5.13: The lighting system according to Example 5.12, wherein the mobile unit is a smartphone or a tablet.

Example 5.14: The lighting system according to Example 5.13, wherein the smartphone comprises the data processing unit (133), and wherein the first, second and/or third memory is provided in the smartphone or in a cloud accessible by the smartphone.

Example 5.15: The lighting system according to any of the previous Examples 5.11 to 5.14, wherein the information on the spatial positioning of the light fixture (110) with respect to the surface (142) of the object (140) includes data identifying a distance (d) between the one or more light sources (117) and a reference point of the surface (142) of the object (140) and an angle of inclination (a) of the light fixture (110) with respect to a surface normal or a plane of the surface (142).

Example 5.16: The lighting system according to Example 5.15, wherein the light fixture (110) has associated therewith or comprises a distance sensor (120₃), preferably an ultrasonic sensor, configured to measure the distance (d) between the light fixture (110) and the surface (142) and to transmit the measurement result to the data processing unit (113, 123, 133).

Example 5.17: The lighting system according to any of the previous Examples 5.1 to 5.16, comprising an inclination angle sensor (120₂), which is preferably provided in the light fixture (110) or on the surface (142) of the object (140), wherein the inclination angle sensor (120₂) is configured to measure an angle of inclination (a) of the light fixture (110) with respect to a surface normal or a plane of the surface (142), and to transmit the measurement result to the data processing unit (113, 123, 133).

Example 5.18: The lighting system according to any of the previous Examples 5.1 to 5.17, wherein the information on the spatial radiation characteristic of the one or more light sources (117) or the light fixture (110) includes data with a two-dimensional distribution of intensities on one plane, or on a plurality of planes at different distances from the one or more light sources (117), and wherein the data processing unit (113, 123, 133) is configured to calculate the local intensity at the given positions on the surface (142) of the object (140) by means of mathematical projection or inter- or extrapolation as a function of the two-dimensional distribution of intensities on the one plane, or on the plurality of planes.
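A minimal sketch of the projection step of Example 5.18, assuming the measured pattern scales geometrically between parallel planes so that an inverse-square factor suffices; a real implementation would also interpolate within each plane.

```python
import numpy as np

def project_intensity(plane_d0: np.ndarray, d0: float, d: float) -> np.ndarray:
    """Estimate the intensity map on a parallel plane at distance d from the
    map measured at distance d0 (geometric scaling assumption)."""
    return plane_d0 * (d0 / d) ** 2

reference = np.array([[10.0, 20.0], [20.0, 40.0]])  # lx measured at d0 = 1 m
print(project_intensity(reference, d0=1.0, d=2.0))   # one quarter of the values
```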

Example 5.19: The lighting system according to Example 5.18, wherein the plane or the plurality of planes are perpendicular to an optical axis (502) of the light (500) emitted by the one or more light sources (117) of the light fixture (110).

Example 5.20: A corresponding method of monitoring the irradiation of an object (140), e.g. comprising:

- irradiating the object (140) with light generated by one or more light sources (117) of a light fixture (110) having a spatial radiation characteristic,

- providing information about an intensity of the light emitted by the one or more light sources (117) to a data processing unit (113, 123, 133),

- providing information about the spatial positioning of the light fixture with respect to a surface (142) of the object (140) to the data processing unit (113, 123, 133),

- providing information about the spatial radiation characteristics of the one or more light sources (117) or the light fixture (110) to the data processing unit (113, 123, 133),

- for a plurality of positions on the surface (142) of the object (140), calculating, using the data processing unit (113, 123, 133), a local intensity of the light incident at the respective position as a function of the information on intensity of the light, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture (110), and outputting the calculation result.

Example 5.21: A corresponding method of monitoring the irradiation of an object (140) with light generated by one or more light sources (117) of a light fixture (110) having a spatial radiation characteristic, e.g. comprising the steps of:

- receiving information about an intensity of the light emitted by the one or more light sources (117),

- receiving information about the spatial positioning of the light fixture with respect to a surface (142) of the object (140),

- receiving information about the spatial radiation characteristics of the one or more light sources (117) or the light fixture (110),

- for a plurality of positions on the surface (142) of the object (140), calculating a local intensity value of the light incident at the respective position as a function of the information on intensity of the light, the information on the spatial radiation characteristic and the information on the spatial positioning of the light fixture (110);

- generating a control command as a function of the local intensity values; and

- transmitting the control command to the light fixture (110).

Example 5.22: The method according to Example 5.21, comprising receiving sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).

Example 5.23: The method according to Example 5.22, comprising comparing the calculated local intensity values for at least one of the plurality of positions with a limit value in the sensitivity information for that position.

Example 5.24: The method according to Example 5.23, wherein the control command is configured to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110).

Example 5.25: The method according to any one of the previous Examples 5.21 to 5.24, wherein the information about an intensity of the light emitted by the one or more light sources (117) includes respective information about light intensity for a plurality of different predetermined wavelength ranges, and wherein the sensitivity information for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 5.26: The method according to Example 5.25, comprising: calculating a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and comparing the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.
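The per-wavelength-range comparison of Examples 5.25 and 5.26, combined with the command generation of Example 5.21, might look like the sketch below; the ranges, limit values and command format are illustrative assumptions.

```python
LIMITS = {"uv": 10.0, "visible": 200.0, "ir": 500.0}  # assumed limit values

def control_command(local_intensity: dict) -> dict:
    """Compare per-range intensities with limits and derive a dimming command."""
    over = {band: v for band, v in local_intensity.items() if v > LIMITS[band]}
    if not over:
        return {"action": "keep"}
    factor = min(LIMITS[b] / v for b, v in over.items())  # worst-case overshoot
    return {"action": "dim", "factor": factor}

print(control_command({"uv": 12.0, "visible": 150.0, "ir": 100.0}))
```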

Example 5.27: The method according to any of the previous Examples 5.22 to 5.26, comprising:

- receiving position-dependent color and/or brightness values from a camera (508), and calculating a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.

Example 5.28: The method according to any of the previous Examples 5.22 to 5.26, comprising:

- receiving an identifier from a reader device, and obtaining the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.

Example 5.29: A device comprising a display device, a user interface and at least one data processing unit (113, 123, 133) configured to implement the method according to any of the previous Examples 5.21 to 5.28.

Example 5.30: A system comprising: a device comprising a display device, a user interface, a first processing unit and a first communication interface connected to a wide area network, such as the Internet, a server comprising a second communication interface connected to said wide area network, and a second processing unit, wherein said first processing unit and said second processing unit are configured to implement the method according to any of the previous Examples 5.21 to 5.28.

Example 5.31: The device according to Example 5.29 or the system according to Example 5.30, wherein said display device and said user interface are implemented with a touchscreen.

Example 5.32: The device according to Example 5.29 or the system according to Example 5.30, wherein said device is a smartphone, a tablet or a personal computer.

Example 5.33: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 5.21 to 5.28.

Example 5.34: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 5.21 to 5.28.

Example 6

Example 6.1: A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, wherein a light sensor (120) is installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range, the method comprising the steps of: during a calibration phase (610-618), obtaining a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range and measuring via said light sensor (120) the global and/or local light intensity values of the light (600) reflected by said artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by said artwork (140); and during a normal operation phase (630-640), measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light reflected by said artwork (140), and estimating via said mathematical function or said dataset the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by said artwork (140).
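The three phases of Example 6.1 can be illustrated with a least-squares fit serving as the "mathematical function"; the pairing of sensor readings with lux values at the artwork is illustrative, and a real system might fit per wavelength range.

```python
import numpy as np

# calibration phase: reflected-light readings vs. measured lux at the artwork
readings = np.array([10.0, 25.0, 40.0, 55.0])       # illustrative sensor values
artwork_lux = np.array([48.0, 121.0, 195.0, 270.0])

# training phase: fit artwork_lux = a * reading + b
a, b = np.polyfit(readings, artwork_lux, deg=1)

# normal operation phase: estimate the intensity at the artwork from a reading
def estimate_artwork_lux(reading: float) -> float:
    return a * reading + b

print(round(estimate_artwork_lux(30.0), 1))
```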

Example 6.2: The method according to Example 6.1, wherein an actuator (602) is configured to vary the position of said light sensor (120) with respect to the artwork (140).

Example 6.3: The method according to Example 6.1 or Example 6.2, comprising: during said calibration phase (610-618), varying the position of said light sensor (120) according to a given profile, and measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light reflected by said artwork (140) for said at least one wavelength or wavelength range; during said training phase, determining said mathematical function or said dataset as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light reflected by said artwork (140); during said normal operation phase (630-640), varying the position of said light sensor (120) according to said given profile, measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light reflected by said artwork (140), and estimating the global and/or the plurality of local light intensities at said artwork (140) as a function of said sequence of said plurality of global and/or said plurality of local light intensity values of the light reflected by said artwork (140).

Example 6.4: The method according to any of the previous Examples 6.1 to 6.3, wherein said varying the position of said light sensor (120) comprises varying the distance and/or angle of said light sensor (120) with respect to said artwork (140).

Example 6.5: The method according to any of the previous Examples 6.1 to 6.4, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises:

- measuring (612) the global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range.

Example 6.6: The method according to any of the previous Examples 6.1 to 6.4, wherein said obtaining said global and/or said plurality of local light intensities at said artwork (140) comprises: obtaining geometrical data identifying the distance and optionally orientation of said one or more light fixtures (110) with respect to said artwork (140);

- measuring a global and/or a plurality of local intensities of light (500) emitted by said one or more light fixtures (110) for at least one wavelength or wavelength range; and calculating (614) the global and/or the plurality of local light intensities at said artwork (140) as a function of said measured global and/or local intensities of light (500) emitted by said one or more light fixtures (110) and said geometrical data.

Example 6.7: The method according to any of the previous Examples 6.1 to 6.6, comprising: during said training phase, calculating specular and/or diffusive reflectance of said artwork (140), and during said normal operation phase, calculating the global and/or the plurality of local light intensities at said artwork (140) as a function of the global and/or plurality of local measured light intensity values of the light (600) reflected by said artwork (140) and said specular and/or diffusive reflectance of said artwork (140).
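
As a minimal illustration of Example 6.7, assuming a purely diffusive model: during the training phase the reflectance is obtained as the ratio between the light reflected towards the sensor and the known intensity at the artwork, and during normal operation the relation is inverted. All names and values below are hypothetical.

    def train_reflectance(intensity_at_artwork, reflected_at_sensor):
        # diffusive reflectance (could be per wavelength range if arrays are used)
        return reflected_at_sensor / intensity_at_artwork

    def estimate_intensity(reflected_at_sensor, reflectance):
        # normal operation: invert the model to recover the intensity at the artwork
        return reflected_at_sensor / reflectance

    r = train_reflectance(100.0, 12.5)   # calibration pair: 100 lx -> 12.5 at sensor
    print(estimate_intensity(15.0, r))   # -> 120.0 lx estimated at the artwork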

Example 6.8: The method according to any of the previous Examples 6.1 to 6.7, comprising: during said calibration phase, sending (618) control commands to said one or more light fixtures (110) in order to vary the characteristics of the light (500) emitted by said one or more light fixtures (110), and each time obtaining (612, 614) the global and/or plurality of local light intensities at said artwork (140) and measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140).

Example 6.9: The method according to Example 6.8, wherein said control command is configured to vary at least one of the following characteristics of the light (500) emitted by said one or more light fixtures (110): light intensity, frequency/color, polarization, direction and/or beam spread.

Example 6.10: The method according to any of the previous Examples 6.1 to 6.9, comprising: during said calibration (610-618) and/or training phase, storing said global and/or said plurality of local light intensities at said artwork (140) and the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140) in a data structure, such as a Look-up Table; during said normal operation phase (630-640), estimating the global and/or the plurality of local light intensities at said artwork (140) via interpolation of the data stored in said data structure.
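
A minimal sketch of the look-up-table variant of Example 6.10, assuming a single wavelength range and monotonically increasing calibration data; numpy.interp performs the interpolation of the data stored in the data structure. All values are hypothetical.

    import numpy as np

    # data stored during the calibration/training phase:
    # sensor reading of reflected light -> intensity at the artwork (140)
    lut_reflected = np.array([5.0, 10.0, 20.0, 40.0])    # sorted sensor readings
    lut_at_artwork = np.array([30.0, 58.0, 115.0, 230.0])

    def estimate_at_artwork(reflected_value):
        """Normal operation: interpolate between the stored calibration points."""
        return np.interp(reflected_value, lut_reflected, lut_at_artwork)

    print(estimate_at_artwork(15.0))  # interpolated estimate between stored entries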

Example 6.11: The method according to any of the previous Examples 6.1 to 6.9, comprising: during said training phase, training a machine learning algorithm, such as an artificial neural network; and during said normal operation phase (630-640), estimating the global and/or the plurality of local light intensities at said artwork (140) via said machine learning algorithm.

Example 6.12: The method according to any of the previous Examples 6.1 to 6.11, comprising: switching off said one or more light fixtures (110);

- measuring via said light sensor (120) the global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140); and estimating a global and/or a plurality of local light intensities of ambient light at said artwork (140) as a function of the measured global and/or the plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range.
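
A sketch of the ambient-light estimation of Example 6.12: with the fixtures switched off, the sensor only sees ambient light, and the trained model can be reused on that reading. The Fixture and Sensor stubs and the linear model are hypothetical placeholders for the system's actual interfaces.

    class Fixture:
        def __init__(self):
            self.on = True
        def switch_off(self):
            self.on = False

    class Sensor:
        def measure(self):
            return 4.2  # stub reading taken with the fixtures off

    def estimate_ambient(fixtures, sensor, estimate_at_artwork):
        for f in fixtures:
            f.switch_off()                      # remove the fixtures' contribution
        reflected = sensor.measure()            # only ambient light remains
        return estimate_at_artwork(reflected)   # reuse the trained model/dataset

    print(estimate_ambient([Fixture()], Sensor(), lambda r: 5.8 * r))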

Example 6.13: The method according to Example 6.12, comprising: detecting the presence of persons in said exposition area (160); and switching off said one or more light fixtures (110), when no persons have been detected in said exposition area (160).

Example 6.14: The method according to Example 6.12 or Example 6.13, wherein natural light may enter into said exposition area (160) through an aperture (164) in said exposition area (160), wherein said aperture has associated means for varying the intensity of natural light entering through said aperture (164), wherein the method comprises: sending one or more control commands to said means for varying the intensity of natural light entering through said aperture (164) as a function of the estimated global and/or plurality of local light intensity of ambient light at said artwork (140).

Example 6.15: The method according to any of the previous Examples 6.1 to 6.14, comprising: sending one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensity at said artwork (140).

Example 6.16: The method according to Example 6.15, comprising: obtaining data (208) identifying requested global and/or plurality of local intensity characteristics for at least one wavelength or wavelength range; sending one or more control commands to said at least one light fixture (110) in order to vary the light intensity at said artwork (140), such that the global and/or plurality of local light intensities at said artwork (140) corresponds to said requested global and/or plurality of local intensity characteristics.

Example 6.17: The method according to Example 6.16, wherein said requested intensity characteristics comprise requested intensity values for a plurality of wavelengths/colors.

Example 6.18: The method according to any of the previous Examples 6.1 to 6.17, comprising: comparing said estimated global and/or plurality of local light intensities at said artwork (140) with at least one threshold value, and sending one or more control commands to said at least one light fixture in order to reduce said global and/or plurality of local light intensities at said artwork (140) below said at least one threshold value.

Example 6.19: The method according to Example 6.18, wherein said at least one threshold value comprises respective threshold values for a plurality of wavelengths/colors.

Example 6.20: The method according to Example 6.18 or Example 6.19, comprising receiving sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).

Example 6.21: The method according to Example 6.20, comprising comparing the calculated local intensity values for at least one of the plurality of positions with a limit value in the sensitivity information for that position.
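
A small sketch of the comparison of Examples 6.20/6.21: per-position sensitivity limits for the surface (142) are checked against the calculated local intensities, and the violating positions can then trigger the control signal of Example 6.22. Positions and values are hypothetical.

    sensitivity_limits = {   # position on the surface (142) -> maximum intensity
        (0, 0): 150.0,
        (0, 1): 50.0,        # e.g. an area with a fragile pigment
        (1, 0): 150.0,
    }
    local_intensities = {(0, 0): 120.0, (0, 1): 65.0, (1, 0): 90.0}

    violations = [pos for pos, value in local_intensities.items()
                  if value > sensitivity_limits[pos]]
    print(violations)  # -> [(0, 1)]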

Example 6.22: The method according to Example 6.21, wherein the control signal is configured to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110).

Example 6.23: The method according to any one of the previous Examples 6.20 to 6.22, wherein the sensitivity information for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 6.24: The method according to Example 6.23, comprising: determining a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and comparing the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 6.25: The method according to any of the previous Examples 6.20 to 6.24, comprising:

- receiving position-dependent color and/or brightness values from a camera (508), and calculating a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.

Example 6.26: The method according to any of the previous Examples 6.20 to 6.24, comprising:

- receiving an identifier from a reader device, and obtaining the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.

Example 6.27: The method according to any of the previous Examples 6.1 to 6.26, wherein said light sensor (120) is configured to provide a plurality of local light intensity values for different wavelengths/colors.

Example 6.28: The method according to any of the previous Examples 6.1 to 6.27, wherein said light sensor (120) is a 2D light sensor providing pixel data, wherein the value of each pixel is indicative of a respective light intensity, wherein measuring via said light sensor (120) a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) comprises: determining a subset of pixels comprising said artwork (140), and determining a global and/or a plurality of local light intensity values as a function of the values of the subset of said pixels comprising said artwork (140).
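
A sketch of Example 6.28 for a 2D light sensor: the pixel values inside a subset containing the artwork are reduced to one global intensity and to local intensities per block. The rectangle is hard-coded here as an assumption; in practice the subset of pixels comprising the artwork would be detected in the image.

    import numpy as np

    pixels = np.random.default_rng(0).uniform(0.0, 255.0, size=(480, 640))

    art = pixels[100:340, 200:520]        # subset of pixels comprising the artwork

    global_intensity = art.mean()         # one global light intensity value
    # local values: mean over a 4 x 4 grid of blocks of the subset
    h, w = art.shape
    local = art[:h - h % 4, :w - w % 4].reshape(4, h // 4, 4, w // 4).mean(axis=(1, 3))
    print(global_intensity, local.shape)  # -> scalar and a (4, 4) grid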

Example 6.29: The method according to Example 6.27 or Example 6.28, wherein said light sensor (120) is a camera.

Example 6.30: A lighting system (100) configured to illuminate an artwork (140) in an exposition area (160), said lighting system (100) comprising: one or more light fixtures (110) configured to illuminate said artwork (140) with light having variable characteristics as a function of a control command; a light sensor (120) configured to be installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (600) reflected by said artwork (140) for at least one wavelength or wavelength range; and a control system (130) comprising a memory having stored a mathematical function or dataset adapted to estimate a global and/or a plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and wherein said control system (130), during a normal operation phase, is configured to: o measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), o estimate via said mathematical function or dataset the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), o send one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensities at said artwork (140).

Example 6.31: The lighting system (100) according to Example 6.30, wherein said control system (130) is configured to: during a calibration phase, measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and obtain the global and/or plurality of local light intensities at said artwork (140), and during a training phase, determine said mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (600) reflected by said artwork (140), and store said mathematical function or dataset to said memory.

Example 6.32: The lighting system (100) according to Example 6.31, wherein the control system (130) is configured to implement the steps of the method according to any of Examples 6.1 to 6.29.

Example 6.33: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 6.1 to 6.29.

Example 6.34: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 6.1 to 6.29.

Example 7

Example 7.1: A method of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110) configured to emit light with variable characteristics as a function of a control command, wherein a reference luminance target (700) is installed in proximity of said artwork (140), whereby said reference luminance target (700) is illuminated with the light emitted by said one or more light fixtures (110), and wherein a light sensor (120) is installed in said exposition area (160) in order to measure a global and/or a plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) for at least one wavelength or wavelength range, the method comprising the steps of: during a calibration phase (710-716), obtaining a global and/or a plurality of local light intensities at said artwork (140) and/or at said reference luminance target (700) for at least one wavelength or wavelength range, and measuring via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700); during a training phase (720), determining a mathematical function and/or a dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700); and during a normal operation phase (730), measuring via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) and estimating the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700).

Example 7.2: The method according to Example 7.1, wherein an actuator (602) is configured to vary the position of said light sensor (120) with respect to the reference luminance target (700).

Example 7.3: The method according to Example 7.1 or Example 7.2, comprising: during said calibration phase (710-716), varying the position of said light sensor (120) according to a given profile, and measuring a sequence of a plurality of global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) for said at least one wavelength or wavelength range; during said training phase (720), determining said mathematical function and/or said dataset as a function of said sequence of said plurality of global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700); during said normal operation phase (730), varying the position of said light sensor (120) according to said given profile, measuring a sequence of a plurality of global and/or a plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and estimating the global and/or the plurality of local light intensities at said artwork (140) as a function of said sequence of said plurality of global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700).

Example 7.4: The method according to any of the previous Examples 7.1 to 7.3, wherein said varying the position of said light sensor (120) comprises varying the distance and/or angle of said light sensor (120) with respect to said reference luminance target (700).

Example 7.5: The method according to any of the previous Examples 7.1 to 7.4, wherein, during said calibration phase, a global and/or a plurality of local light intensities at said reference luminance target (700) are measured, and wherein the method comprises: obtaining geometrical data identifying the distance and orientation of said one or more light fixtures (110) with respect to said artwork (140) and said reference luminance target (700), and during said training phase, calculating the global and/or plurality of local light intensities at said artwork (140) as a function of said global and/or plurality of local light intensities at said reference luminance target (700) and said geometrical data, and determining a mathematical function or dataset adapted to estimate the calculated global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700).

Example 7.6: The method according to any of the previous Examples 7.1 to 7.4, wherein, during said calibration phase, a global and/or a plurality of local light intensities at said reference luminance target (700) are measured, and wherein the method comprises: obtaining geometrical data identifying the distance and orientation of said one or more light fixtures (110) with respect to said artwork (140) and said reference luminance target (700), during said training phase (720), determining a mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said reference luminance target (700) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and during said normal operation phase (730), estimating the global and/or plurality of local light intensities at said reference luminance target (700) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and calculating the global and/or plurality of local light intensities at said artwork (140) as a function of said estimated global and/or plurality of local light intensities at said reference luminance target (700) and said geometrical data.

Example 7.7: The method according to Example 7.6, comprising: during said training phase (720), determining a specular and/or diffusive reflectance of said reference luminance target (700), and during said normal operation phase, calculating the global and/or plurality of local light intensities at said reference luminance target (700) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) and said specular and/or diffusive reflectance of said reference luminance target (700).

Example 7.8: The method according to any of the previous Examples 7.1 to 7.7, wherein a plurality of reference luminance targets (700) is installed in proximity of said artwork (140), whereby said plurality of reference luminance targets (700) is illuminated with the light emitted by said one or more light fixtures (110), and wherein one or more light sensors (120) are installed in said exposition area (160) in order to measure the global and/or plurality of local light intensity values of the light (702) reflected by said plurality of reference luminance targets (700), wherein the method comprises: during said training phase (720), determining a mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) as a function of said measured global and/or plurality of local light intensity values of the light reflected by said plurality of reference luminance targets (700).

Example 7.9: The method according to the combination of any of Examples 7.5 to 7.7 and Example 7.8, wherein said calculating the global and/or plurality of local light intensities at said artwork (140) as a function of said global and/or plurality of local light intensity at said reference luminance target (700) and said geometrical data is performed via interpolation of said global and/or plurality of local light intensities of the light (702) reflected by said plurality of reference luminance targets (700).
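
A minimal sketch of the interpolation of Example 7.9, using inverse-distance weighting between several reference luminance targets (700); positions and intensities are hypothetical, and a real system would derive the weights from the stored geometrical data.

    import numpy as np

    target_positions = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])  # metres
    target_intensities = np.array([100.0, 140.0, 80.0])                # lux
    artwork_position = np.array([0.8, 0.6])

    d = np.linalg.norm(target_positions - artwork_position, axis=1)
    w = 1.0 / np.maximum(d, 1e-9)          # inverse-distance weights
    estimate = np.sum(w * target_intensities) / np.sum(w)
    print(estimate)                        # interpolated intensity at the artwork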

Example 7.10: The method according to any of the previous Examples 7.1 to 7.9, comprising: during said calibration phase (710-716), sending (716) control commands to said one or more light fixtures (110) in order to vary the characteristics of the light (500) emitted by said one or more light fixtures (110), and each time measuring the global and/or plurality of local light intensities at said artwork (140) and/or at said reference luminance target (700), and via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700).

Example 7.11: The method according to Example 7.10, wherein said control command is configured to vary one or more of the following characteristics of the light emitted by said one or more light fixtures (110): light intensity, frequency/color, polarization, direction and/or beam spread.

Example 7.12: The method according to any of the previous Examples 7.1 to 7.11, comprising: during said training phase (720), storing said global and/or plurality of local light intensities at said artwork (140) and/or at said reference luminance target (700), and the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) in a data structure, such as a Look-up Table; during said normal operation phase (730), estimating the global and/or plurality of local light intensities at said artwork (140) via interpolation of the data stored in said data structure.

Example 7.13: The method according to any of the previous Examples 7.1 to 7.11, comprising: during said training phase (720), training a machine learning algorithm, such as an artificial neural network; and during said normal operation phase (730), estimating the global and/or plurality of local light intensities at said artwork (140) via said machine learning algorithm.
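
A sketch of Example 7.13 using scikit-learn (an assumption; the example only requires "a machine learning algorithm, such as an artificial neural network"). Synthetic training pairs map readings of the light (702) reflected by the reference target to intensities at the artwork.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(5.0, 50.0, size=(200, 1))        # reflected-light readings
    y = 5.5 * X[:, 0] + rng.normal(0.0, 1.0, 200)    # intensities at the artwork

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X, y)                                  # training phase (720)

    print(model.predict([[20.0]]))                   # normal operation phase (730)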

Example 7.14: The method according to any of the previous Examples 7.1 to 7.13, comprising: switching off said one or more light fixtures (110);

- measuring via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700); and estimating the global and/or plurality of local light intensities of ambient light at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700).

Example 7.15: The method according to Example 7.14, comprising: detecting the presence of persons in said exposition area (160); and switching off said one or more light fixtures (110), when no persons have been detected in said exposition area (160).

Example 7.16: The method according to Example 7.14 or Example 7.15, wherein natural light may enter into said exposition area (160) through an aperture (164) in said exposition area (160), wherein said aperture has associated means for varying the intensity of natural light entering through said aperture (164), wherein the method comprises: sending one or more control commands to said means for varying the intensity of natural light entering through said aperture (164) as a function of the estimated global and/or plurality of local light intensities of ambient light at said artwork (140).

Example 7.17: The method according to any of the previous Examples 7.1 to 7.16, comprising: sending one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensities at said artwork (140).

Example 7.18: The method according to Example 7.17, comprising: obtaining data (208) identifying requested global and/or plurality of local intensity characteristics for at least one wavelength or wavelength range; sending one or more control commands to said at least one light fixture (110) in order to vary the light intensity at said artwork (140), such that the global and/or plurality of local light intensities at said artwork (140) corresponds to said requested global and/or plurality of local intensity characteristics.

Example 7.19: The method according to Example 7.18, wherein said requested intensity characteristics comprise requested intensity values for a plurality of wavelengths/colors.

Example 7.20: The method according to any of the previous Examples 7.1 to 7.19, comprising: comparing said estimated global and/or plurality of local light intensities at said artwork (140) with at least one threshold value, and sending one or more control commands to said at least one light fixture (110) in order to reduce said global and/or plurality of local light intensities at said artwork (140) below said at least one threshold value.

Example 7.21: The method according to Example 7.20, wherein said at least one threshold value comprises respective threshold values for a plurality of wavelengths/colors.

Example 7.22: The method according to Example 7.20 or Example 7.21, comprising receiving sensitivity information for the object (140) to be irradiated, in which limit values for a maximum local intensity are stored for positions on the surface (142) of the object (140).

Example 7.23: The method according to Example 7.22, comprising comparing the calculated local intensity values for at least one of the plurality of positions with a limit value in the sensitivity information for that position.

Example 7.24: The method according to Example 7.23, wherein the control signal is configured to adapt or switch off a power supply (116) of the light fixture (110) or individual light sources (117) of the light fixture (110).

Example 7.25: The method according to any one of the previous Examples 7.22 to 7.24, wherein the sensitivity information for the object (140) to be irradiated for the respective positions on the surface (142) of the object (140) includes a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 7.26: The method according to Example 7.25, comprising: determining a local intensity value for at least one of the plurality of positions for each of said plurality of different predetermined wavelength ranges, and comparing the calculated local intensity value with a respective limit value for each of said plurality of different predetermined wavelength ranges.

Example 7.27: The method according to any of the previous Examples 7.22 to 7.26, comprising:

- receiving position-dependent color and/or brightness values from a camera (508), and calculating a limit value for each of the positions on the basis of a fixed predetermined association between the color and/or brightness values and a sensitivity.

Example 7.28: The method according to any of the previous Examples 7.22 to 7.26, comprising:

- receiving an identifier from a reader device, and obtaining the sensitivity information for the object (140) to be irradiated from a memory (206) as a function of the identifier.

Example 7.29: The method according to any of the previous Examples 7.1 to 7.28, wherein said light sensor (120) is configured to provide a plurality of local light intensity values for different wavelengths/colors.

Example 7.30: The method according to any of the previous Examples 7.1 to 7.29, wherein said light sensor (120) is a 2D light sensor providing pixel data, wherein the value of each pixel is indicative of a respective light intensity, and wherein measuring via said light sensor (120) a global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) comprises: determining a subset of pixels comprising said reference luminance target (700), and determining a global and/or plurality of local light intensity values as a function of the values of said subset of pixels comprising said reference luminance target (700).

Example 7.31: The method according to Example 7.29 or Example 7.30, wherein said light sensor (120) is a camera.

Example 7.32: The method according to any of the previous Examples 7.1 to 7.31, wherein said reference luminance target (700) comprises a portion being a Lambertian emitter having a substantially uniform diffusive reflection.

Example 7.33: The method according to any of the previous Examples 7.1 to 7.32, wherein said reference luminance target (700) comprises a white paint, such as barium sulfate (BaSO4), or polymers with diffusive particles, such as polycarbonate or PMMA, or silicone with Al2O3 or TiO2.

Example 7.34: The method according to any of the previous Examples 7.1 to 7.33, wherein said reference luminance target (700) comprises a portion having the same material as said artwork (140).

Example 7.35: The method according to any of the previous Examples 7.1 to 7.34, wherein said reference luminance target (700) comprises a portion having a non-null specular reflectivity component, whereby a specular reflection is emitted at a given angle, and wherein said light sensor (120) is installed at said given angle.

Example 7.36: The method according to any of the previous Examples 7.1 to 7.35, wherein said reference luminance target (700) comprises a plurality of portions having different specular reflectivity and/or diffusive reflectivity.

Example 7.37: The method according to Example 7.36, wherein said plurality of portions of said reference luminance target (700) are implemented with different materials.

Example 7.38: A lighting system (100) configured to illuminate an artwork (140) in an exposition area (160), said lighting system (100) comprising: one or more light fixtures (110) configured to illuminate said artwork (140) with light having variable characteristics as a function of a control command; a reference luminance target (700) configured to be installed in proximity of said artwork (140), whereby said reference luminance target (700) is illuminated with the light emitted by said one or more light fixtures (110); a light sensor (120) configured to be installed in said exposition area (160) in order to measure a global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700) for at least one wavelength or wavelength range; and a control system (130) comprising a memory having stored a mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) for at least one wavelength or wavelength range as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and wherein said control system (130), during a normal operation phase, is configured to: o measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), o estimate via said mathematical function or dataset the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and o send one or more control commands to said one or more light fixtures (110) in order to vary the characteristics of the light emitted by said one or more light fixtures (110) as a function of said estimated global and/or plurality of local light intensities at said artwork (140).

Example 7.39: The lighting system (100) according to Example 7.38, wherein said control system (130) is configured to: during a calibration phase, measure via said light sensor (120) the global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and obtain the global and/or plurality of local light intensities at said artwork (140) and/or at said reference luminance target (700), and during a training phase, determine said mathematical function or dataset adapted to estimate the global and/or plurality of local light intensities at said artwork (140) as a function of the measured global and/or plurality of local light intensity values of the light (702) reflected by said reference luminance target (700), and store said mathematical function or dataset to said memory.

Example 7.40: The lighting system (100) according to Example 7.39, wherein the control system is configured to implement the steps of the method according to any of Examples 7.1 to 7.37.

Example 7.41: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 7.1 to 7.37.

Example 7.42: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 7.1 to 7.37.

Example 8

Example 8.1: A method of illuminating an artwork (140) in an exposition area (160) with a light fixture (110), said light fixture (110) comprising: a plurality of light sources (117); a driver circuit configured to provide an individually controllable power supply to each of said light sources (117) as a function of one or more control signals (iref; DSW1..DSW5); a data storage device (112) having stored at least one preset configuration data item (800); and a data processing unit (113) comprising a memory (1112); wherein the method comprises: reading (808) a preset configuration data item (800) from said data storage device (112) and storing said preset configuration data item (800) into said memory (1112); and generating said one or more control signals (iref; DSW1..DSW5) as a function of the configuration data stored to said memory (1112).

Example 8.2: The method according to Example 8.1, wherein said preset configuration data item (800) comprises data identifying a requested power supply for each of said light sources (117).

Example 8.3: The method according to Example 8.2, wherein each light source (117) consists of a single LED.

Example 8.4: The method according to Example 8.2, wherein each light source comprises a plurality of LEDs configured to emit light with different colors, and wherein said preset configuration data item (800) comprises data identifying a requested color of the light emitted by each of said light sources (117), and wherein the method comprises: determining data identifying a requested power supply for each of said LEDs as a function of said requested power supply and said requested color for each of said light sources (117).

Example 8.5: The method according to Example 8.2, wherein each light source comprises a plurality of LEDs configured to emit light with different colors, and wherein said preset configuration data item (800) comprises data identifying a requested power supply for each of said LEDs.

Example 8.6: The method according to Example 8.4 or Example 8.5, comprising: storing said data identifying said requested power supply for each of said LEDs to said memory (1112).

Example 8.7: The method according to Example 8.1, wherein said preset configuration data item (800) comprises data identifying a requested illumination of said artwork (140), and wherein the method comprises: determining data identifying a requested power supply for each of said light sources (117) as a function of said requested illumination of said artwork (140).

Example 8.8: The method according to Example 8.7, wherein said requested illumination of said artwork (140) comprises for a plurality of areas of said artwork (140) data identifying the brightness and/or color of the illumination of the respective area.

Example 8.9: The method according to Example 8.8, wherein said requested illumination of said artwork (140) is stored in the form of an image, such as an RGB image.

Example 8.10: The method according to Example 8.8, wherein said requested illumination of said artwork (140) comprises a sequence of images, e.g. stored in the form of a video.

Example 8.11: The method according to any of Examples 8.7 to 8.10, comprising: determining for each light source (117) a respective area of said artwork (140) illuminated by the light source (117); and determining said data identifying a requested power supply for each of said light sources (117) as a function of the requested illumination of the respective area of said artwork (140).
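
A sketch of Example 8.11: each light source (117) is mapped to the area of the artwork it illuminates, and its requested power supply is derived from the requested illumination of that area. The area names, the mapping and the full-scale drive level are hypothetical.

    # requested illumination per area (e.g. derived from a stored image)
    requested_by_area = {"sky": 0.4, "figures": 0.9, "frame": 0.2}

    # which area each light source illuminates (from geometry, Example 8.12,
    # or from selective switching, Example 8.13)
    source_to_area = {0: "sky", 1: "sky", 2: "figures", 3: "frame"}

    MAX_DRIVE = 255  # hypothetical full-scale drive level
    drive_levels = {src: round(requested_by_area[area] * MAX_DRIVE)
                    for src, area in source_to_area.items()}
    print(drive_levels)  # -> {0: 102, 1: 102, 2: 230, 3: 51}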

Example 8.12: The method according to Example 8.11, comprising: determining for each light source (117) a respective area of said artwork (140) illuminated by the light source (117) as a function of data identifying the geometrical position of said light fixture (110) with respect to said artwork (140) and data identifying spatial radiation characteristics of said light fixture (110).

Example 8.13: The method according to Example 8.11, comprising: determining for each light source (117) a respective area of said artwork (140) illuminated by the light source (117) by selectively switching on subsets of said light sources (117) and monitoring the illumination of said artwork (140) in order to determine an area illuminated by said subsets of said light sources.

Example 8.14: The method according to any of the previous Examples 8.1 to 8.13, wherein said light fixture (110) comprises one or more lenses (115) configured to focus the light generated by said light sources (117) and at least one actuator (114) configured to vary the focal distance (D) of said one or more lenses (115).

Example 8.15: The method according to Example 8.14, comprising: selectively switching on at least a subset of said light sources (117); monitoring the illumination of said artwork (140) in order to determine an area illuminated by said subsets of said light sources; and driving said at least one actuator (114), such that the light generated by said light sources (117) illuminates mainly said artwork (140).

Example 8.16: The method according to any of the previous Examples 8.1 to 8.15, comprising: selectively switching on at least a subset of said light sources (117); monitoring the illumination of said artwork (140) in order to determine an area illuminated by said subsets of said light sources; and switching off the light sources (117) which do not illuminate said artwork (140).

Example 8.17: The method according to any of the previous Examples 8.1 to 8.16, wherein said data storage device (112) has stored a plurality of preset configuration data items (800), wherein each preset configuration data item (800) is associated with a respective univocal code.

Example 8.18: The method according to Example 8.17, comprising: receiving (804) a command (CMD) from a control system (130), said command comprising data identifying one of said univocal codes; selecting one of said plurality of preset configuration data items (800) as a function of said data identifying one of said univocal codes; and reading (808) said selected preset configuration data item (800) from said data storage device (112) and storing said selected preset configuration data item (800) into said memory (1112).
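
A minimal sketch of the preset selection of Examples 8.17/8.18: preset configuration data items (800), each keyed by a univocal code, are stored on the fixture, and a received command selects one of them into the working memory. The dictionary layout and the field name are hypothetical.

    presets = {   # univocal code -> preset configuration data item (800)
        "P01": {"levels": [255, 255, 128, 0]},
        "P02": {"levels": [64, 64, 64, 64]},
    }
    memory = {}   # stands in for the memory (1112) of the processing unit (113)

    def handle_command(cmd):
        code = cmd["preset_code"]            # field carrying the univocal code
        memory["active"] = presets[code]     # read (808) and store into memory
        return memory["active"]

    print(handle_command({"preset_code": "P02"}))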

Example 8.19: The method according to Example 8.18, wherein said command comprises a field having stored said one of said univocal codes.

Example 8.20: The method according to any of the previous Examples 8.17 to 8.19, comprising: receiving (804) sensor data from a sensor (120); selecting one of said plurality of preset configuration data items (800) as a function of said sensor data; and reading (808) said selected preset configuration data item (800) from said data storage device (112) and storing said selected preset configuration data item (800) into said memory (1112).

Example 8.21: The method according to Example 8.20, wherein said sensor data comprise brightness and/or color data provided by a light sensor (120) configured to monitor the illumination of an artwork (140).

Example 8.22: The method according to any of the previous Examples 8.1 to 8.21, comprising: receiving (804) a command (CMD) from a control system (130); and modifying (910) said preset configuration data item (800) stored to said memory (1112) as a function of said command (CMD).

Example 8.23: The method according to Example 8.22, wherein said command (CMD) comprises data identifying one or more light sources (117), which should emit light with a lower or higher light intensity and/or a different color.

Example 8.24: The method according to Example 8.22, wherein said command (CMD) comprises data identifying a requested dimming level for all light sources (117).

Example 8.25: The method according to any of the previous Examples 8.1 to 8.24, comprising: receiving (804) sensor data from a sensor (120); and modifying (910) said configuration data item (800) stored to said memory (1112) as a function of said sensor data.

Example 8.26: The method according to Example 8.25, wherein said sensor data comprise brightness and/or color data provided by a light sensor (120) configured to monitor the illumination of an artwork (140).

Example 8.27: The method according to any of the previous Examples 8.1 to 8.26, comprising:

- transmitting via a control system (130) one or more control commands (CMD) comprising a configuration data item to said light fixture (110), and storing the transmitted configuration data item to said data storage device (112).

Example 8.28: The method according to Example 8.27, comprising: accessing a database (202) having stored a plurality of preset configurations; selecting a subset of said plurality of preset configurations stored to said database (202); and

- transmitting said selected subset of preset configurations to said light fixture (110).

Example 8.29: The method according to Example 8.28, comprising: selecting a subset of said plurality of preset configurations stored to said database (202) as a function of data (206) identifying the characteristics of said artwork (140) to be illuminated.

Example 8.30: The method according to any of the previous Examples 8.27 to 8.29, comprising: determining one or more parameters of the configuration data item to be transmitted to said light fixture (110) as a function of at least one of: o data identifying requested illumination characteristics (208) of said artwork (140), or a sequence thereof; o data (206) identifying the characteristics of said artwork (140) to be illuminated; o data (204) identifying the characteristics of said exposition area (160); and o data (210) identifying characteristics of a viewer.

Example 8.31: The method according to Example 8.30, comprising: obtaining a univocal code associated with said artwork (140); and reading said data identifying the characteristics of said artwork (140) from a database (206) as a function of said univocal code.

Example 8.32: The method according to Example 8.30, comprising: acquiring an image of said artwork (140); and determining said data identifying the characteristics of said artwork (140) via image processing of said image of said artwork (140).

Example 8.33: The method according to any of the previous Examples 8.30 to 8.32, wherein said data (206) identifying the characteristics of said artwork (140) comprise data identifying the shape and/or dimension of said artwork (140).

Example 8.34: The method according to any of the previous Examples 8.30 to 8.33, wherein said data (206) identifying the characteristics of said artwork (140) comprise data identifying areas of said artwork to be illuminated with increased yellow or red light intensity.

Example 8.35: The method according to any of the previous Examples 8.1 to 8.34, wherein said artwork (140) is illuminated with a plurality of light fixtures (110), and wherein the method comprises: monitoring the illumination of said artwork (140) with the light generated by said plurality of light fixtures (110); determining an overlapping area illuminated by at least two light fixtures (110); and

- transmitting at least one control command (CMD) to one or more of said at least two light fixtures (110) in order to reduce the brightness of the light sources (117) emitting light towards said overlapping area illuminated by said at least two light fixtures (110).

Example 8.36: A light fixture (110) comprising: a plurality of light sources (117); a driver circuit configured to provide an individually controllable power supply to each of said light sources (117) as a function of one or more control signals (iref; DSW1..DSW5); a data storage device (112) having stored at least one preset configuration data item (800); and a data processing unit (113) comprising a memory (1112), wherein said data processing unit (113) is configured to implement the method according to any of the previous Examples 8.1 to 8.35.

Example 8.37: A lighting system (100) comprising at least one light fixture (110) according to Example 8.36 and a control system (130) configured to send control commands (CMD) to said at least one light fixture (110).

Example 8.38: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 8.1 to 8.35.

Example 8.39: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 8.1 to 8.35.

Example 9

Example 9.1: A method of operating a light fixture (110), said light fixture (110) comprising: a light module (118) comprising one or more light sources (117); a power supply circuit (900) configured to provide a DC voltage (Vbus); a regulated current generator (902) configured to provide an output current (iout) to said one or more light sources (117) as a function of a reference signal (iref); a current sensor (116k1) configured to provide a first measurement signal (FB1) indicative of said output current (iout); and a data processing unit (113) operatively connected to said regulated current generator (902) and said current sensor (116k1); wherein the method comprises executing the following steps via said data processing unit (113): setting (908, 910, 924) said reference signal (iref) as a function of data identifying a requested illumination (Φref) to be generated by said one or more light sources (117); determining (910, 928) an upper and a lower current threshold as a function of said reference signal (iref); obtaining said first measurement signal (FB1);

- verifying (910, 930) whether said first measurement signal (FB1) is between said upper and said lower current threshold; and in case said verification indicates that said first measurement signal (FB1) is not between said upper and said lower current threshold, generating (910, 932) an error signal.

Example 9.2: The method according to Example 9.1, wherein said setting (924) said reference signal (iref) as a function of data identifying a requested illumination (Φref) to be generated by said one or more light sources (117) comprises: determining (908, 924) a further reference signal (iref') as a function of said data identifying a requested illumination (Φref) to be generated by said one or more light sources (117); in case said verification indicates that said first measurement signal (FB1) is between said upper and said lower current threshold, varying (910, 936) said reference signal (iref) such that said first measurement signal (FB1) corresponds to said further reference signal (iref').

Example 9.3: The method according to Example 9.1 or Example 9.2, wherein said regulated current generator (902) is configured to use said first measurement signal (FB1) provided by said current sensor (116k1) to regulate said first measurement signal (FB1) to said reference signal (iref).
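
A sketch of the supervision of Examples 9.1/9.2: the reference signal iref is derived from the requested illumination Φref, a tolerance window is built around it, and the measured current FB1 is checked against that window. The 10 % tolerance and the illumination-to-current conversion factor are hypothetical.

    def set_reference(phi_ref, amps_per_unit=0.002):
        return phi_ref * amps_per_unit             # simplified steps (908, 924)

    def check_current(i_ref, fb1, tolerance=0.10):
        lower, upper = i_ref * (1 - tolerance), i_ref * (1 + tolerance)
        if not (lower <= fb1 <= upper):
            return "ERROR"                         # generate (910, 932) the error signal
        return "OK"

    i_ref = set_reference(phi_ref=500.0)
    print(check_current(i_ref, fb1=0.98))          # within the window -> OK
    print(check_current(i_ref, fb1=0.70))          # outside the window -> ERROR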

Example 9.4: The method according to Example 9.1 or Example 9.2, wherein said regulated current generator (902) comprises a further current sensor configured to provide a further measurement signal, and wherein said regulated current generator (902) is configured to regulate said further measurement signal to said reference signal (iref).

Example 9.5: The method according to Example 9.3 or Example 9.4, wherein said regulated current generator (902) comprises: a switching stage (116h) and optionally an output filter stage (116i) configured to receive said DC voltage (Vbus) and generate said output current (iout) as a function of one or more drive signals (DRV); and a control circuit (116m) configured to generate said one or more drive signals (DRV) as a function of said first measurement signal (FB1) or said further measurement signal, and said reference signal (iref).

Example 9.6: The method according to Example 9.5, wherein said control circuit (116m) comprises a regulator circuit comprising at least one of: a proportional (P), low-pass filtering (PT1), integral (I), or derivative (D) component.

Example 9.7: The method according to any of the previous Examples 9.1 to 9.6, wherein said power supply circuit (900) comprises:

- two input terminals (116a, 116b) for receiving an AC input voltage (Vin,AC); a rectifier circuit (116f) configured to generate a DC input voltage (Vin,DC) as a function of said AC input voltage (Vin,AC).

Example 9.8: The method according to Example 9.7, wherein said power supply circuit (900) comprises: a filter circuit (116g) configured to generate said DC voltage (Vbus) as a function of said DC input voltage (Vin,DC).

Example 9.9: The method according to Example 9.7, wherein said power supply circuit (900) comprises: a PFC electronic converter (116g) configured to generate said DC voltage (Vbus) as a function of said DC input voltage (Vin,DC).

Example 9.10: The method according to any of the previous Examples 9.1 to 9.9, wherein said current sensor (116k1) is connected in series with said light module (118).

Example 9.11: The method according to any of the previous Examples 9.1 to 9.10, wherein said current sensor (116k1) comprises a shunt resistor (Rs) configured to provide a voltage proportional to the output current (iout) flowing through said shunt resistor (Rs).

Example 9.12: The method according to any of the previous Examples 9.1 to 9.11, wherein said current sensor (116k1) comprises a low pass filter providing the average value of the voltage at said shunt resistor (Rs).
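
A small sketch of Examples 9.11/9.12: the shunt voltage is proportional to the output current, and a first-order low-pass filter (here a discrete exponential moving average) yields its average value, standing in for the first measurement signal. The component values are hypothetical.

    R_SHUNT = 0.1    # ohms (shunt resistor Rs)
    ALPHA = 0.05     # smoothing factor of the discrete low-pass filter

    def shunt_voltage(i_out):
        return i_out * R_SHUNT                       # Ohm's law at the shunt

    avg = 0.0
    for i_out in [1.0, 1.1, 0.9, 1.0, 1.05]:         # sampled output currents (A)
        avg += ALPHA * (shunt_voltage(i_out) - avg)  # exponential moving average
    print(avg)       # filtered value standing in for FB1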

Example 9.13: The method according to any of the previous Examples 9.1 to 9.12, wherein said first measurement signal (FB1) is proportional to the average output current (iout).

Example 9.14: The method according to any of the previous Examples 9.1 to 9.13, wherein said data processing unit (113) comprises a watchdog circuit (912), and wherein the method comprises: monitoring the operation of said data processing unit (113) via said watchdog circuit (912).

Example 9.15: The method according to any of the previous Examples 9.1 to 9.14, wherein said light sources (117) comprise LEDs or laser diodes.

Example 9.16: The method according to any of the previous Examples 9.1 to 9.15, wherein said light fixture (110) comprises a light sensor (120a) configured to generate a second measurement signal indicative of the illumination (Φ) generated by said light module (118), and wherein the method comprises executing the following steps via said data processing unit (113):

- varying (908, 926) said reference signal (iref) in order to regulate said signal indicative of the illumination (Φ) to said requested illumination (Φref).

Example 9.17: The method according to Example 9.15, wherein said light fixture (110) comprises a voltage sensor (116k2) configured to generate a third measurement signal (FB2) indicative of the voltage (Vout) at said one or more LEDs or laser diodes, and wherein the method comprises executing the following steps via said data processing unit (113): determining (908, 926) said reference signal (iref) as a function of said third measurement signal (FB2).

Example 9.18: The method according to Example 9.17, wherein said light fixture (110) comprises a temperature sensor (120b) configured to generate a fourth measurement signal indicative of the temperature (T) of said one or more LEDs or laser diodes, and wherein the method comprises executing the following steps via said data processing unit (113): determining (908, 926) said reference signal (iref) as a function of said fourth measurement signal.

Example 9.19: The method according to Example 9.18, wherein said determining (908, 926) said reference signal (iref) comprises: accessing a data-structure having stored values of said reference signal (iref) for a plurality of combinations of said data identifying a requested illumination (Φref), said third measurement signal (FB2) and said fourth measurement signal; and obtaining (908) the reference signal (iref) associated with the current combination of data identifying a requested illumination (Φref), said third measurement signal (FB2) and said fourth measurement signal.

Example 9.20: The method according to Example 9.18, wherein said determining (908, 926) said reference signal (iref) comprises: calculating (908) the reference signal (iref) via a mathematical function of the current combination of data identifying a requested illumination (Φref), said third measurement signal (FB2) and said fourth measurement signal.
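
A sketch of the data structure of Example 9.19: the reference signal iref is stored for combinations of requested illumination, LED voltage (FB2) and LED temperature, and the nearest stored combination is retrieved (a real implementation might interpolate instead, or compute it as in Example 9.20). All grid values are hypothetical.

    lut = {   # (phi_ref, fb2 in volts, temperature in degC) -> iref in amperes
        (500.0, 30.0, 25.0): 0.95,
        (500.0, 30.0, 60.0): 1.05,   # more current is needed when the LEDs are hot
        (800.0, 31.0, 25.0): 1.55,
    }

    def lookup_iref(phi_ref, fb2, temp):
        key = min(lut, key=lambda k: (k[0] - phi_ref) ** 2
                                     + (k[1] - fb2) ** 2 + (k[2] - temp) ** 2)
        return lut[key]

    print(lookup_iref(500.0, 30.2, 55.0))   # -> 1.05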

Example 9.21: The method according to any of the previous Examples 9.1 to 9.20, wherein the method comprises executing the following steps via said data processing unit (113): receiving a control command (CMD) from a control system (130); and determining said data identifying a requested illumination (Φref) as a function of said control command.

Example 9.22: The method according to Example 9.21, wherein said light fixture (110) comprises a data storage device (112) having stored at least one preset configuration data item (800), and wherein said determining said data identifying a requested illumination (Φref) as a function of said control command comprises: reading (808) a preset configuration data item (800) from said data storage device (112) as a function of said control command (CMD).

In various embodiments, Example 9.22 may be combined with any of Examples 8.1 to 8.35.

Example 9.23: The method according to any of the previous Examples 9.1 to 9.22, wherein said data processing unit (113) comprises an analog-to-digital converter (914) configured to provide a digital sample of said first measurement signal (FB1), and wherein said verifying (910, 930) whether said first measurement signal (FB1) is between said upper and said lower current threshold comprises:

- verifying (910, 930) whether said digital sample of said first measurement signal (FB1) is between said upper and said lower current threshold.

Example 9.24: The method according to Example 9.23, wherein said data processing unit (113) comprises a microprocessor (1130), and wherein the method comprises verifying (910, 930) whether said digital sample of said first measurement signal (FB1) is between said upper and said lower current threshold via software instructions executed via said microprocessor (1130).

Example 9.25: The method according to Example 9.23 or Example 9.24, wherein said data processing unit (113) is configured to obtain said digital sample of said first measurement signal (FB1) with a sampling frequency being smaller than 1 kHz, preferably between 1 and 100 Hz, more preferably between 1 and 20 Hz.

Example 9.26: The method according to any of the previous Examples 9.1 to 9.25, wherein said light fixture (110) comprises at least one electronic switch (904) configured to selectively disable the power supply of said regulated current generator (902) as a function of said error signal.

Example 9.27: The method according to any of the previous Examples 9.1 to 9.26, wherein the method comprises executing the following steps via said data processing unit (113): sending (910, 932) an error message to a control system (130) as a function of said error signal.

Example 9.28: The method according to Example 9.27, wherein said error message comprises data identifying said reference signal (iref) and/or said first measurement signal (FB1).

Example 9.29: The method according to any of the previous Examples 9.1 to 9.28, wherein said light fixture (110) is used to illuminate an artwork (140), and wherein said requested illumination (Φref) is determined as a function of a requested illumination or a maximum illumination threshold of said artwork (140).

Example 9.30: The method according to Example 9.29, wherein said requested illumination or said maximum illumination threshold of said artwork (140) comprise data identifying a requested illumination or maximum illumination threshold for a plurality of wavelengths or wavelength ranges.

Example 9.31: A light fixture (110) comprising: a light module (118) comprising one or more light sources (117); a power supply circuit (900) configured to provide a DC voltage (Vbus); a regulated current generator (902) configured to provide an output current (iout) to said one or more light sources (117) as a function of a reference signal (iref); a current sensor (116k1) configured to provide a first measurement signal (FB1) indicative of said output current (iout); and a data processing unit (113) operatively connected to said regulated current generator (902) and said current sensor (116k1), wherein said data processing unit (113) is configured to implement the method according to any of the previous Examples 9.1 to 9.30.

Example 9.32: A lighting system (100) comprising at least one light fixture (110) according to Example 9.31 and a control system (130) configured to send control commands (CMD) to said at least one light fixture (110).

Example 9.33: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 9.1 to 9.30.

Example 9.34: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 9.1 to 9.30.

Example 10

Example 10.1: A method of illuminating an artwork (140) in an exposition area (160) with at least one light fixture (110), wherein the method comprises:

- receiving one or more datasets for each of a plurality of artworks (140) and storing each dataset in a database (132R), each dataset comprising:
o data identifying a list of pigments of the respective artwork (140);
o data identifying the illumination of each pigment of said list of pigments during a given time period;
o data identifying the ageing of each pigment of said list of pigments during said given time period;

- receiving (1002) data identifying a list of pigments of said artwork (140) to be illuminated;
- determining (1006-1010) a maximum illumination threshold for the illumination of said artwork (140) to be illuminated as a function of said list of pigments of said artwork (140) to be illuminated and said datasets stored in said database (132R);
- controlling (1012) the illumination of said artwork (140) to be illuminated in order to ensure that the illumination of said artwork (140) corresponds to or is smaller than said maximum illumination threshold for the illumination of said artwork (140).
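As a purely illustrative sketch of this flow (Python; the functions and the database layout are hypothetical, with the per-pigment threshold determination of Examples 10.15 to 10.30 reduced to a stored lookup):

```python
def max_threshold_for_pigment(pigment, database):
    # Placeholder for Examples 10.15-10.29, where this value is derived
    # from an ageing model; here it is simply read from the database.
    return database[pigment]

def artwork_max_illumination(pigments, database):
    # Per-pigment maximum illumination thresholds...
    per_pigment = {p: max_threshold_for_pigment(p, database) for p in pigments}
    # ...of which the most restrictive governs the whole artwork
    # (cf. Example 10.30).
    return min(per_pigment.values())

def control_lighting(requested, threshold, send_command):
    # Ensure the commanded illumination never exceeds the threshold.
    send_command(min(requested, threshold))
```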

Example 10.2: The method according to Example 10.1, wherein said data identifying a list of pigments of an artwork (140) comprises:

- (e.g. for bi-dimensional artworks, such as paintings) a bi-dimensional matrix having stored data identifying the pigment type in a given horizontal and vertical position of the respective artwork (140); or

- (e.g. for three-dimensional artworks) data identifying the pigment type on a surface of a three-dimensional model of the respective artwork (140).

Example 10.3: The method according to Example 10.1 or Example 10.2, wherein said data identifying a list of pigments comprises data identifying at least one of: a pigment color, a pigment type, and a pigment material.

Example 10.4: The method according to any of the previous Examples 10.1 to 10.3, comprising:

- receiving an image of an artwork (140);

- receiving a table having stored data identifying a pigment type or pigment material for a respective color;

- processing said image of said artwork (140) in order to determine the colors of said artwork (140), and associating with each color a respective pigment type or pigment material as a function of said table having stored data identifying a pigment type or pigment material.
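One possible realization of this color-to-pigment association is sketched below (Python with NumPy; the table contents and the nearest-color matching are illustrative assumptions):

```python
import numpy as np

PIGMENT_TABLE = {                 # assumed table: RGB color -> pigment material
    (200, 180, 40): "lead-tin yellow",
    (30, 60, 150):  "ultramarine",
    (120, 30, 25):  "vermilion",
}

def pigments_from_image(image):
    """Map every pixel of an H x W x 3 image to the nearest table color,
    yielding the bi-dimensional pigment matrix of Example 10.2."""
    colors = np.array(list(PIGMENT_TABLE.keys()), dtype=float)   # K x 3
    names = np.array(list(PIGMENT_TABLE.values()))
    # squared distance of every pixel to every table color (H x W x K)
    d = ((image[..., None, :].astype(float) - colors) ** 2).sum(axis=-1)
    return names[d.argmin(axis=-1)]                              # H x W
```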

Example 10.5: The method according to Example 10.4, comprising:

- receiving a plurality of tables having stored data identifying a pigment type or pigment material for a respective color for a plurality of artwork types and/or artists;

- receiving data identifying a type and/or an artist of said artwork; and
- selecting one of said plurality of tables having stored data identifying a pigment type or pigment material for a respective color as a function of said data identifying the type and/or artist of said artwork.

Example 10.6: The method according to Example 10.5, wherein said artwork type is selected from the list including one or more of: drawing, print-out, photography, textile, Old Master painting, or modern art painting.

Example 10.7: The method according to any of the previous Examples 10.4 to 10.6, wherein said processing said image of said artwork (140) in order to determine the colors of said artwork (140), and associating with each color a respective pigment type or pigment material as a function of said table having stored data identifying a pigment type or pigment material, comprises:

- (e.g. for bi-dimensional artworks, such as paintings) processing said image of said artwork (140) in order to determine the colors of each pixel of said image, and associating with each pixel a respective pigment type or pigment material; or
- (e.g. for three-dimensional artworks) processing said image of said artwork (140) in order to determine the color of a plurality of areas of a surface of a three-dimensional model of the respective artwork (140), and associating with each area a respective pigment type or pigment material.

Example 10.8: The method according to any of the previous Examples 10.4 to 10.6, wherein said processing said image of said artwork (140) in order to determine the colors of said artwork (140), and associating with each color a respective pigment type or pigment material as a function of said table having stored data identifying a pigment type or pigment material, comprises:

- processing said image of said artwork (140) in order to determine the areas of said artwork having the same color, and associating with each area a respective pigment type or pigment material.

Example 10.9: The method according to any of the previous Examples 10.1 to 10.8, wherein said list of pigments of said artwork (140) to be illuminated is stored in said database (132R), and wherein said data identifying a list of pigments of said artwork (140) to be illuminated comprise a univocal artwork code associated with said list of pigments of said artwork (140) to be illuminated.

Example 10.10: The method according to any of the previous Examples 10.1 to 10.9, wherein said data identifying the illumination of each pigment of said list of pigments during said given time period comprise at least one global illumination value indicative of the illumination of the respective artwork (140) or local illumination values indicative of the illumination of given areas of said artwork (140).

Example 10.11: The method according to any of the previous Examples 10.1 to 10.10, wherein said data identifying the illumination of each pigment of said list of pigments during said given time period comprise a plurality of intensity values for respective wavelengths or wavelength ranges.

Example 10.12: The method according to any of the previous Examples 10.1 to 10.11, comprising determining said data identifying the illumination of each pigment of said list of pigments during said given time period as a function of at least one of: power supply parameters of light sources (117) used to illuminate the respective pigment; an intensity of light emitted by light sources (117) used to illuminate the respective pigment; an intensity of light (500) received at the respective pigment; an intensity of light (600) reflected by the respective pigment; and an intensity of light (702) reflected by a reference surface (700).

For example, in various embodiments, any of the light sensors of Examples 5, 6 or 7 is used.

Example 10.13: The method according to any of the previous Examples 10.1 to 10.12, comprising determining said data identifying the ageing of each pigment of said list of pigments during said given time period by:

- receiving at least a first image of the respective artwork (140) at the beginning of said given time period and a second image of the respective artwork (140) at the end of said given time period;

- processing said at least first and second image in order to determine variations in the pixel data of said at least first and second image, said variations being indicative of ageing of the pigments associated with the respective pixels exhibiting a variation.
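A minimal sketch of this image-based ageing estimate (Python with NumPy; the mean absolute pixel difference is one simple choice of variation measure, not prescribed by the Example):

```python
import numpy as np

def ageing_map(img_start, img_end):
    """Per-pixel variation between two aligned H x W x 3 images taken at the
    beginning and at the end of the time period."""
    return np.abs(img_end.astype(float) - img_start.astype(float)).mean(axis=-1)

def ageing_per_pigment(img_start, img_end, pigment_matrix):
    """Aggregate the per-pixel variation over the pixels of each pigment,
    with pigment_matrix as in Example 10.2."""
    delta = ageing_map(img_start, img_end)
    return {p: float(delta[pigment_matrix == p].mean())
            for p in np.unique(pigment_matrix)}
```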

Example 10.14: The method according to any of the previous Examples 10.1 to 10.13, comprising:

- receiving a dataset for said artwork (140) to be illuminated, said dataset comprising:
o said data identifying said list of pigments of said artwork (140) to be illuminated;
o data identifying the illumination of each pigment of said list of pigments of said artwork (140) to be illuminated during a given time period;
o data identifying the ageing of each pigment of said list of pigments of said artwork (140) to be illuminated during said given time period; and

- storing (1004) said dataset in said database (132R).

Example 10.15: The method according to any of the previous Examples 10.1 to 10.14, wherein said determining a maximum illumination threshold for the illumination of said artwork (140) to be illuminated comprises:

- for each of said pigments in said list of pigments of said artwork (140) to be illuminated, determining (1006) a maximum illumination threshold for the illumination of the respective pigment as a function of said datasets stored in said database (132R);
- determining said maximum illumination threshold for the illumination of said artwork (140) to be illuminated as a function of said maximum illumination thresholds for the illumination of the respective pigments.

Example 10.16: The method according to Example 10.15, wherein said maximum illumination threshold comprises a plurality of maximum illumination values for respective wavelengths or wavelength ranges.

Example 10.17: The method according to Example 10.15 or Example 10.16, wherein said determining (1006) a maximum illumination threshold for the illumination of the selected pigment comprises:

- processing said datasets stored in said database (132R) in order to select datasets having a similar pigment as said selected pigment;
- processing the data of said selected datasets in order to determine correlations between the selected pigment, said data identifying the illumination of the selected pigment and said data identifying the ageing of the selected pigment.

Example 10.18: The method according to Example 10.17, wherein said data identifying the illumination of each pigment of said list of pigments during said given time period comprise a plurality of intensity values for respective wavelengths or wavelength ranges.

Example 10.19: The method according to Example 10.18, wherein each dataset also comprises one or more further parameters selected from the group of: temperature, humidity, oxygen level of the air in the exposition area of the respective artwork (140), location of the exposition area (160) of the respective artwork (140) and position of the artwork (140) within the exposition area (160).

Example 10.20: The method according to Example 10.18 or Example 10.19, wherein said processing the data of said selected datasets comprises: selecting a set of features amongst said plurality of intensity values for respective wavelengths or wavelength ranges and optionally said one or more further parameters.

Example 10.21: The method according to Example 10.20, wherein said selecting a set of features comprises: executing a feature selection operation in order to select said set of features amongst said plurality of intensity values for respective wavelengths or wavelength ranges and optionally said one or more further parameters, said set of features representing a given number of most relevant features, which are linked to the ageing of the selected pigment.

Example 10.22: The method according to Example 10.21, wherein said feature selection operation comprises a Principal Component Analysis or the calculation of a Probability Density Function.
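By way of illustration, a feature selection via Principal Component Analysis could be sketched as follows (Python, assuming scikit-learn is available; the variance-weighted loading score is one simple ranking heuristic, not prescribed by the Examples):

```python
import numpy as np
from sklearn.decomposition import PCA

def select_features(X, n_features):
    """X: one row per dataset; columns are the per-wavelength intensity
    values plus, optionally, further parameters (temperature, humidity, ...).
    Returns the column indices of the most relevant features."""
    pca = PCA().fit(X)
    # Rank original features by their absolute loading on the principal
    # components, weighted by the explained variance of each component.
    scores = (np.abs(pca.components_)
              * pca.explained_variance_ratio_[:, None]).sum(axis=0)
    return np.argsort(scores)[::-1][:n_features]
```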

Example 10.23: The method according to any of the previous Examples 10.20 to 10.22, wherein said processing the data of said selected datasets comprises: generating an ageing model of said selected pigment, said ageing model configured to estimate said data identifying the ageing of the selected pigment as a function of the data of said set of features.

Example 10.24: The method according to Example 10.23, wherein said generating an ageing model of said selected pigment as a function of the data of said set of features comprises:

- training a machine learning method, wherein said machine learning method receives at input said set of features and provides at output said data identifying the ageing of the selected pigment.
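A minimal sketch of such a training step is given below (Python, assuming scikit-learn; a small feed-forward Artificial Neural Network, one of the options named in Example 10.25 below, serves as the ageing model):

```python
from sklearn.neural_network import MLPRegressor

def train_ageing_model(X, y):
    """X: rows of selected feature values from the stored datasets;
    y: the observed ageing of the selected pigment."""
    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000)
    model.fit(X, y)       # supervised learning on the historical data
    return model          # model.predict(X_new) then estimates the ageing
```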

Example 10.25: The method according to Example 10.24, wherein said machine learning method is an Artificial Neural Network or a Support Vector Machine.

Example 10.26: The method according to any of the previous Examples 10.23 to 10.25, wherein determining (1006) a maximum illumination threshold for the illumination of the respective pigment comprises:

- varying the values of the features corresponding to intensity values for respective wavelengths or wavelength ranges; and
- observing the data identifying the ageing of the selected pigment as estimated by said ageing model.

Example 10.27: The method according to Example 10.26, wherein said varying the features corresponding to intensity values for respective wavelengths or wavelength ranges comprises:

- receiving a requested illumination spectrum for the illumination of said artwork to be illuminated, said requested illumination spectrum comprising relative intensity values for respective wavelengths or wavelength ranges;

- varying the value of one of said features corresponding to intensity values for respective wavelengths or wavelength ranges; and
- calculating the values of the other of said features corresponding to intensity values for respective wavelengths or wavelength ranges as a function of the value of said one of said features and said requested illumination spectrum.

Example 10.28: The method according to Example 10.27, wherein determining (1006) a maximum illumination threshold for the illumination of the respective pigment comprises:

- selecting a maximum ageing value for said data identifying the ageing of the selected pigment;
- varying the value of said one of said features and calculating the values of the other of said features a plurality of times, each time observing the data identifying the ageing of the selected pigment as estimated by said ageing model; and
- selecting the maximum value of said one of said features and the respective values of the other of said features for which the data identifying the ageing of the selected pigment is below said maximum ageing value.
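For illustration, the search of Examples 10.26 to 10.28 may be sketched as follows (Python with NumPy; the sweep range, the step count and the feature layout are assumptions):

```python
import numpy as np

def max_intensity_for_pigment(model, spectrum, max_ageing, other_features,
                              i_max=1000.0, steps=200):
    """Sweep one intensity feature; the remaining intensities follow the
    requested illumination spectrum (Example 10.27). Return the largest
    value whose predicted ageing stays below max_ageing (Example 10.28)."""
    spectrum = np.asarray(spectrum, dtype=float)   # relative values, [0] > 0
    best = 0.0
    for v in np.linspace(0.0, i_max, steps):       # vary one feature...
        intensities = v * spectrum / spectrum[0]   # ...others follow the spectrum
        x = np.concatenate([intensities, other_features])[None, :]
        if model.predict(x)[0] < max_ageing:       # observe estimated ageing
            best = v
    return best
```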

Example 10.29: The method according to Example 10.28, wherein said set of features comprises said temperature and/or said humidity, and wherein determining (1006) a maximum illumination threshold for the illumination of the respective pigment comprises:

- receiving measured values of said temperature and/or said humidity; and
- using said measured values of said temperature and/or said humidity as input of said ageing model.

Example 10.30: The method according to any of the previous Examples 10.15 to 10.29, wherein said determining a maximum illumination threshold for the illumination of said artwork (140) to be illuminated comprises:

- determining the minimum value of said maximum illumination thresholds of said pigments; and
- using said minimum value as said maximum illumination threshold for the illumination of said artwork (140).

Example 10.31: The method according to any of the previous Examples 10.1 to 10.30, wherein said controlling (1012) the illumination of said artwork (140) to be illuminated comprises: sending one or more control commands (CMD) to said at least one light fixture (110).

Example 10.32: The method according to Example 10.31, wherein said one or more light fixtures (110) are configured to vary the illumination generated by said one or more light fixtures (110) as a function of said one or more control commands (CMD).

For example, in various embodiments, the method of Example 8 may be used for this purpose.

Example 10.33: A lighting system comprising at least one light fixture (110) configured to illuminate an artwork (140) in an exposition area (160) and a control system (130L, 130R) configured to implement the method according to any of the previous Examples 10.1 to 10.32.

Example 10.34: The lighting system according to Example 10.33, wherein said control system (130L, 130R) comprises: a local control system (130L) configured to control operation of said at least one light fixture (110); and a remote control system configured to implement the method according to any of the previous Examples 10.1 to 10.32.

Example 10.35: A computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for implementing the method according to any of Examples 10.1 to 10.32.

Example 10.36: A non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to perform steps of the method according to any of Examples 10.1 to 10.32.

Example 11

Example 11.1: A method of producing a translucent optical element (115₂) for a light fixture (110), wherein said translucent optical element (115₂) is implemented with a translucent material (1150, 1152) comprising a first surface (1154) for receiving a light radiation (Φi) and an opposite second surface (1156) for providing an attenuated second light radiation (Φt), wherein said second surface (1156) is arranged at a given variable thickness (L) from said first surface (1154), the method comprising the steps of:

- obtaining a first matrix of first light intensity values (Φi), wherein each first light intensity value (Φi) is associated with a respective area of said first surface (1154) and identifies the intensity of light expected to enter the respective area of said first surface (1154);
- obtaining a second matrix of second light intensity values (Φt) having the same dimension as said first matrix, wherein each second light intensity value (Φt) is associated with a respective area of said second surface (1156) and identifies the intensity of light requested to exit the respective area of said second surface (1156) when the expected intensity of light enters said first surface;
- calculating a matrix of light transmission ratios (T) having the same dimension as said first matrix and said second matrix, wherein each light transmission ratio (T) is calculated as a function of a respective first light intensity value (Φi) and a respective second light intensity value (Φt);
- obtaining an attenuation factor of said translucent material (1150, 1152);
- calculating a matrix of thickness values (L) having the same dimension as said matrix of light transmission ratios (T), wherein each thickness value (L) is calculated as a function of a respective light transmission ratio (T) and said attenuation factor of said translucent material (1150, 1152), and wherein said matrix of thickness values (L) identifies the requested thickness of said translucent material (1150, 1152) between said first surface (1154) and said second surface (1156) in order to obtain said intensity of light requested to exit said second surface (1156) when said expected intensity of light enters said first surface; and
- producing said translucent optical element (115₂) by shaping said translucent material (1150, 1152) as a function of said matrix of thickness values (L).
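The calculation chain can be made concrete with an assumed exponential (Beer-Lambert-type) attenuation; the Example only requires the thickness to be "a function of" the transmission ratio and the attenuation factor, so the model below is an illustrative choice (Python with NumPy):

```python
import numpy as np

def thickness_matrix(phi_in, phi_out, mu):
    """phi_in:  matrix of intensities expected to enter the first surface (1154)
    phi_out: matrix of intensities requested to exit the second surface (1156)
    mu:      attenuation factor of the translucent material, e.g. in 1/mm"""
    T = np.asarray(phi_out, float) / np.asarray(phi_in, float)  # transmission ratios
    return -np.log(T) / mu    # from T = exp(-mu * L): L = -ln(T) / mu
```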

Example 11.2: The method according to Example 11.1, wherein said light fixture (110) comprises a light module (118) configured to emit light and a first set of optical elements (115₁) arranged in the light path of said light emitted by said light module (118), and wherein said translucent optical element (115₂) is configured to be mounted in a first plane (1106) at a first distance (d1) from said first set of optical elements (115₁).

Example 11.3: The method according to Example 11.2, wherein said first plane (1106) is perpendicular to the optical axis (502) of the light provided by the first set of optical elements (115₁).

Example 11.4: The method according to Example 11.2 or Example 11.3, comprising mounting said translucent optical element (115₂) at said first distance (d1) from said first set of optical elements (115₁).

Example 11.5: The method according to any of Examples 11.2 to 11.4, wherein said first set of optical elements (115₁) comprises at least one lens and/or reflector.

Example 11.6: The method according to any of Examples 11.2 to 11.5, wherein said obtaining a first matrix of first light intensity values (Φi) comprises: determining (1112, 1114) the beam pattern of the light provided by the first set of optical elements (115₁) in said first plane (1106).

Example 11.7: The method according to Example 11.6, wherein said determining (1114) the beam pattern in said plane (1106) comprises: measuring (1112) a beam pattern of the light provided by said first set of optical elements (115₁) in said first plane (1106).

Example 11.8: The method according to Example 11.6, wherein said determining (1114) the beam pattern in said plane (1106) comprises:

- measuring a beam pattern of the light provided by said first set of optical elements (115₁) in a second plane; and
- calculating the beam pattern in said first plane (1106) via geometrical projection of the measured beam pattern.

Example 11.9: The method according to Example 11.8, wherein said second plane is perpendicular to the optical axis (502) of the light provided by the first set of optical elements (115₁), said second plane being at a distance greater than said first distance (d1).

Example 11.10: The method according to Example 11.6, wherein said light fixture (110) comprises a second set of optical elements (115₃) mounted at a second distance (d2) from said translucent optical element (115₂).

Example 11.11: The method according to Example 11.10, wherein said determining (1114) the beam pattern in said plane (1106) comprises:

- obtaining an optical transfer function of said second set of optical elements (115₃);
- measuring a beam pattern of the light provided by said second set of optical elements (115₃) in a second plane; and
- calculating the beam pattern in said first plane (1106) via geometrical projection of the measured beam pattern and as a function of the optical transfer function of the second set of optical elements (115₃).

Example 11.12: The method according to Example 11.11, wherein said second plane is perpendicular to the optical axis (502) of the light provided by the second set of optical elements (115₃), said second plane being at a distance greater than the sum of said first distance (d1) and said second distance (d2).

Example 11.13: The method according to any of the previous Examples 11.10 to 11.12, wherein said second set of optical elements (115₃) comprises a framer, shutter or gobo having an aperture.

Example 11.14: The method according to Example 11.13, comprising mounting said translucent optical element (115₂) in said aperture of said framer, shutter or gobo.

Example 11.15: The method according to any of the previous Examples 11.6 to 11.14, wherein measuring (1112) a beam pattern in a plane comprises:

- positioning a reference surface in said plane;
- illuminating said reference surface with the light generated by said light module (118); and
- measuring the light intensity of the light reflected by said reference surface with a camera (120).

Example 11.16: The method according to Example 11.15, wherein said reference surface is a Lambertian surface.

Example 11.17: The method according to Example 11.15 or Example 11.16, wherein said reference surface is a wall (163) of an exposition area (160) where an artwork (140) should be fixed.

Example 11.18: The method according to any of the previous Examples 11.1 to 11.17, wherein said light fixture (110) is configured to illuminate an artwork (140).

Example 11.19: The method according to Example 11.18, wherein said obtaining a second matrix of second light intensity values (Φt) comprises:

- obtaining requested illumination values for said artwork (140);
- obtaining the position of said artwork (140) with respect to said light fixture (110); and
- calculating a requested beam pattern in said first plane (1106) via geometrical projection of the requested illumination values as a function of the position of said artwork (140) with respect to said light fixture (110).

Example 11.20: The method according to Example 11.19, wherein said light fixture (110) comprises a second set of optical elements (115₃) mounted at a second distance (d2) from said translucent optical element (115₂), and wherein said calculating a requested beam pattern comprises:

- obtaining an optical transfer function of said second set of optical elements (115₃); and
- calculating the requested beam pattern as a function of the optical transfer function of said second set of optical elements (115₃).

Example 11.21: The method according to Example 11.19 or 11.20, wherein said calculating a matrix of light transmission ratios comprises: calculating a ratio between a respective second light intensity value (Φt) and a respective first light intensity value (Φi).

Example 11.22: The method according to any of the previous Examples 11.1 to 11.21, comprising:

- calculating a further matrix of further thickness values (L') having a greater dimension than said matrix of thickness values (L) by interpolation of said thickness values (L); and
- producing said translucent optical element (115₂) by shaping said translucent material (1150, 1152) as a function of said further matrix of thickness values (L').

Example 11.23: The method according to any of the previous Examples 11.1 to 11.22, wherein said producing said translucent optical element (115₂) by shaping said translucent material (1150, 1152) comprises one of:

- producing said translucent optical element (115₂) via injection molding;
- producing said translucent optical element (115₂) via a material removal process; or
- producing said translucent optical element (115₂) via additive manufacturing.

Example 11.24: The method according to any of the previous Examples 11.1 to 11.23, wherein said translucent material (1150, 1152) comprises a plastic material, such as a thermoplastic material, e.g. polycarbonate (PC) or acrylic/polymethyl methacrylate (PMMA), a silicone or a glass material.

Example 11.25: The method according to any of the previous Examples 11.1 to 11.24, wherein said translucent material (1150, 1152) comprises absorbing and/or scattering particles (1152) distributed in a base material (1150).

Example 11.26: The method according to Example 11.25, wherein said particles (1152) are particles of Al₂O₃, SiO₂ or TiO₂.

Example 11.27: A light fixture (110) comprising: a light module (118) comprising one or more light sources (117) configured to emit light, a first set of optical elements (115₁) arranged in the light path of said light emitted by said light module (118), and a translucent optical element (115₂) produced with the method according to any of the previous Examples 11.1 to 11.26 and mounted in a plane (1106) at a distance (d1) from said first set of optical elements (115₁).

Example 11.28: The light fixture (110) according to Example 11.27, wherein said first set of optical elements (115₁) comprises at least one of:

- a reflector for the light emitted by said light module (118);
- a micro-reflector structure, wherein a micro-reflector is arranged in correspondence with each light source (117) or a set of light sources (117);
- a collimator lens or lens structure arranged in front of said light module (118); and
- a micro-lens structure, wherein one or more micro-lenses are arranged in correspondence with each light source (117).

Example 11.29: The light fixture (110) according to Example 11.27 or Example 11.28, comprising a second set of optical elements (115₃) mounted at a second distance (d2) from said translucent optical element (115₂).

Example 11.30: The light fixture (110) according to Example 11.29, wherein said second set of optical elements (115₃) comprises a framer, shutter or gobo having an aperture.

Example 11.31: The light fixture (110) according to Example 11.30, wherein said translucent optical element (115₂) is mounted in said aperture of said framer, shutter or gobo.

Example 11.32: Various embodiments also relate to a respective software tool configured to determine automatically the thickness values (L), such as a computer-program product that can be loaded into the memory of at least one processor and comprises portions of software code for generating a technical specification adapted to be used to produce a translucent optical element (115₂) for a light fixture (110), wherein said translucent optical element (115₂) is implemented with a translucent material (1150, 1152) comprising a first surface (1154) for receiving a light radiation (Φi) and an opposite second surface (1156) for providing an attenuated second light radiation (Φt), wherein said second surface (1156) is arranged at a given variable thickness (L) from said first surface (1154), said portions of software code being configured to implement the steps of:

- obtaining a first matrix of first light intensity values (Φi), wherein each first light intensity value (Φi) is associated with a respective area of said first surface (1154) and identifies the intensity of light expected to enter the respective area of said first surface (1154);
- obtaining a second matrix of second light intensity values (Φt) having the same dimension as said first matrix, wherein each second light intensity value (Φt) is associated with a respective area of said second surface (1156) and identifies the intensity of light requested to exit the respective area of said second surface (1156) when the expected intensity of light enters said first surface;
- calculating a matrix of light transmission ratios (T) having the same dimension as said first matrix and said second matrix, wherein each light transmission ratio (T) is calculated as a function of a respective first light intensity value (Φi) and a respective second light intensity value (Φt);
- obtaining an attenuation factor of said translucent material (1150, 1152); and
- calculating a matrix of thickness values (L) having the same dimension as said matrix of light transmission ratios (T), wherein each thickness value (L) is calculated as a function of a respective light transmission ratio (T) and said attenuation factor of said translucent material (1150, 1152), and wherein said matrix of thickness values (L) identifies the requested thickness of said translucent material (1150, 1152) between said first surface (1154) and said second surface (1156) in order to obtain said intensity of light requested to exit said second surface (1156) when said expected intensity of light enters said first surface.

Example 11.33: The computer-program product according to Example 11.32, wherein said portions of software code are configured to implement the method according to any of the previous Examples 11.2 to 11.22.

Example 11.34: Similarly, various embodiments relate to a non-transitory computer-readable medium storing instructions that, when executed, cause a computing device to generate a technical specification adapted to be used to produce a translucent optical element (115₂) for a light fixture (110), wherein said translucent optical element (115₂) is implemented with a translucent material (1150, 1152) comprising a first surface (1154) for receiving a light radiation (Φi) and an opposite second surface (1156) for providing an attenuated second light radiation (Φt), wherein said second surface (1156) is arranged at a given variable thickness (L) from said first surface (1154), said instructions, when executed, causing said computing device to perform the steps of:

- obtaining a first matrix of first light intensity values (Φi), wherein each first light intensity value (Φi) is associated with a respective area of said first surface (1154) and identifies the intensity of light expected to enter the respective area of said first surface (1154);
- obtaining a second matrix of second light intensity values (Φt) having the same dimension as said first matrix, wherein each second light intensity value (Φt) is associated with a respective area of said second surface (1156) and identifies the intensity of light requested to exit the respective area of said second surface (1156) when the expected intensity of light enters said first surface;
- calculating a matrix of light transmission ratios (T) having the same dimension as said first matrix and said second matrix, wherein each light transmission ratio (T) is calculated as a function of a respective first light intensity value (Φi) and a respective second light intensity value (Φt);
- obtaining an attenuation factor of said translucent material (1150, 1152); and
- calculating a matrix of thickness values (L) having the same dimension as said matrix of light transmission ratios (T), wherein each thickness value (L) is calculated as a function of a respective light transmission ratio (T) and said attenuation factor of said translucent material (1150, 1152), and wherein said matrix of thickness values (L) identifies the requested thickness of said translucent material (1150, 1152) between said first surface (1154) and said second surface (1156) in order to obtain said intensity of light requested to exit said second surface (1156) when said expected intensity of light enters said first surface.

Example 11.35: The non-transitory computer-readable medium according to Example 11.34, wherein said instructions, when executed, cause said computing device to perform steps of the method according to any of the previous Examples 11.2 to 11.22.

Example 12

Generally, the various embodiments described herein may also be combined to form advanced methods of illuminating an artwork (140) in an exposition area (160) with a lighting system (100) comprising one or more light fixtures (110).

For example, according to Example 11.1, a translucent optical element may be mounted in a light fixture (110) in order to generate a requested (nominal) illumination of said artwork (140), such as requested illumination values provided by an artist. Specifically, the translucent optical element (115₂) is implemented with a translucent material (1150, 1152) comprising a first surface (1154) for receiving a light radiation (Φi) and an opposite second surface (1156) for providing an attenuated second light radiation (Φt), wherein the second surface (1156) is arranged at a given variable thickness (L) from said first surface (1154).

Specifically, according to Example 11.2, the light fixture (110) may comprise a light module (118) configured to emit light and a first set of optical elements (115₁) arranged in the light path of said light emitted by said light module (118), wherein said translucent optical element (115₂) is configured to be mounted in a first plane (1106) at a first distance (d1) from said first set of optical elements (115₁).

In order to determine the characteristics of the translucent optical element (115₂), the method may first send a control command to said light module (118) in order to operate the light module (118) with a first/nominal operating condition, e.g. characterized by a given light intensity and spectral characteristics. Next the method obtains a first matrix of first light intensity values (Φi), wherein each first light intensity value (Φi) is associated with a respective area of the first surface (1154) and identifies the intensity of light expected to enter the respective area of the first surface (1154). For example, according to Example 11.6, the method may obtain the first matrix of first light intensity values (Φi) by determining (1112, 1114) the beam pattern of the light provided by the first set of optical elements (115₁) in the first plane (1106), when the light module (118) operates in said first/nominal operating condition.

Moreover, the method obtains a second matrix of second light intensity values (Φt) having the same dimension as the first matrix, wherein each second light intensity value (Φt) is associated with a respective area of the second surface (1156) and identifies the intensity of light requested to exit the respective area of the second surface (1156) when the expected intensity of light enters the first surface. For example, according to Example 11.19, the method may determine the second matrix of second light intensity values (Φt) by obtaining requested (nominal) illumination values for the artwork (140), obtaining the position of said artwork (140) with respect to the light fixture (110), and calculating a requested beam pattern in the first plane (1106) via geometrical projection of the requested illumination values as a function of the position of the artwork (140) with respect to the light fixture (110).

Accordingly, once having determined the first matrix of first light intensity values (Φi) and the second matrix of second light intensity values (Φt), the method may calculate a matrix of light transmission ratios (T) having the same dimension as the first matrix and the second matrix, wherein each light transmission ratio (T) is calculated as a function of a respective first light intensity value (Φi) and a respective second light intensity value (Φt).

Next the method may obtain an attenuation factor of the translucent material (1150, 1152) and calculate a matrix of thickness values (L) having the same dimension as the matrix of light transmission ratios (T), wherein each thickness value (L) is calculated as a function of a respective light transmission ratio (T) and the attenuation factor of the translucent material (1150, 1152), and wherein the matrix of thickness values (L) identifies the requested thickness of the translucent material (1150, 1152) between the first surface (1154) and the second surface (1156) in order to obtain the intensity of light requested to exit the second surface (1156) when the expected intensity of light enters said first surface. Finally, the translucent optical element (115₂) may be produced by shaping said translucent material (1150, 1152) as a function of the matrix of thickness values (L) and, according to Example 11.4, the translucent optical element (115₂) may be mounted at the first distance (d1) from the first set of optical elements (115₁).

Accordingly, thanks to the translucent optical element, the artwork (140) is illuminated with the requested (nominal) illumination/intensity values when the light module (118) operates in the first/nominal operating condition. However, this does not take into account background light, nor the fact that the requested (nominal) illumination values, in particular the spectral/color characteristics, may be changed, e.g. based on the viewer's eye characteristics.

For example, according to Example 2.1, the method may obtain data identifying a viewer’s eye characteristics (210) and determine modified illumination values by modifying the requested (nominal) illumination values as a function of the viewer’s eye characteristics (210). Specifically, the method may then generate one or more control commands in order to vary the characteristics of the light emitted by the one or more light fixtures (110) as a function of the modified illumination values.

For example, in order to ensure that the artwork is indeed illuminated with the modified illumination values, the method may use a light sensor (120) installed in the exposition area (160). Specifically, according to Example 6.1, the light sensor (120) may be configured to measure a global and/or a plurality of local light intensity values of the light (600) reflected by the artwork (140) for at least one wavelength or wavelength range. Accordingly, in this case the method may determine a global and/or a plurality of local light intensities at said artwork (140) by: during a calibration phase (610-618), obtaining a global and/or a plurality of local light intensities at the artwork (140) for at least one wavelength or wavelength range and measuring via the light sensor (120) the global and/or local light intensity values of the light (600) reflected by the artwork (140); during a training phase, determining a mathematical function or a dataset adapted to estimate the global and/or the plurality of local light intensities at the artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by the artwork (140); and during a normal operation phase (630-640), measuring via the light sensor (120) the global and/or the plurality of local light intensity values of the light reflected by the artwork (140), and estimating via the mathematical function or dataset the global and/or the plurality of local light intensities at the artwork (140) as a function of the global and/or the plurality of measured light intensity values of the light reflected by the artwork (140).

Thus, the method may send one or more control commands to the light fixture (110) in order to vary the (intensity and/or color) characteristics of the light emitted by the light fixture (110), such that the estimated global and/or plurality of local light intensities at the artwork (140) correspond to the modified illumination values.
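As a minimal sketch of the calibration/training/estimation scheme described above (Python with NumPy; the least-squares linear fit is one possible choice for the "mathematical function", and the calibration numbers are made-up illustrative values):

```python
import numpy as np

# Calibration phase: sensor readings of the reflected light paired with
# reference intensities measured directly at the artwork (illustrative data).
readings   = np.array([[12.1], [24.5], [36.8], [49.0]])   # sensor values
at_artwork = np.array([50.0, 100.0, 150.0, 200.0])        # lux at the artwork

# Training phase: fit at_artwork ~ a * reading + b by least squares.
A = np.hstack([readings, np.ones((len(readings), 1))])
(a, b), *_ = np.linalg.lstsq(A, at_artwork, rcond=None)

# Normal operation phase: estimate the intensity at the artwork from a new
# sensor reading of the reflected light.
def estimate_at_artwork(reading):
    return a * reading + b
```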

Of course, without prejudice to the principle of the invention, the details of construction and the embodiments may vary widely with respect to what has been described and illustrated herein purely by way of example, without thereby departing from the scope of the present invention, as defined by the ensuing claims.

While various embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, embodiments may be practiced otherwise than as specifically described and claimed. Embodiments of the present disclosure are directed to each individual feature, system, aspect, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, aspects, articles, materials, kits, and/or methods, if such features, systems, aspects, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure. Particularly, any element of the disclosure and any aspect thereof may be combined, in any order and any combination, with any other element of the disclosure and any aspect thereof.

The above-described embodiments can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.

Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.

Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.

Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, and intelligent network (IN) or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.

The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine. In this respect, various disclosed concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the disclosure discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present disclosure as discussed above.

The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present disclosure need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present disclosure. Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.

Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.

Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.

All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms. The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”

The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.

As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.

As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or “at least one of A or B,” or “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

In the claims, as well as in the disclosure above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the eighth edition as revised in July 2010 of the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

For the purpose of this disclosure and the claims that follow, the term “connect” has been used to describe how various elements interface or couple. Such described interfacing or coupling of elements may be either direct or indirect.

Thus, although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as preferred forms of implementing the claims.

GLOSSARY

Actuator

Actuators comprise components or devices, usually as part of a machine, that can transform electric, hydraulic or pneumatic energy into a mechanical movement. Actuators can be used to move parts of the illumination system like the light fixture or the sensor or to move parts of the light fixture or the sensor.

Artificial Intelligence

Data fusion describes a process, in which information coming from various data sources is combined in order to describe a state. In order to obtain information on a state, observations or measurements are performed. The objective of data fusion is to provide a decision rule for mapping the observations into the state space. In this respect, artificial intelligence or machine learning refers to a method, wherein historical data are used for this purpose, thereby adapting the estimation based on a learning process. Thus, generally, an artificial intelligence or machine learning method receives at input a set of observations/features and provides at output an estimate of the state. For example, the historical data may be used online, e.g. in order to find most similar situations (e.g. based on a Mean Square Error), or offline, e.g. in order to generate via supervised or unsupervised learning methods a model (usually described via one or more mathematical functions) adapted to estimate the state based on the observations/features. For example, well-known machine learning methods using supervised learning comprise Artificial Neural Networks and Support Vector Machines. For example, an Artificial Neural Network uses a given number of input nodes (corresponding to the observations/features), output nodes (corresponding to the state to be estimated) and various hidden nodes (between the input nodes and output nodes) usually organized as layers. Such nodes essentially define a parameterized mathematical function, and a supervised learning method may be used to adapt the parameters of the mathematical function based on the historical data.

Artist

An Artist is a person (or Artificial Intelligence computer) who wants to present an artwork. An artist may want to specify preferred lighting settings for the Artwork, for example candle light, sunrise, midday or evening lighting scenarios, or lighting under a certain angle and beam spread, or changing colors according to a pre-defined or ad-hoc generated time table or schedule. Such a lighting setting may be specified by color temperature or color location, or based on reference lighting settings, Image Representing Digital Data Set (DDS), spectral reference data or color tables. An art gallery might take such settings and transform them into operating commands for each of the installed fixtures so that such input data and related lighting settings are adjusted and represented as best as possible.

An Artist may allow storing such preferred lighting conditions in an Artist Lighting Scenario Matrix (ALSM) so that it can be used by an Art Gallery or by a display device APP or GUI for proper transformation of digital image data (DDS) into pixelated display settings. An artist may even want to include personal eye deficiency data into that Artist Lighting Scenario Matrix (ALSM).

Artwork

An artwork may be a painting, a picture, a sculpture, an assortment of various pieces of art, people and the like. An artwork may encompass self-lit objects. An artwork may be located in any exposition area, such as an art museum or an art gallery or exhibition or somewhere else (inside and outside of a building). An artwork may be illuminated by natural and/or artificial light for viewing purposes and for taking photographs and image recording of the artwork.

Seeing an Artwork may be done live in person (on-site), or remotely using a display or image projection device.

An artwork may be illuminated with daylight, candle light or any other artificial light source, also in combination with one another. Light sources emit light that may be characterized by its intensity, frequency, polarization, direction and beam spread. Every illumination setting can (theoretically) be described by an Illumination Matrix (IM).
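
By way of a hypothetical sketch only (the disclosure does not prescribe any concrete encoding), an Illumination Matrix could be represented in Python as a list of per-source records covering the properties named above; all field names and units are assumptions.

```python
# Illustrative sketch only: one possible in-memory representation of an
# Illumination Matrix (IM). Field names and units are assumptions.
from dataclasses import dataclass

@dataclass
class LightComponent:
    intensity: float         # e.g. illuminance contribution in lux
    wavelength_nm: float     # dominant wavelength (frequency)
    polarization: str        # e.g. "none", "linear", "circular"
    direction_deg: tuple     # (azimuth, elevation) of irradiating angle
    beam_spread_deg: float   # beam diameter / spread

# An illumination setting combining daylight and an artificial source.
illumination_matrix = [
    LightComponent(300.0, 555.0, "none", (180.0, 60.0), 120.0),  # daylight
    LightComponent(150.0, 620.0, "none", (90.0, 45.0), 25.0),    # spotlight
]
```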

An artwork may be illuminated with various kinds of light sources at the same time, under different irradiating angles and beam diameters. Image taking (e.g. with a camera) may be done under various lighting situations and under various positions and angles with respect to the illuminated artwork. This means that image measurement (image taking) is functionally related to the artwork illumination (ambient and artificial), the reflectivity features of the Artwork and the image measurement characteristics. An artwork may be represented by a (multi-dimensional) Digital Data Set (DDS).

Camera

A Camera, e.g. a CCD or CMOS camera, uses optics and photoelectrical sensor chips for pixelated image measurement, as well as software for digitizing and transforming such data using representations in color spaces. Usually, a camera needs to have filter segments placed in front of a sensor chip, for example RGB filters in a Bayer configuration/setting, in order to allow for color perception and respective measurement. However, filter segments, chips and signal processing will show some kind of variation.
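
As a purely illustrative sketch, the following Python fragment shows how the R, G and B sub-mosaics of a Bayer-filtered sensor frame can be separated; the RGGB layout and the array shape are assumptions, and real cameras additionally interpolate (demosaic) and correct for the per-channel variation mentioned above.

```python
# Illustrative sketch only: separating the R, G and B filter sites of a
# raw sensor frame with an assumed RGGB Bayer configuration.
import numpy as np

raw = np.arange(16, dtype=float).reshape(4, 4)  # dummy 4x4 raw frame

r = raw[0::2, 0::2]    # red filter sites
g1 = raw[0::2, 1::2]   # first set of green sites
g2 = raw[1::2, 0::2]   # second set of green sites
b = raw[1::2, 1::2]    # blue filter sites

g = (g1 + g2) / 2.0    # naive green estimate per 2x2 cell
```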

Control System

The control system is a device which gathers data from at least one sensor and controls sensors and light fixtures. The control system can include a data processing unit, a data storage device, a user interface and an interface.

Data processing unit

A data processing unit is a computing unit which receives data (e.g. from a sensor or from a data storage device), manipulates them according to some rules or algorithms, and outputs information that can be displayed e.g. on a graphical user interface to interact with a user, or outputs control data to control e.g. sensors or light fixtures. A data processing unit can be part of a smartphone, a computer system, or be situated non-locally in the cloud. It can be connected to other devices via an interface. The data processing unit can comprise additional devices, e.g. a clock.

Data storage device

A data storage device is a physical device which can store data, e.g. a hard-drive, an optical disc or a flash memory. The data can be stored on the data storage device in a simple file-structure or using a database. The database can be based on blockchain-technology. The data storage device can be physically connected to a data processing unit or it can be placed in the cloud and connected via an interface.

Display

A Display may be any kind of currently used (or anticipated for future use) display or displaying unit, for example LCD-Displays, AMOLED-Displays, Laser Projection Devices, Plasma Screens, Augmented and Virtual Reality Glasses, and the like. Each display has its own color representation possibilities, viewing angles, brightness range and related limitations. A display or a displaying unit (e.g. a laser projector) has some kind of control unit with data processing, data storing and data communication capabilities that is configured to select, calculate and apply Optical Transfer Functions to a provided Digital Image Representation, so that a pixelated image can be properly displayed.

Display device

A Display Device has the ability to transform digital image contents into a visual representation, i.e. it has the ability to transfer digital image data (however complex) into electronic commands for display or projection pixel control. A Display Device may be equipped with an APP or Graphical User Interface (GUI) that allows a user to change display settings, for example based on personal preferences or based on the Artist’s input.

Eye Testing Device

An Eye Testing Device and Method is any kind of device/method that is intended to measure visual eye characteristics, like color perception. Due to complexity, and certainly also due to incomplete understanding of eye function and color recognition, only some aspects have so far been accessible for research and testing. Since every human is affected by ageing, certain eye deficiencies occur over time, like Presbyopia (difficulty with near vision focus), cataract, glaucoma and macular degeneration. Some of these conditions may be measured with standard test procedures like Ishihara plate tests, Holmgren tests and Farnsworth tests. Other people may be affected by color blindness or color deficiencies, like Protanomaly (red weak), Deuteranomaly (green weak) and Tritanomaly (blue weak). Certain methods of detection of eye deficiencies have been described in the literature. Some approaches have been developed to mitigate this problem by providing re-coloring techniques, especially by color enhancing or color change for computer-based vision.

Human eye

A Human Eye is an organ of perception. A human eye is a very complex biological product that finally transfers signals to the visual cortex area of the brain. Color perception is based on many influencing factors (both physiological and psychological). A human eye and therefore color perception will degenerate over time caused by biological shortcomings and degenerations. Therefore, each human eye has its own visual characteristics and limitations. This means that the eye performance is correlated to the age of a person.

Illumination system

An illumination system can comprise a light fixture, a sensor, a control unit and/or a user interface. An illumination system can modify via the control system the illumination (e.g. intensity, color, color temperature, illumination pattern) based on pre-defined rules, sensor input or inputs received via the user interface. It can also allow easy commissioning of the light fixtures and the sensors and support interaction with people.

Interface

The interface can connect the light fixture, the sensor and the control system to other light fixtures, sensors or control systems. The interface may be wired. Alternatively, or in addition, the interface may be wireless, such as the wireless communication standard available under the trade designation “Bluetooth”, or the wireless personal-area-network standard available under the trade designation “Zigbee”, or a wireless local-area-network (WLAN), or may be an optical wireless communication.

Light fixture

Artificial light is provided by Lighting Fixtures. A light fixture is used to illuminate objects or persons. It can be controlled by a control system. The light fixture is configured to emit light at different colors, color temperatures and intensities.

The light fixture can contain a light module with light sources. The light sources can be of the same kind (e.g. white color of the same color temperature) or of different kinds (e.g. red, blue and green). The light sources can be driven by a driver. The illumination provided by the light fixture can be static or dynamic.

The light fixture can also contain a data storage device, a data processing unit, optics, an interface or actuators.

Object

An object can be anything the illumination system interacts with, e.g. a stage, a building, a room, or an artwork.

Optical Transfer Function

An Image Transfer Function (ITF) or Optical Transfer Function (OTF) describes how an incoming light distribution is changed when passing through, or being deflected or reflected by, the next optical element, or when passing from one Display Medium to another. In this context, an Optical Transfer Function (OTF) describes how an Image Representing Digital Data Set (DDS) is changed when passed on to the next image displaying process in the chain. This means that a process chain may have several Optical Transfer Functions (OTF), including the eye’s Optical Transfer Function (e.g. represented in a viewer’s personal eye data set). For all this, data compressing algorithms may be used.
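
For illustration only (not a disclosed implementation), the following Python sketch models such a process chain as a sequence of transfer functions applied one after another to an image data set; the two placeholder transforms are assumptions standing in for real display and eye OTFs.

```python
# Illustrative sketch only: a process chain modeled as a composition of
# Optical Transfer Functions applied to an image data set (DDS).
# The concrete transforms below are placeholders, not disclosed OTFs.
import numpy as np

dds = np.full((2, 2, 3), 0.5)            # dummy RGB image data set

def display_otf(img):
    return np.clip(img * 1.1, 0.0, 1.0)  # e.g. display gain and limits

def eye_otf(img):
    return img ** (1 / 2.2)              # e.g. perceptual response

# Several OTFs applied in chain, in display order.
chain = [display_otf, eye_otf]
out = dds
for otf in chain:
    out = otf(out)
```

Sensor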

A sensor is a device, module or subsystem whose purpose is to detect events or changes in its environment and send the information to other electronics, such other electronics usually having a data processing unit.

A sensor in present embodiments can be a resistive, a capacitive, an inductive, a magnetic, an optical (e.g. a camera or spectral sensor), an acoustic and/or a chemical sensor.

A sensor can comprise a camera sensor, a Lidar sensor for measurements in the infrared wavelength range, a radar sensor, or an acoustic sensor for measurement in the infrasound, audible and ultrasound frequency range.

A sensor can be infrared sensitive and measure for example the presence and location of humans or animals.

A sensor can be connected directly or indirectly to a data storage device or a data processing unit through its interface.

Software

Software relates to instructions (software code) to be executed by one or more programmable processors. For example, such a processor may range from a low-complexity microprocessor, e.g. of a microcontroller, to a multi-processor system, e.g. of a modern computer or a distributed cloud system. However, software code may also be used to program other programmable devices, such as Programmable Gate Arrays. For example, modern lighting systems may include a light fixture, a sensor, a local control system and a remote control system. All these elements of the lighting system may comprise one or more programmable processors. Accordingly, the operations of the various elements described herein may often be implemented by programming these processors in a suitable manner. The reference to "at least one processor" is evidently intended to highlight the possibility that the respective operation may be implemented in a modular and/or distributed form. The software executed by a processor is usually stored in a memory. For example, the software to be executed by a processor may already be stored in the memory during the production phase, e.g. in the case of a microcontroller of a light fixture, may be provided on a computer-readable, non-transitory medium, or may be downloaded from the Internet, e.g. when an application is to be installed on a computer, smartphone or tablet. Thus, when using the term software, reference is made to a computer-program product, loadable into the memory of at least one processor and comprising portions of software code capable of implementing the disclosed operations when the product is run on at least one processor. Moreover, as used herein, a reference to a computer-program product is understood to be equivalent to a reference to a computer-readable, non-transitory medium containing the respective instructions.

User

A user is a person who interacts with the illumination system. A user can be the operator of the illumination system who defines rules for how the system should behave. Alternatively, a user can be a person (e.g. a visitor of a museum or an actor on a stage) who is illuminated by the illumination system and whose behavior and actions are measured and detected by the sensor. Specifically, a Viewer is a person who wants to perceive an Artwork. A user (U) may be a person with rather standard (normal) Visual Perception (UVP) but may still have preferences for certain lighting conditions and color perception (UVPP). A user may know the individual Eye Deficiencies Matrix (EDM) based on a variety of measuring methods. Such matrices (UVP, EDM) may be stored on a chip card or be otherwise electronically available (e.g. cloud based). An on-site or off-site user (viewer) may be willing to provide the personal UVP and EDM data to an Art Gallery, an APP Provider or a Display Device Provider and allow use of such data for changed or improved image display and lighting settings. For all this, data compressing algorithms may be used.
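
As a hypothetical illustration only, such user data could be grouped into a single record as sketched below in Python; the field names and values are assumptions, not a format defined by this disclosure.

```python
# Illustrative sketch only: a minimal record pairing a user's visual
# perception preferences (UVP) with an eye deficiency matrix (EDM), as
# it might be stored on a chip card or in the cloud. All field names
# and values are assumptions.
user_profile = {
    "uvp": {"preferred_cct_K": 3000, "preferred_intensity_lux": 200},
    "edm": {"protanomaly": 0.1, "deuteranomaly": 0.0, "tritanomaly": 0.0},
}
```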

LIST OF REFERENCE SIGNS

100 Illumination System

110 Light Fixture

111 Interface of Light Fixture

112 Data Storage Device (DSD) of Light Fixture

113 Data Processing Unit (DPU) of Light Fixture

1130 Microprocessor

1132 Memory

114 Actuator of Light Fixture

115 Optics

116 Driver

116a..116d Terminals

116e Input filter

116f Rectifier

116g Filter of PFC converter

116h Switching stage

116i Output filter

116k Feedback circuit

116m Control circuit

117 Light Source

118 Light Module

118a..118d Terminals

120 Sensor

121 Interface of Sensor

122 Data Storage Device (DSD) of Sensor

123 Data Processing Unit (DPU) of Sensor

124 Actuator of Sensor

130 Control System

131 Interface of Control System

132 Data Storage Device (DSD) of Control System

133 Data Processing Unit (DPU) of Control System

134 User Interface

140 Object (e.g. Artwork)

141 Carrier material of object 140

142 Surface of object 140

142a Upper side of surface 142

142b Lower side of surface 142

143 Frame

144 Color layer

150 Person

160 Exposition area

161 Ceiling

162 Floor

163 Wall

164 Window

165 Door

200 Database

202 Light fixture database

204 Exposition area database

206 Artwork database

208 Artist illumination database

210 Visitor’s eye database

212 Display device database

214 Camera database

216 Image database

218 Sensor database

220 Memory card

222 Mobile device

224 Printed support

230 Camera

240 Image

250 Display device

300 Start step

302 Receive characteristics of the exposition area and artwork

304 Determine recommended configuration of light fixtures

306 Display recommended set of light fixtures and the respective settings

308 Select a different set of light fixtures and/or different settings

310 Select set of light fixtures and settings

312 Generate configuration parameters and/or technical specification

314 Stop step

320 Select type of artwork

322, 324, 326 Verify room height

328 Verify brightness of room

330 Dataset, e.g. look-up table

400 Start step

402 Select settings for natural and artificial light sources

404 Simulate model

406 Verification step

408 Determine expected illumination of artworks

410 Select set of light sensors

412 Determine expected measurement values of the light sensors

414 Stop step

500 Light radiation

502 Optical axis

504 Timer module/device

506 Cloud/Internet

508 Camera

510 Identifier

520 Obtain intensity data

522 Obtain spatial positioning data

524 Obtain spatial radiation data

526 Determine global and/or local intensity values

528 Obtain global and/or local sensitivity data

530 Compare data

600 Reflected light

602 Actuator for changing position

610 Illuminate object 140

612 Measure illumination of object 140

614 Calculate/estimate illumination of object 140

616 Measure light reflected by object 140

618 Change settings of light fixture(s) 110

630 Measure light reflected by object 140

632 Obtain reference value

634 Compare reflected light with reference value

636 Change settings of light fixture(s) 110

638 Maintain settings of light fixture(s) 110

640 Determine new reference value

700 Reference luminance target (RLT)

702 Reflected light

710 Illuminate object 140

712 Determine illumination of object 140 and/or RLT 700

714 Measure light reflected by RLT 700

716 Change settings of light fixture(s) 110

720 Training step

730 Determine light at artwork

800 Preset configuration data

802 Start step

804 Receive command or sensor data

806 Verify whether to read new preset configuration data or to adapt the configuration data

808 Read new preset configuration data

810 Adapt configuration data

900 Power supply circuit

902 Regulated voltage or current generator

904 Means for disabling the regulated voltage or current generator

906 Light output control module

908 Light flux regulation module

910 Output current control module

920 Start step

922 Calibration step

924 Set reference signal for requested light flux

926 Adapt reference signal

928 Determine current threshold values

930 Verify value of measured output current

932 Generate error signal

934 Stop step

936 Adapt reference signal

1000 Start step

1002 Receive data identifying pigments of an artwork

1004 Store data to database

1006 Determine maximum illumination of given pigments of the artwork

1008 Verify whether other pigments have to be processed

1010 Determine maximum illumination of all pigments of the artwork

1012 Send maximum illumination values to one or more light fixtures

1014 Stop step

1100 First curve of light intensity values

1102 Second curve of light intensity values

1104 Reference surface

1106 Plane for mounting a translucent optical element within a light fixture

1110 Start step

1112 Measure beam pattern

1114 Determine expected light intensity values

1116 Receive data identifying a requested illumination of an artwork

1118 Determine requested light intensity values

1120 Determine properties of the translucent optical element

1122 Produce translucent optical element

1124 Mount translucent optical element within light fixture

1126 Stop step

1150 Base material of a translucent optical element

1152 Particles dispersed in the base material of the translucent optical element

1154 First surface of the translucent optical element 1156 Second surface of the translucent optical element