

Title:
SPECTRAL UPSAMPLING
Document Type and Number:
WIPO Patent Application WO/2023/152490
Kind Code:
A1
Abstract:
A spectral upsampling method is disclosed. The method comprises receiving a target illuminant colour and receiving or retrieving a number, n, of basis spectra. The method further comprises minimising a colour difference between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as an inner product of a weighted sum of spectral power distributions of the n basis spectra with a respective colour matching function. The method further comprises outputting the n weights corresponding to the estimated illumination colour having the minimised colour difference.

Inventors:
GUARNERA GIUSEPPE (GB)
GITLINA YULIYA (GB)
GHOSH ABHIJEET (GB)
DESCHAINTRE VALENTIN (GB)
Application Number:
PCT/GB2023/050285
Publication Date:
August 17, 2023
Filing Date:
February 08, 2023
Assignee:
LUMIRITHMIC LTD (GB)
International Classes:
G01J3/46; F21K9/65; G01J3/28; G06T7/90; G06T15/50; G06V10/141; H04N1/60
Foreign References:
US20170318178A12017-11-02
Other References:
MALLETT IAN ET AL: "Spectral Primary Decomposition for Rendering with sRGB Reflectance", EUROGRAPHICS SYMPOSIUM ON RENDERING, 1 January 2019 (2019-01-01), XP093039456
WENZEL JAKOB ET AL: "A Low-Dimensional Function Space for Efficient Spectral Upsampling", COMPUTER GRAPHICS FORUM: JOURNAL OF THE EUROPEAN ASSOCIATION FOR COMPUTER GRAPHICS, WILEY-BLACKWELL, OXFORD, vol. 38, no. 2, 7 June 2019 (2019-06-07), pages 147-155, XP071545496, ISSN: 0167-7055, DOI: 10.1111/CGF.13626
NIMIER-DAVID M., VICINI D., ZELTNER T., JAKOB W.: "Mitsuba 2: A retargetable forward and inverse renderer", ACM TRANS. GRAPH., vol. 38, no. 6, November 2019 (2019-11-01)
PHARR M., JAKOB W., HUMPHREYS G.: "Physically Based Rendering: From Theory to Implementation", 2016, MORGAN KAUFMANN
ARAD B., BEN-SHAHAR O.: "Sparse recovery of hyperspectral signal from natural rgb images", EUROPEAN CONFERENCE ON COMPUTER VISION, 2016, SPRINGER, pages 19-34
UNGER J., WENGER A., HAWKINS T., GARDNER A., DEBEVEC P.: "Eurographics Workshop on Rendering", 2003, THE EUROGRAPHICS ASSOCIATION, article "Optimizing Color Matching in a Lighting Reproduction System for Complex Subject and Illuminant Spectra"
JAKOB W., HANIKA J.: "A low-dimensional function space for efficient spectral upsampling", COMPUTER GRAPHICS FORUM (PROCEEDINGS OF EUROGRAPHICS), vol. 38, no. 2, March 2019 (2019-03-01)
MALLETT I., YUKSEL C.: "Eurographics Symposium on Rendering - DL-only and Industry Track", 2019, THE EUROGRAPHICS ASSOCIATION, article "Spectral Primary Decomposition for Rendering with sRGB Reflectance"
AKBARINIA A., GEGENFURTNER K. R.: "Color metamerism and the structure of illuminant space", J. OPT. SOC. AM. A, vol. 35, no. 4, April 2018 (2018-04-01), pages B231-B238
BIANCO S.: "Reflectance spectra recovery from tristimulus values by adaptive estimation with metameric shape correction", J. OPT. SOC. AM. A, vol. 27, no. 8, August 2010 (2010-08-01), pages 1868-1877
CIE: "Publication 15.2-1986 Colorimetry, 2nd edition", Tech report, 1986
DEBEVEC P.: "Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques", 1998, ASSOCIATION FOR COMPUTING MACHINERY, article "Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography", pages 189-198
DEBEVEC P., GRAHAM P., BUSCH J., BOLAS M.: "ACM SIGGRAPH 2012 Talks", 2012, ASSOCIATION FOR COMPUTING MACHINERY, article "A singleshot light probe"
DEBEVEC P., HAWKINS T., TCHOU C., DUIKER H.-P., SAROKIN W., SAGAR M.: "Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques", 2000, ACM PRESS/ADDISON-WESLEY PUBLISHING CO., article "Acquiring the reflectance field of a human face", pages 145-156
DEEP K., SINGH K. P., KANSAL M., MOHAN C.: "A real coded genetic algorithm for solving integer and mixed integer optimization problems", APPLIED MATHEMATICS AND COMPUTATION, vol. 212, no. 2, 2009, pages 505-518, XP026099479, DOI: 10.1016/j.amc.2009.02.044
FOSTER D. H., AMANO K., NASCIMENTO S. M. C., FOSTER M. J.: "Frequency of metamerism in natural scenes", J. OPT. SOC. AM. A, vol. 23, no. 10, October 2006 (2006-10-01), pages 2359-2372
FENG G., FOSTER D. H.: "Predicting frequency of metamerism in natural scenes by entropy of colors", J. OPT. SOC. AM. A, vol. 29, no. 2, February 2012 (2012-02-01), pages A200-A208
FAIRCHILD M. D., ROSEN M. R., JOHNSON G. M.: "Acquisition and reproduction of color images: colorimetric and multispectral approaches", 2001, RIT-MUNSELL COLOR SCIENCE LABORATORY
GUARNERA G. C., BIANCO S., SCHETTINI R.: "Turning a digital camera into an absolute 2d tele-colorimeter", COMPUTER GRAPHICS FORUM, vol. 38, no. 1, 2019, pages 73-86
GITLINA Y., GUARNERA G. C., DHILLON D. S., HANSEN J., LATTAS A., PAI D., GHOSH A.: "Practical measurement and reconstruction of spectral skin reflectance", COMPUTER GRAPHICS FORUM, vol. 39, no. 4, 2020, pages 75-89, XP071489947, DOI: 10.1111/cgf.14055
GRASSMANN H.: "Zur theorie der farbenmischung", ANNALEN DER PHYSIK, vol. 165, no. 5, 1853, pages 69-84
JUNG A., WILKIE A., HANIKA J., JAKOB W., DACHSBACHER C.: "Wide gamut spectral upsampling with fluorescence", COMPUTER GRAPHICS FORUM, vol. 38, no. 4, 2019, pages 87-96, XP071489725, DOI: 10.1111/cgf.13773
LEGENDRE C., YU X., DEBEVEC P.: "Optimal led selection for multispectral lighting reproduction", ELECTRONIC IMAGING, vol. 8, 2017, pages 25-32
DEBEVEC P.: "Practical multispectral lighting reproduction", ACM TRANS. GRAPH., vol. 35, no. 4, July 2016 (2016-07-01)
MENG J., SIMON F., HANIKA J., DACHSBACHER C.: "Physically meaningful rendering using tristimulus colours", COMPUTER GRAPHICS FORUM, vol. 34, no. 4, 2015, pages 31-40, XP055898613, DOI: 10.1111/cgf.12676
PARK J., LEE M., GROSSBERG M. D., NAYAR S. K.: "Multispectral imaging using multiplexed illumination", 2007 IEEE 11TH INTERNATIONAL CONFERENCE ON COMPUTER VISION, 2007, pages 1-8
PETERS C., MERZBACH S., HANIKA J., DACHSBACHER C.: "Using moments to represent bounded signals for spectral rendering", ACM TRANS. GRAPH., vol. 38, no. 4, July 2019 (2019-07-01), XP058686695, DOI: 10.1145/3306346.3322964
RATNASINGAM S., HERNANDEZ-ANDRES J.: "Illuminant spectrum estimation at a pixel", J. OPT. SOC. AM. A, vol. 28, no. 4, April 2011 (2011-04-01), pages 696-703
SHI J., YU H., HUANG X., CHEN Z., TAI Y.: "Optoelectronic Imaging and Multimedia Technology III", vol. 9273, 2014, SPIE, article "Illuminant spectrum estimation using a digital color camera and a color chart", pages 1-9
TOMINAGA S., FUKUDA T.: "Color Imaging XII: Processing, Hardcopy, and Applications", vol. 6493, 2007, SPIE, article "Omnidirectional scene illuminant estimation using a multispectral imaging system", pages 352-359
TOMINAGA S., TANAKA N.: "Omnidirectional scene illuminant estimation using a mirrored ball", JOURNAL OF IMAGING SCIENCE AND TECHNOLOGY, vol. 50, no. 3, 2006, pages 217-227
TODOVA L., WILKIE A., FASCIONE L.: "Eurographics Symposium on Rendering - DL-only Track", 2021, THE EUROGRAPHICS ASSOCIATION, article "Moment-based Constrained Spectral Uplifting"
WANDELL B. A.: "The synthesis and analysis of color images", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. PAMI-9, no. 1, 1987, pages 2-13, XP008073377, DOI: 10.1109/TPAMI.1987.4767868
WYSZECKI G., STILES W. S.: "Color Science: Concepts and Methods, Quantitative Data and Formulae", 1982, WILEY
CIE: "Commission Internationale de l'Eclairage proceedings", 1931, CAMBRIDGE UNIVERSITY PRESS
SHARMA G., WU W., DALAL E. N.: "The CIEDE2000 color-difference formula: Implementation notes, supplementary test data, and mathematical observations", COLOR RESEARCH AND APPLICATION, vol. 30, 2005, pages 21-30
"SMPTE Engineering Guideline - Digital Source Processing - Color Processing for D-Cinema", EG 432-1:2010, 10 November 2010 (2010-11-10), pages 1-81
"SMPTE Recommended Practice - D-Cinema Quality - Reference Projector and Environment", RP 431-2:2011, 6 April 2011 (2011-04-06), pages 1-14
Attorney, Agent or Firm:
DUNLEAVY, Christopher (GB)
Claims:
Claims

1. A spectral upsampling method comprising: receiving a target illuminant colour; receiving or retrieving a number, n, of basis spectra; minimising a colour difference between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as an inner product of a weighted sum of spectral power distributions of the n basis spectra with a respective colour matching function; and outputting the n weights corresponding to the estimated illumination colour having the minimised colour difference.

2. The method of claim 1 wherein the step of minimising the colour difference between the target illuminant colour and an estimated illumination colour is performed by employing a stochastic optimisation method.

3. The method of claim 1 or 2 wherein the initial weight of each of the n basis spectra is the inverse of the colour difference calculated between the target illuminant and that basis spectrum expressed in the two or more stimulus values.

4. The method of claim 3 wherein the initial weight of each of the n basis spectra is calculated using the equation:

w_k = (1/N) · (1/ΔE94(Lab_ref, Lab_k))

where N is a normalisation factor computed as N = Σ_{k=1}^{n} 1/ΔE94(Lab_ref, Lab_k), Lab_ref is the CIE Lab colour of the target illuminant, Lab_k is the CIE Lab colour of the kth basis spectrum and ΔE94 is the colour difference between Lab_ref and Lab_k.

5. The method of any one of claims 2 to 4, wherein the stochastic optimisation method comprises convex optimisation.

6. The method of any one of claims 2 to 5, wherein the stochastic optimisation method comprises a genetic algorithm.

7. The method of any one of claims 2 to 6, wherein the stochastic optimisation method comprises a neural algorithm.

8. The method of any one of claims 1 to 7, wherein the basis spectra are selected from the group consisting of: red, green, blue, 2700K white, 4000K white and 5700K white light emitting diodes;

Gaussian-generated bases; or standard illuminant bases.

9. The method of any one of claims 1 to 8, wherein the target illuminant colour is sampled using a light probe.

10. The method of any one of claims 1 to 9, further comprising driving n sets of light sources with intensities corresponding to the n output weights, each set of light sources including one or more lights configured to emit light having the respective basis spectrum.

11. The method of any one of claims 1 to 10, wherein the target illuminant colour comprises colour values from a region of an image.

12. The method of any one of claims 1 to 11 wherein the colour difference which is minimised is calculated according to:

ΔE94( Lab_ref , XYZ→Lab( Σ_{k=1}^{n} a_k ⟨ S_k(λ), x̄ȳz̄(λ) ⟩ ) )

where Lab_ref is the Lab colour of the target illuminant, a_k are the n weights, XYZ→Lab represents a conversion that maps CIE XYZ values to the Lab colour space, S_k is a set of basis spectra and x̄ȳz̄(λ) are the CIE 1931 2° colour matching functions.

13. An apparatus comprising: n sets of light sources, each set corresponding to a distinct basis spectrum; a controller configured, in response to receiving or retrieving a target illuminant colour: to retrieve or calculate n weights corresponding to the target illuminant colour, to drive the n light sources such that the relative intensities of the n light sources correspond to the n weights; wherein the n weights correspond to a minimised colour difference, the colour difference being between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as the inner product of a weighted sum of the spectral power distributions of the n basis spectra with a respective colour matching function.

14. An apparatus of claim 13 wherein the n sets of light sources are comprised in a lighting booth.

15. An apparatus of claim 14, wherein the n sets of light sources are arranged to illuminate a volume within the lighting booth.

16. An apparatus of claim 13 or 14 wherein the n sets of light sources are comprised in a light stage or light cage.

17. An apparatus of claim 13, wherein the n sets of light sources are comprised in a display.

18. An apparatus of claim 17, wherein the display is a Digital Cinema Initiatives Protocol 3 display.

Description:
Spectral Upsampling

Field

The present method relates to spectral upsampling methods, and to methods and apparatuses implementing spectrally upsampled information. Specifically, the method includes using image-based representations such as light probes, and is applied to drive improved spectral lighting reproduction.

Background

Illumination is one of the fundamental elements of virtual scenes, alongside shape/geometry and materials, and has a major effect on the overall scene appearance. Modern rendering software allows scenes with complex spectral reflectance and illumination to be rendered, precisely simulating optical phenomena such as iridescence, refraction or diffraction (Nimier-David M., Vicini D., Zeltner T., Jakob W.: Mitsuba 2: A retargetable forward and inverse renderer. ACM Trans. Graph. 38, 6 (Nov. 2019); Pharr M., Jakob W., Humphreys G.: Physically Based Rendering: From Theory to Implementation. Morgan Kaufmann, 2016). Spectral illumination may help to improve the accuracy of on-set lighting reproduction (Wenger A., Hawkins T., Debevec P.: Optimizing Color Matching in a Lighting Reproduction System for Complex Subject and Illuminant Spectra. In Eurographics Workshop on Rendering (2003), Dutre P., Suykens F., Christensen P. H., Cohen-Or D., (Eds.), The Eurographics Association). The main challenge in obtaining these benefits is that spectral data for illumination is not commonly available, and methods for measuring spectral data are complex, time consuming and often require specialised equipment.

While recent works have proposed to facilitate RGB to spectral reflectance upsampling for driving spectral rendering, the creation or acquisition of spectral illumination remains a challenge, particularly for environmental illumination (Jakob W., Hanika J.: A low-dimensional function space for efficient spectral upsampling. Computer Graphics Forum (Proceedings of Eurographics) 38, 2 (Mar. 2019); Mallett I., Yuksel C.: Spectral Primary Decomposition for Rendering with sRGB Reflectance. In Eurographics Symposium on Rendering - DL-only and Industry Track (2019), Boubekeur T., Sen P., (Eds.), The Eurographics Association). Previous work on spectral upsampling has relied on the smoothness of reflectance spectra, a property not shared by most illumination spectra.

References:

ARAD B., BEN-SHAHAR O.: Sparse recovery of hyperspectral signal from natural rgb images. In European Conference on Computer Vision (2016), Springer, pp. 19-34. Hereinafter "ARAD2016".

AKBARINIA A., GEGENFURTNER K. R.: Color metamerism and the structure of illuminant space. J. Opt. Soc. Am. A 35, 4 (Apr 2018), B231-B238. Hereinafter "AKBARINIA2018".

BIANCO S.: Reflectance spectra recovery from tristimulus values by adaptive estimation with metameric shape correction. J. Opt. Soc. Am. A 27, 8 (Aug 2010), 1868-1877. Hereinafter "BIANCO2010".

CIE: Publication 15.2-1986 Colorimetry, 2nd edition. Tech report (1986). Hereinafter "CIE1986".

CIE: Publication 116-1995 Industrial colour-difference evaluation. Commission Internationale de l'Eclairage. Hereinafter "CIE1995".

DEBEVEC P.: Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography. In Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques (New York, NY, USA, 1998), SIGGRAPH '98, Association for Computing Machinery, pp. 189-198. Hereinafter "DEBEVEC1998".

DEBEVEC P., GRAHAM P., BUSCH J., BOLAS M.: A singleshot light probe. In ACM SIGGRAPH 2012 Talks (New York, NY, USA, 2012), SIGGRAPH '12, Association for Computing Machinery. Hereinafter "DEBEVEC2012".

DEBEVEC P., HAWKINS T., TCHOU C., DUIKER H.-P., SAROKIN W., SAGAR M.: Acquiring the reflectance field of a human face. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques (USA, 2000), SIGGRAPH '00, ACM Press/Addison-Wesley Publishing Co., pp. 145-156. Hereinafter "DEBEVEC2000".

DEEP K., SINGH K. P., KANSAL M., MOHAN C.: A real coded genetic algorithm for solving integer and mixed integer optimization problems. Applied Mathematics and Computation 212, 2 (2009), 505-518. Hereinafter “DEEP2009”.

FOSTER D. H., AMANO K., NASCIMENTO S. M. C., FOSTER M. J.: Frequency of metamerism in natural scenes. J. Opt. Soc. Am. A 23, 10 (Oct 2006), 2359-2372. Hereinafter "FOSTER2006".

FENG G., FOSTER D. H.: Predicting frequency of metamerism in natural scenes by entropy of colors. J. Opt. Soc. Am. A 29, 2 (Feb 2012), A200-A208. Hereinafter "FENG2012".

FAIRCHILD M. D., ROSEN M. R., JOHNSON G. M.: Spectral and metameric color imaging. Technical Report, RIT-Munsell Color Science Laboratory (2001). Hereinafter "FAIRCHILD2001".

GUARNERA G. C., BIANCO S., SCHETTINI R.: Turning a digital camera into an absolute 2d tele-colorimeter. Computer Graphics Forum 38, 1 (2019), 73-86. Hereinafter “GUARNERA2019”.

GITLINA Y., GUARNERA G. C., DHILLON D. S., HANSEN J., LATTAS A., PAI D., GHOSH A.: Practical measurement and reconstruction of spectral skin reflectance. Computer Graphics Forum 39, 4 (2020), 75-89. Hereinafter "GITLINA2020".

GRASSMANN H.: Zur theorie der farbenmischung [On the theory of colour mixing]. Annalen der Physik 165, 5 (1853), 69-84. Hereinafter "GRASSMANN1853".

HARDEBERG J. Y.: Acquisition and reproduction of color images: colorimetric and multispectral approaches. Universal Publishers, 2001. Hereinafter "HARDEBERG2001".

JAKOB W., HANIKA J.: A low-dimensional function space for efficient spectral upsampling. Computer Graphics Forum (Proceedings of Eurographics) 38, 2 (Mar. 2019). Hereinafter "JAKOB2019".

JUNG A., WILKIE A., HANIKA J., JAKOB W., DACHSBACHER C.: Wide gamut spectral upsampling with fluorescence. Computer Graphics Forum 38, 4 (2019), 87-96. Hereinafter “JUNG2019”.

LEGENDRE C., YU X., DEBEVEC P.: Optimal led selection for multispectral lighting reproduction. Electronic Imaging 2017, 8 (2017), 25-32. Hereinafter "LEGENDRE2017".

LEGENDRE C., YU X., LIU D., BUSCH J., JONES A., PATTANAIK S., DEBEVEC P.: Practical multispectral lighting reproduction. ACM Trans. Graph. 35, 4 (July 2016). Hereinafter "LEGENDRE2016".

MENG J., SIMON F., HANIKA J., DACHSBACHER C.: Physically meaningful rendering using tristimulus colours. Computer Graphics Forum 34, 4 (2015), 31-40. Hereinafter "MENG2015".

MALLETT I., YUKSEL C.: Spectral Primary Decomposition for Rendering with sRGB Reflectance. In Eurographics Symposium on Rendering - DL-only and Industry Track (2019), Boubekeur T., Sen P., (Eds.), The Eurographics Association. Hereinafter "MALLETT2019".

NIMIER-DAVID M., VICINI D., ZELTNER T., JAKOB W.: Mitsuba 2: A retargetable forward and inverse renderer. ACM Trans. Graph. 38, 6 (Nov. 2019). Hereinafter "NIMIER-DAVID2019".

PARK J., LEE M., GROSSBERG M. D., NAYAR S. K.: Multispectral imaging using multiplexed illumination. In 2007 IEEE 11th International Conference on Computer Vision (2007), pp. 1-8. Hereinafter "PARK2007".

PETERS C., MERZBACH S., HANIKA J., DACHSBACHER C.: Using moments to represent bounded signals for spectral rendering. ACM Trans. Graph. 38, 4 (July 2019). Hereinafter “PETERS2019”.

RATNASINGAM S., HERNANDEZ-ANDRES J.: Illuminant spectrum estimation at a pixel. J. Opt. Soc. Am. A 28, 4 (Apr 2011), 696-703. Hereinafter "RATNASINGAM2011".

SHI J., YU H., HUANG X., CHEN Z., TAI Y.: Illuminant spectrum estimation using a digital color camera and a color chart. In Optoelectronic Imaging and Multimedia Technology III (2014), Dai Q., Shimura T., (Eds.), vol. 9273, International Society for Optics and Photonics, SPIE, pp. 1-9. Hereinafter "SHI2014".

TOMINAGA S., FUKUDA T.: Omnidirectional scene illuminant estimation using a multispectral imaging system. In Color Imaging XII: Processing, Hardcopy, and Applications (2007), Eschbach R., Marcu G. G., (Eds.), vol. 6493, International Society for Optics and Photonics, SPIE, pp. 352 - 359. Hereinafter “TOMINAGA2007”.

TOMINAGA S., TANAKA N.: Omnidirectional scene illuminant estimation using a mirrored ball. Journal of Imaging Science and Technology 50, 3 (2006), 217- 227. Hereinafter “TOMINAGA2006”.

TODOVA L., WILKIE A., FASCIONE L.: Moment-based Constrained Spectral Uplifting. In Eurographics Symposium on Rendering - DL-only Track (2021), The Eurographics Association. Hereinafter "TODOVA2021".

UNGER J., WENGER A., HAWKINS T., GARDNER A., DEBEVEC P.: Capturing and Rendering With Incident Light Fields. In Eurographics Workshop on Rendering (2003), Dutre P., Suykens F., Christensen P. H., Cohen-Or D., (Eds.), The Eurographics Association. Hereinafter “UNGER2003”.

WANDELL B. A.: The synthesis and analysis of color images. IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI- 9, 1 (1987), 2-13. Hereinafter “WANDELL1987”.

WENGER A., HAWKINS T., DEBEVEC P.: Optimizing Color Matching in a Lighting Reproduction System for Complex Subject and Illuminant Spectra. In Eurographics Workshop on Rendering (2003), Dutre P., Suykens F., Christensen P. H., Cohen-Or D., (Eds.), The Eurographics Association. Hereinafter "WENGER2003".

WYSZECKI G., STILES W. S.: Color Science: Concepts and Methods, Quantitative Data and Formulae. Wiley, 1982. Hereinafter "WYSZECKI1982".

Summary

According to a first aspect of the invention, there is provided a spectral upsampling method. The method comprises receiving a target illuminant colour and receiving or retrieving a number, n, of basis spectra. The method further comprises minimising a colour difference between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as an inner product of a weighted sum of spectral power distributions of the n basis spectra with a respective colour matching function and outputting the n weights corresponding to the estimated illumination colour having the minimised colour difference.

The number n may be greater than or equal to 1. Stimulus values may be RGB tri-stimulus values. Stimulus values may be CIE XYZ or CIELab values.

Colour matching functions may be the CIE 1931 2° Standard Observer colour matching functions: x̄(λ), ȳ(λ), and z̄(λ).

The colour difference may be any suitable colour difference, for example, the ΔE94 colour difference (ΔE94), ΔE*ab, ΔE*00, CIEDE2000, or sRGB colour space differences.

The target illuminant colour may be expressed in the CIE Lab colour system. If the target illuminant colour is not expressed in the CIE Lab colour space (also referred to as colour representation), the target illuminant colour may be converted to the CIE Lab colour space. Any suitable colour space may be used, for example, Digital Cinema Initiative Protocol 3 (DCI-P3), Adobe® Red, Green, Blue (RGB), standard Red, Green, Blue (sRGB), ITU-R Recommendation BT.709 (Rec. 709), ITU-R Recommendation BT.2020 (Rec. 2020), or ColorEdge 318-4K (CG318-4K).
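By way of illustration only, a conversion from a non-linear sRGB target colour to CIE Lab can follow the standard IEC 61966-2-1 and CIE formulae. The minimal Python/NumPy sketch below assumes an sRGB input with the D65 reference white; the specific colour space and white point are assumptions, not part of the disclosed method.

```python
import numpy as np

# Linear-sRGB -> CIE XYZ matrix (IEC 61966-2-1, D65 white point)
SRGB_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                        [0.2126729, 0.7151522, 0.0721750],
                        [0.0193339, 0.1191920, 0.9503041]])
D65_WHITE = np.array([0.95047, 1.00000, 1.08883])  # XYZ of D65, Y normalised to 1


def srgb_to_lab(rgb):
    """Convert a non-linear sRGB triplet (values in [0, 1]) to CIE Lab (D65)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB transfer function
    linear = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = SRGB_TO_XYZ @ linear
    # CIE XYZ -> CIE Lab
    t = xyz / D65_WHITE
    delta = 6.0 / 29.0
    f = np.where(t > delta ** 3, np.cbrt(t), t / (3 * delta ** 2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])


# Example: a neutral grey target illuminant colour
print(srgb_to_lab([0.5, 0.5, 0.5]))
```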

The inner products of the weighted sum of the spectral power distributions of the n basis spectra may be converted to the same colour space as the target illumination prior to calculating the colour difference. A colour space is a specific organization of colours, for example, standard Red, Green, Blue (sRGB), Adobe RGB etc. The n basis spectra may be synthetically computed or may be measured.

The n basis spectra may span wavelengths between 400 nm and 700 nm or between 380 nm and 780 nm. The n basis spectra may cover more than 50%, more than 60%, more than 70%, more than 80% or more than 90% of wavelengths between 380 nm and 780 nm.

The method may further comprise measuring the spectral power distributions of the n basis spectra.

Basis spectra may comprise absolute values. Basis spectra may comprise relative values.

The step of minimising the colour difference between the target illuminant colour and an estimated illumination colour may be performed by employing a stochastic optimisation method.

The initial weight of each of the n basis spectra may be the inverse of the colour difference calculated between the target illuminant and that basis spectrum expressed in the two or more stimulus values.

The initial weight of each of the n basis spectra may be calculated using the equation:

w_k = (1/N) · (1/ΔE94(Lab_ref, Lab_k))

where N is a normalisation factor computed as N = Σ_{k=1}^{n} 1/ΔE94(Lab_ref, Lab_k), Lab_ref is the CIE Lab colour of the target illuminant, Lab_k is the CIE Lab colour of the kth basis spectrum and ΔE94 is the colour difference between Lab_ref and Lab_k.
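As a worked illustration of this initialisation only, the sketch below computes w_k = (1/N)·1/ΔE94(Lab_ref, Lab_k) in Python/NumPy, using a compact graphic-arts ΔE94 (kL = kC = kH = 1). The basis Lab colours are placeholder values; in practice they would come from projecting each basis spectrum to CIE Lab.

```python
import numpy as np


def delta_e94(lab1, lab2):
    """CIE ΔE94 colour difference (graphic-arts weights kL = kC = kH = 1)."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1 = np.hypot(a1, b1)
    C2 = np.hypot(a2, b2)
    dC = C1 - C2
    dH_sq = max((a1 - a2) ** 2 + (b1 - b2) ** 2 - dC ** 2, 0.0)
    S_C = 1.0 + 0.045 * C1
    S_H = 1.0 + 0.015 * C1
    return np.sqrt(dL ** 2 + (dC / S_C) ** 2 + dH_sq / S_H ** 2)


def initial_weights(lab_ref, basis_labs, eps=1e-6):
    """w_k proportional to 1 / ΔE94(Lab_ref, Lab_k), normalised to sum to 1."""
    inv = np.array([1.0 / (delta_e94(lab_ref, lab_k) + eps) for lab_k in basis_labs])
    return inv / inv.sum()


# Placeholder CIE Lab colours of a target illuminant and of n basis spectra
lab_ref = np.array([80.0, 5.0, 10.0])
basis_labs = [np.array([60.0, 70.0, 50.0]),   # red-ish basis (illustrative)
              np.array([85.0, -70.0, 70.0]),  # green-ish basis (illustrative)
              np.array([40.0, 20.0, -90.0]),  # blue-ish basis (illustrative)
              np.array([82.0, 2.0, 25.0])]    # warm white basis (illustrative)
print(initial_weights(lab_ref, basis_labs))
```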

The stochastic optimisation method may comprise convex optimisation. The stochastic optimisation method may comprise a genetic algorithm or a neural algorithm. The basis spectra may be selected from the group consisting of: red, green, blue, 2700K white, 4000K white and 5700K white light emitting diodes; Gaussian-generated bases; or standard illuminant bases. The target illuminant colour may be sampled using a light probe.

The method may further comprise using the light probe to sample the target illuminant colour, for example, an RGB light probe. The method may further comprise driving n sets of light sources with intensities corresponding to the n output weights, each set of light sources including one or more lights configured to emit light having the respective basis spectrum.

The, or each, light source may include, or take the form of, an LED. For example, the n sets of light sources may be mounted on an LED sphere.

The target illuminant colour may comprise colour values from a region of an image.

The region of the image may comprise one or more pixels. The target illuminant colour may be calculated as an average colour across the region.

The colour difference which may be minimised is calculated according to:

ΔE94( Lab_ref , XYZ→Lab( Σ_{k=1}^{n} a_k ⟨ S_k(λ), x̄ȳz̄(λ) ⟩ ) )

where Lab_ref is the Lab colour of the target illuminant, a_k are the n weights, XYZ→Lab represents a conversion that maps CIE XYZ values to the Lab colour space, S_k is a set of basis spectra and x̄ȳz̄(λ) are the CIE 1931 2° colour matching functions.
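The estimated illumination colour referred to above can be sketched as follows: the weighted sum of the basis spectral power distributions is integrated against each colour matching function to give the XYZ stimulus values. In the sketch below, the CMF and basis arrays are assumed to be sampled on a common wavelength grid (for example, the CIE 1931 2° tables resampled at 5 nm steps) and are treated as inputs rather than reproduced; the random stand-in data is for illustration only.

```python
import numpy as np


def estimated_xyz(weights, basis_spds, cmfs, d_lambda=5.0):
    """XYZ stimulus values of a weighted sum of basis spectra.

    weights    : (n,) array of non-negative basis weights a_k
    basis_spds : (n, m) array, spectral power distribution of each basis over m wavelengths
    cmfs       : (m, 3) array, colour matching functions x̄, ȳ, z̄ on the same wavelengths
    d_lambda   : wavelength step in nm (Riemann-sum approximation of the inner product)
    """
    mixed_spd = np.asarray(weights) @ np.asarray(basis_spds)   # (m,) weighted sum
    return mixed_spd @ np.asarray(cmfs) * d_lambda             # (3,) inner products


# Usage sketch with random stand-in data (real CMFs/SPDs would be loaded from tables)
rng = np.random.default_rng(0)
basis_spds = rng.random((6, 81))   # six basis spectra over e.g. 380-780 nm at 5 nm
cmfs = rng.random((81, 3))
print(estimated_xyz(np.full(6, 1.0 / 6.0), basis_spds, cmfs))
```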

According to a second aspect of the invention, there is provided an apparatus comprising n sets of light sources, each set corresponding to a distinct basis spectrum. The apparatus further comprises a controller configured, in response to receiving or retrieving a target illuminant colour: to retrieve or calculate n weights corresponding to the target illuminant colour, and to drive the n light sources such that the relative intensities of the n light sources correspond to the n weights. The n weights correspond to a minimised colour difference, the colour difference being between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as the inner product of a weighted sum of the spectral power distributions of the n basis spectra with a respective colour matching function.

The n weights may be calculated using the method according to the first aspect. The apparatus may include features corresponding to any features and/or steps of the method of the first aspect. The n sets of light sources may be comprised in a lighting booth.

The n sets of light sources may be arranged to illuminate a volume within the lighting booth. The n sets of light sources may be arranged to illuminate an object or part of an object, for example a human head or a human face.

The light sources may be configured in the lighting booth such that a user may select a target illumination colour, or select a spectrum based on a target illumination colour. For example, a user may select from pre-sets including natural daylight and several different spectra of artificial illumination.

The n sets of light sources may be comprised in a light stage or light cage. The apparatus may further comprise a camera.

The n sets of light sources may be comprised in a display, for example, a Digital Cinema Initiatives Protocol 3 display.

Brief Description of the Drawings

Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:

Figure 1 is a process flow diagram of a method of spectral upsampling; Figure 2 is a schematic block diagram of a lighting apparatus;

Figure 3A is a representation of the environmental illumination used in Figures 3B and 3C;

Figure 3B is a comparison of toy figurines photographed in an outdoor lighting environment (b), and corresponding lighting reproduction inside a multispectral LED sphere using RGB LEDs (a), vs spectral lighting reproduction inside the LED sphere using the proposed spectral upsampling method (c);

Figure 3C is a comparison of zoomed in crop of the toy figurines in the same lighting environments as Figure 3B;

Figure 4 is the measured spectral power distributions of 6 different LED types employed as the basis for spectral upsampling of RGB illumination;

Figure 5A is a photograph generated from the provided measured spectral data using CIE 1931 (CIE (1932). Commission Internationale de l'Eclairage proceedings, 1931. Cambridge: Cambridge University Press) 2-degree Colour Matching Functions, as only spectral data are available;

Figure 5B is a result of an optimization for outdoor environment illumination from the iCVL Hyperspectral Database Portal;

Figure 5C is a photograph generated from the provided measured spectral data using CIE 1931 2-degree Colour Matching Functions, as only spectral data are available;

Figure 5D is a result of an optimization for outdoor environment illumination from the iCVL Hyperspectral Database Portal;

Figure 5D is a result of an optimization for outdoor environment illumination from the iCVL Hyperspectral Database Portal;

Figure 6A is an evaluation of the method of Figure 1 on indoor illuminants (Philips Hue) measured for green and yellow; Figure 6B is an evaluation of the method of Figure 1 on indoor illuminants (Philips Hue) measured for cyan and white;

Figure 7A to 7D are comparisons of photographs of objects under environment illumination, under reproduced RGB illumination using RGB LEDs, and under reproduced spectral illumination using the spectral upsampling method of Figure 1; Figure 8 are images of additional examples of lighting reproduction on a white deer figurine for both indoor and outdoor lighting environments; Figure 9A is a tiled Figure of spectral upsampling of RGB the Grace Cathedral environment map. Top row shows original and reconstructed RGB environment map as well as displays AE 94 error between the two images. Middle and bottom rows show weight maps for each basis spectrum converted to RGB using CIE 1931 colour matching functions;

Figure 9B is a tiled Figure of spectral upsampling of RGB the Eucalyptus Grove environment map. Top row shows original and reconstructed RGB environment map as well as displays AE 94 error between the two images. Middle and bottom rows show weight maps for each basis spectrum converted to RGB using CIE 1931 colour matching functions;

Figure 10 is a graph of the RGBCAW emission spectra employed by LeGendre et al. for spectral lighting reproduction using five narrowband and one broadband illuminants;

Figure 11 is a colour chart comparison of RGBW+ basis vs. RGBCAW spectral basis for outdoor and indoor illumination spectra;

Figure 12A is a graph of the Gaussian spectra with basis consisting of three broad and three narrow bands and colour chart evaluation;

Figure 12B shows the Gaussian-spectra SPD optimization results for pink;

Figure 12C shows the Gaussian-spectra SPD optimization results for clear day;

Figure 12D shows the Gaussian-spectra SPD optimization results for cyan;

Figure 12E shows the Gaussian-spectra SPD optimization results for yellow;

Figure 12F is a colour chart illuminated by clear day spectrum;

Figure 13 is a comparison of full spectral rendering of a scene lit with D65 illuminant (middle row), with rendering using RGB values (1, 1, 1) for the illuminant (top row), and rendering with spectral upsampling of RGB using an optimized Gaussian basis (bottom row);

Figure 14A shows spectra of various illuminants predicted using reference colour-chart-based optimization and the present proposed spectral upsampling method of Figure 1, compared against target ground truth spectra (measured using a spectrometer) for Yellow (left) and Green (right);

Figure 14B shows spectra of various illuminants predicted using reference colour-chart-based optimization and the present proposed spectral upsampling method of Figure 1, compared against target ground truth spectra (measured using a spectrometer) for Orange (left) and Cyan (right);

Figure 15A is a comparison of the present method to methodologies in the prior art for spectral upsampling of white illuminants. For each spectrum, the target and the spectra recovered with each method are shown, as well as colour chart evaluations;

Figure 15B is a comparison of the present method to methodologies in the prior art for spectral upsampling of green illuminants. For each spectrum, the target and the spectra recovered with each method are shown, as well as colour chart evaluations;

Figure 15C is a comparison of the present method to methodologies in the prior art for spectral upsampling of clear evening illuminants. For each spectrum, the target and the spectra recovered with each method are shown, as well as colour chart evaluations;

Figure 15D is a comparison of the present method to methodologies in the prior art for spectral upsampling of tiles illuminants. For each spectrum, the target and the spectra recovered with each method are shown, as well as colour chart evaluations;

Figure 16A are images showing spectral upsampling of the Courtyard of the Doge's Palace environment map using the method of Figure 1 with RGBW+ basis;

Figure 16B are images showing spectral upsampling of the St Peters environment map using the method of Figure 1 with RGBW+ basis;

Figure 17 are additional comparisons between RGB and spectral lighting reproduction using the approach of the present specification on a set of colour crayons, using a multispectral LED sphere;

Figure 18 are additional examples of spectral and colour reconstruction with the present method using the present specification’s RGBW+ basis;

Figure 19A is an initial set of Gaussians used to determine optimal bases with different numbers of narrow and broad bands;

Figure 19B is the optimal basis consisting of three broad and three narrow bands;

Figure 19C is the optimal basis consisting of one broad and five narrow bands;

Figure 20A shows additional spectral and colour chart results for the selected optimal sets of Gaussians for cloudy day. The comparison shows the target data against the results of the method of Figure 1 for the three narrow and three broadband as well as the five narrow and one broadband Gaussian sets;

Figure 20B shows additional spectral and colour chart results for the selected optimal sets of Gaussians for cyan. The comparison shows the target data against the results of the method of Figure 1 for the three narrow and three broadband as well as the five narrow and one broadband Gaussian sets;

Figure 20C shows additional spectral and colour chart results for the selected optimal sets of Gaussians for pink. The comparison shows the target data against the results of the method of Figure 1 for the three narrow and three broadband as well as the five narrow and one broadband Gaussian sets;

Figure 20D shows additional spectral and colour chart results for the selected optimal sets of Gaussians for purple. The comparison shows the target data against the results of the method of Figure 1 for the three narrow and three broadband as well as the five narrow and one broadband Gaussian sets;

Figure 21A is a comparison between the results achieved by the Genetic Algorithm optimization and by the Interior Point convex non-linear optimization (Convex optimization in the legend), both for the RGBW+ basis (top row) and for the Gaussian basis (bottom row) for white;

Figure 21B is a comparison between the results achieved by the Genetic Algorithm optimization and by the Interior Point convex non-linear optimization (Convex optimization in the legend), both for the RGBW+ basis (top row) and for the Gaussian basis (bottom row) for green;

Figure 21C is a comparison between the results achieved by the Genetic Algorithm optimization and by the Interior Point convex non-linear optimization (Convex optimization in the legend), both for the RGBW+ basis (top row) and for the Gaussian basis (bottom row) for Clear day;

Figure 21D is a comparison between the results achieved by the Genetic Algorithm optimization and by the Interior Point convex non-linear optimization (Convex optimization in the legend), both for the RGBW+ basis (top row) and for the Gaussian basis (bottom row) for cyan; and

Figure 22 is a comparison of full spectral rendering (middle row) of a scene lit with the cyan illuminant reported in Figure 21, with rendering using the corresponding RGB values for the illuminant (top row), and rendering with spectral upsampling of RGB using the optimized Gaussian basis (bottom row).

Detailed Description of Certain Embodiments

A practical approach for high fidelity spectral upsampling of RGB illumination is presented, including commonly used image-based representations such as light probes, and its application in driving improved spectral lighting reproduction is demonstrated.

1. Introduction

A practical method is described herein for high fidelity spectral upsampling of previously recorded RGB illumination in the form of an image-based representation such as an RGB light probe. Unlike previous approaches that require multiple measurements with a spectrometer or a reference colour chart under a target illumination environment, the method described here requires no additional information for the spectral upsampling step. Instead, a data-driven basis may be constructed using spectral distributions for incident illumination from a set of six RGBW LEDs (three narrowband and three broadband) that is employed to represent a given RGB colour using an optimized convex combination of the six basis spectra. As a generalization of the approach, an alternate LED basis is also evaluated, as well as a theoretical basis consisting of a set of narrow and broad Gaussians for this purpose. Good qualitative matching may be observed between the predicted illumination spectrum obtained using the described spectral upsampling approach and the ground truth illumination spectrum, while achieving near perfect matching of the RGB colour of the given illumination in the vast majority of cases. It is demonstrated that the spectrally upsampled RGB illumination can be employed for improved lighting reproduction using an LED sphere without requiring any reference colour chart measurements.

Referring to Figure 1, a process flow diagram of the method of spectral upsampling is shown.

A target illuminant colour is received (step S1). The target illuminant colour may be expressed in the CIE (Commission Internationale de l'Eclairage) Lab colour system. If the target illuminant colour is not expressed in the CIE Lab colour representation, the target illuminant colour may be converted to the CIE Lab colour representation using conversion formulae. The target illuminant colour may be, for example, the colour of a pixel from an image, or may be the average colour from a region 11 (for example, see Figures 5A and 5C) of an image, although any suitable target illuminant colour can be used. The target illuminant colour is sampled using a light probe, for example, an RGB light probe.
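A minimal sketch of this sampling step, assuming the light probe is available as a linear RGB HDR image in a NumPy array: the target illuminant colour is taken as the mean colour over a rectangular region of interest. The region coordinates and the synthetic probe below are illustrative only.

```python
import numpy as np


def region_average_colour(probe_rgb, row0, row1, col0, col1):
    """Mean RGB over a rectangular region of a light-probe image (H x W x 3 array)."""
    region = probe_rgb[row0:row1, col0:col1, :]
    return region.reshape(-1, 3).mean(axis=0)


# Usage sketch: a synthetic 256x512 linear-RGB probe, sampling a patch near the top
probe = np.random.default_rng(1).random((256, 512, 3))
target_rgb = region_average_colour(probe, 10, 40, 200, 260)
# target_rgb would then be converted to CIE Lab (see the srgb_to_lab sketch above,
# or a linear-RGB variant of it) before the optimisation.
print(target_rgb)
```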

A number n of basis spectra are received (step S2). The n basis spectra may be, for example, the spectral power distributions measured from colour-emitting light emitting diodes (LEDs) (e.g. red, green and blue LEDs), or may be measured from different white spectrum LEDs. The spectral power distributions of the basis spectra may be overlapping, for example, a white spectral power distribution may overlap a colour spectral power distribution. The spectral power distributions of the basis spectra may be synthetic. The basis spectra may be relative or absolute values. The n basis spectra may span wavelengths in the visual spectrum, for example, between 380 nm and 780 nm, and may cover more than 50% of this range of wavelengths.

Two or more stimulus values are then determined for an estimated illumination colour (step S3). An initial set of weights for each of the n basis spectra is either received or determined (step S4). The initial weights may be simply determined, for example, equal fractions 1/n. However, the initial weight of each of the n basis spectra may be the inverse of the colour difference calculated between the target illuminant and that basis spectrum expressed in the two or more stimulus values. Stimulus values may be RGB tri-stimulus values. Stimulus values may be CIE XYZ or CIELab values (Wyszecki G., Stiles W. S.: Color Science: Concepts and Methods, Quantitative Data and Formulae. Wiley, 1982).

The initial set of weights is then refined using a stochastic optimisation method to minimise the colour difference between the target illuminant colour and the estimated illumination colour (represented as two or more stimulus values), each stimulus value calculated as the inner product of a weighted sum of the spectral power distributions of the n basis spectra with a respective colour matching function (step S5). The colour matching functions may be, for example, the CIE 1931 2° (CIE (1932). Commission Internationale de l'Eclairage proceedings, 1931. Cambridge: Cambridge University Press) Standard Observer colour matching functions: x̄(λ), ȳ(λ), and z̄(λ).

The n weights corresponding to the minimised colour difference are then outputted (step S6).

The colour difference may be any suitable colour difference, for example, the ΔE94 colour difference (ΔE94), ΔE*00, CIEDE2000, or sRGB colour space differences (CIE: Publication 116-1995 Industrial colour-difference evaluation. Commission Internationale de l'Eclairage; Sharma, G., Wu, W. and Dalal, E.N. (2005), The CIEDE2000 color-difference formula: Implementation notes, supplementary test data, and mathematical observations. Color Research and Application, 30: 21-30). The weights output from the method may be used to drive n sets of light sources (e.g. LEDs) with intensities corresponding to the respective weights, each light source including one or more lights configured to emit light within the respective spectrum. Referring to Figure 2, a lighting apparatus 1 comprises a controller 2, one or more light source drivers 3 and n sets of light sources 4 (4_1, 4_2, 4_3, ..., 4_n). The n sets of light sources 4 each correspond to a particular spectrum, for example, red, green or blue light, or broad or narrow spectrum white light. The controller 2 is configured to drive the n light sources via the light source driver 3 such that the relative intensities of the n light sources correspond to n weights. The n weights may be received or calculated using the method described above. The n weights correspond to a minimised colour difference, the colour difference being between the target illuminant colour and an estimated illumination colour, the estimated illumination colour represented as two or more stimulus values, each stimulus value calculated as the inner product of a weighted sum of the spectral power distributions of the n basis spectra with a respective colour matching function.
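By way of illustration only, the sketch below shows how a controller might map the n output weights to relative drive levels for n LED channels. The driver interface (set_channel_level) is hypothetical and stands in for whatever PWM or DMX interface a particular light source driver 3 exposes; it is not a real API.

```python
import numpy as np


class LedController:
    """Toy controller: scales n weights to relative channel intensities in [0, 1]."""

    def __init__(self, driver, n_channels):
        self.driver = driver            # hypothetical light-source driver object
        self.n_channels = n_channels

    def apply_weights(self, weights):
        w = np.clip(np.asarray(weights, dtype=float), 0.0, None)
        if w.max() > 0:
            w = w / w.max()             # preserve ratios; brightest channel at full scale
        for channel, level in enumerate(w):
            # set_channel_level is an assumed driver call, not a real API
            self.driver.set_channel_level(channel, float(level))
```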

The light sources 4 are then able to reproduce the spectrum of the target illuminant colour which has been selected to determine the set of n weights. The light sources 4 may be comprised in a light stage, a light cage, a light booth or a display, for example, a Digital Cinema Initiatives Protocol 3 (DCI P3) display ("SMPTE Engineering Guideline - Digital Source Processing - Color Processing for D-Cinema", EG 432-1:2010, pp. 1-81, 10 Nov. 2010, doi: 10.5594/SMPTE.EG432-1.2010; "SMPTE Recommended Practice - D-Cinema Quality - Reference Projector and Environment", RP 431-2:2011, pp. 1-14, 6 April 2011, doi: 10.5594/SMPTE.RP431-2.2011). The lighting apparatus 1 may further comprise a camera which may be used for recording an image or video of the volume illuminated by the light sources. The light sources 4 may be arranged to illuminate at least part of an object or body, for example a face. The light sources 4 may be configured in the lighting booth such that a user may select a target illumination colour, or select a spectrum based on a target illumination colour.

Given an illuminant RGB colour, the present approach combines six basis spectra to create a metamer illuminant spectrum. The results are demonstrated by upsampling legacy RGB HDR environment maps to a spectral representation. In addition to the spectral rendering possibilities enabled by this solution, it also allows more accurate lighting reproduction using a multispectral light stage (see Figures 3A to 3C).

The proposed solution presented hereinafter relies on a basis composed of six RGBW LEDs mixing narrowband and broadband spectra. First, the RGB input colour projected to CIELab space is matched with a linear combination of the selected basis spectra. Based on this initialization, the linear combination is optimised to match the input colour as closely as possible. Additionally, other bases may be used with the present approach, including Gaussian-based bases and an alternate set of LED bases previously employed for multispectral lighting (LEGENDRE2017).

The present method is validated against measured spectra and demonstrates good qualitative results on both object relighting and colour checker reflectance reconstruction. These results are demonstrated to be of comparable quality to the state-of-the-art method of LEGENDRE2016, but without requiring the additional reference colour chart measurements required by LEGENDRE2016. The present approach is compared with other spectral upsampling methods (JAKOB2019; MALLETT2019), confirming that this approach allows better reconstruction of non-smooth illumination spectra.

To summarize, the present specification presents a practical RGB-to-spectral upsampling approach for illumination with a compact representation. Further, it provides an evaluation of multiple bases for spectral lighting representation, demonstrating the benefit of the proposed approach over existing RGB-to-spectral upsampling techniques for lighting reproduction.

2. Previous Work

The human eye contains three types of cone cells, each with a different sensitivity to a broad range of light wavelengths. Each cone i integrates spectral data over the visible range ω = [380 nm, 780 nm]; overall, cones act as colour receptors and allow trichromatic vision:

c_i = ∫_ω σ_i(λ) s(λ) dλ, i = 1, 2, 3

where c = [c_1, c_2, c_3] is the colour response, σ_i(λ) is the spectral sensitivity of cones of type i, s(λ) is a spectrum, either an illuminant spectral power distribution or the light reflected by a surface with a given spectral surface reflectance, and λ ∈ ω is the wavelength. The CIE has defined a standard observer, with published spectral responses (the colour matching functions x̄(λ), ȳ(λ), z̄(λ)), leading to the CIE XYZ tristimulus colour space (WYSZECKI1982). The integration performed by the cone cells compresses the complete spectral information into just three sensory quantities, thus leading to a phenomenon known as metamerism, which arises when two different spectra s_1 and s_2, defined metamers (CIE1986), generate the same tristimulus values:

∫_ω σ_i(λ) s_1(λ) dλ = ∫_ω σ_i(λ) s_2(λ) dλ, i = 1, 2, 3

The concept of metamerism is not limited to human observers, but can be extended to any imaging system with three spectral channels (FAIRCHILD2001); similar considerations apply to monochrome sensors. In general, the recovery of the original spectral information from trichromatic colours is severely underconstrained. In theory, each colour can be generated by an infinite number of different spectra (MALLETT2019; PETERS2019). However, the frequency of metamerism in the real world is relatively rare (FOSTER2006; FENG2012; ARAD2016; AKBARINIA2018); in the reflective case, there are significantly fewer metameric matches for complex or saturated reflective spectra than for simpler and neutral ones.

Lighting Reproduction: RGB light probes (DEBEVEC1998) record the incident illumination at a specific location, typically by stitching together fisheye images from different directions or by acquiring an HDR sequence of a mirrored sphere. As demonstrated by Debevec et al. (DEBEVEC2012), the inclusion of diffuse strips between the quadrants of a cut-apart mirrored sphere allows the estimation of the full dynamic range from a single photograph. Such light probes can then be used for rendering virtual objects (DEBEVEC1998) or for lighting reproduction applications, for instance by driving colour and intensity of inward pointing LED lights in spherical domes (DEBEVEC2000; UNGER2003; LEGENDRE2016), for example in lighting booths, lighting stages and lighting cages. However, relying solely on RGB LEDs for real-world lighting reproductions can lead to poor colour rendition and visible colour casts, particularly when directly minimizing the sum of the square residuals of the reproduction light spectra to the target illuminant (Spectral Illumination Matching - SIM) (WENGER2003), due to gaps in the spectral power distribution of the resulting illumination. By increasing the number of LEDs beyond RGB, it is possible to significantly improve the quality of lighting reproduction results, matching a wider set of real-world illuminants. Similarly, relighting applications can benefit from more accurate spectral reflectance estimation achievable with a five-LED basis (PARK2007). LeGendre et al. (LEGENDRE2017) demonstrated that by accounting for the colour rendition capabilities of different LED bases in a lighting reproduction setup, material colour appearance under arbitrary lighting can be accurately matched with just five LEDs. LeGendre et al. (LEGENDRE2016) augmented RGB panoramic photography with five directional observations of a colour chart, which are used to derive the weights that drive a multispectral LED sphere with six types of LEDs. While LeGendre et al. (LEGENDRE2016) does not require the use of specialized equipment for camera characterization, the use of colour charts is common in illuminant estimation techniques. Unlike the approach of LeGendre et al., the present proposed method does not require any other information besides RGB illumination as input (such as a light probe) for the spectral up-sampling step and in this respect can be used for spectral conversion of any recorded legacy RGB illumination.

Illumination Estimation: Estimating the scene illuminant from RGB image data, either as a spectrum or an RGB colour, is the main step in computational colour constancy, with a number of applications ranging from colour imaging to computer vision. Often, the key assumption is that there is only a single, dominant illuminant in the scene. While three variables are necessary and sufficient to define a colour (GRASSMANN1853), due to metamerism more than three channels may be required to obtain a device-independent estimate of an illuminant spectrum. Ratnasingam and Hernandez-Andres (RATNASINGAM2011) focused on daylight illumination to demonstrate accurate per-pixel recovery of spectral data from six-channel input data, modelling the Camera Spectral Sensitivities (CSSs) as Gaussian functions and constructing an illuminant invariant feature space from a large set of measured reflectances. Spectral estimation from spherical RGB images has been demonstrated (TOMINAGA2006), for mixed daylight and incandescent bulb illumination, under the assumption that the CSSs are known and that spectra in the input image can be expressed as a linear combination of three basis spectra; the bases are obtained as the first three principal components of a dataset containing both CIE standard illuminants and measured light sources. In order to deal with spiky fluorescent spectra, the number of basis spectra can be expanded to five and the RGB camera can be replaced with a monochrome camera equipped either with six gelatin filters or a liquid-crystal tuneable filter used to sample the visible spectrum using up to 61 narrow bands; for each spectral band, the corresponding CSS must be known (TOMINAGA2007). As opposed to this previous work, in the method presented here, there is no SIM performed, that is, there is no attempt to recover the unknown illuminant spectrum from the incomplete RGB information. Instead, the present specification focuses on per-pixel Metameric Illuminant Matching (MIM), providing a plausible spectrum that visually matches the reference colour for a human observer.

Spectral Upsampling:

In general, most techniques for spectral estimation greatly benefit from accurate camera characterization, which may be required to convert device-dependent RGB data to device-independent CIE XYZ tristimulus values. It is possible to estimate the incident illumination spectrum from the photograph of a colour chart with known spectral reflectances, used to compute the corresponding CIE XYZ values (GUARNERA2019). Similar concepts have been applied to spectral reflectance estimation (SHI2014) from RGB data (spectral upsampling), with some key differences, since spectral reflectance is the ratio of the reflected light to the incident light and thus defined in the range [0, 1], while emission spectra are unbounded. Furthermore, reflectance spectra are typically very smooth in the visible range (WANDELL1987), while emission spectra can display sharp peaks and have complex shapes, such as in the case of fluorescent lamps. A third important point is that some CIE XYZ triplets (therefore, some chromaticities) cannot be obtained from reflectance spectra, thus leading to the concept of the gamut of solid reflectances. A method based on local optimization (BIANCO2010), aimed at recovering smooth, metameric reflectance spectra from user-specified tristimulus values, has been proposed. Spectra may be precomputed (MENG2015) for a set of points forming a regular grid on the 2D CIE 1931 xy chromaticity diagram. To compute the spectrum corresponding to arbitrary CIE XYZ values, the precomputed spectra are interpolated, either using bilinear or barycentric interpolation, depending on their location on the chromaticity diagram. Since the resulting spectrum might not conserve energy, a normalization step may be required, potentially introducing errors. Jakob and Hanika (JAKOB2019) focused on the sRGB colour space, defining a 3-dimensional non-linear function space, whose coefficients are precomputed and stored in a 3D table that also accounts for the brightness; the coefficients are then converted into a spectrum at runtime. The advantage of explicitly accounting for brightness is the ability to produce smooth spectra even in the case of dark input colours. Jung et al. (JUNG2019) addressed materials beyond the gamut of solid reflectances, including a fluorescent component expressed in parametric form. Using optimization, Mallett and Yuksel (MALLETT2019) defined three spectral primaries that allow relatively smooth, energy conserving reflectance spectra to be obtained as a linear combination of the primaries, for any given input sRGB triplet. Peters et al. (PETERS2019) introduced a compact representation for reflectance spectra with known shape, complemented by a lookup table approach for assets stored in three colour channels. In this work it was demonstrated that with three to six scalars per pixel (4 to 8 bytes per pixel), any reflectance spectrum can be accurately reconstructed by its bounded maximum entropy spectral estimate. In the case of emission spectra, the number of coefficients m that may be required for a satisfactory reconstruction can be significantly higher (16 to 32), as can the run time of their algorithm, with complexity O(m²). While their approach has the ability to encode very complex illumination spectra, it requires a known spectrum for computing the coefficients. Instead, in the present method, the employed basis focuses on compactly representing many common illumination spectra while requiring only the corresponding RGB colour as input. Similarly to Jakob and Hanika (JAKOB2019), Todova et al. (TODOVA2021) proposed the use of a pre-computed sRGB coefficient cube to map colours to the corresponding moment-based representation of the reflectance spectra (PETERS2019). Todova et al.'s method allows a set of constraints to be input, i.e. user-supplied spectra for some sRGB colours, that are closely preserved by the moment-based representation and used as priors for the optimization of neighbouring colours in the cube.

3. Method

In the present specification, an initialized optimization procedure is proposed that converts measured RGB values of different illuminants into the spectral domain using a linear combination of selected illumination basis spectra. The present method is applied to convert photographed scene images and environment maps from the commonly stored RGB format into a spectral representation. Additionally, the proposed RGB-to-spectral upsampling step is employed to drive a multi-spectral LED sphere for improved lighting reproduction. The present method assumes the input RGB colour has been acquired using a camera with reasonable colour processing (e.g., with a daylight white balance setting).

3.1. Spectral basis selection

A relatively compact basis of six illuminant spectra was chosen to minimize the storage overhead compared to RGB. The size of the basis is also motivated by the previous work of LeGendre et al. (LEGENDRE2016), who show that a set of six LEDs allows most of the illumination spectrum to be covered.

The basis is defined using the set of measured spectral power distributions (SPDs) of the six LED types installed in a multi-spectral LED sphere: three narrowband Red 5, Green 6, and Blue 7 LEDs, and three broadband white LEDs (warm 2700K 8, neutral 4000K 9, and cool 5700K 10). The SPDs of these LEDs are measured with a spectrometer (Sekonic SpectroMaster C700) placed at the centre of the LED sphere. The measured spectral basis is shown in Figure 4.

Unless stated otherwise, results presented here are shown with the above-described basis. However, the method is not limited to a specific basis; it is also demonstrated later, in Section 4.5, with the LED basis set employed by LeGendre et al. and with an alternate theoretical basis using a set of Gaussians (see Figure 12).

To drive the multispectral light stage and to compute its dynamic range and white point, as may be required by the colour space conversions embedded in both the initialization and the optimization, the absolute value of each LED SPD is recovered, expressed in W/(m² · sr · nm). The CIE XYZ white point of the device used in this work, measured with all LEDs on at full intensity, is cd/m². Such measurements may be required only once, and can be obtained with a spectroradiometer or using a colourimetrically characterized camera with support for the absolute scale (GUARNERA2019). Note that knowledge of just the relative intensities of the LED SPDs, as shown in Figure 4, would suffice for most practical applications.
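By way of illustration, a white point of this kind can be computed from wavelength-sampled SPDs by integrating them against the CIE 1931 colour matching functions; the minimal sketch below assumes NumPy arrays sampled on a common wavelength grid and uses the standard 683 lm/W luminous efficacy constant so that the Y component is a luminance in cd/m² when the SPDs are absolute spectral radiances. The array names and sampling step are illustrative, not taken from the specification.

```python
import numpy as np

def device_white_point(led_spds, cmfs, dlambda):
    """Approximate CIE XYZ white point of the LED sphere with all LEDs at full intensity.

    led_spds: (6, W) absolute SPDs in W/(m^2 * sr * nm), sampled every dlambda nm
    cmfs:     (W, 3) CIE 1931 2-degree colour matching functions on the same grid
    """
    total_spd = led_spds.sum(axis=0)               # all six LED types on at full power
    xyz = 683.0 * (cmfs.T @ total_spd) * dlambda   # Km = 683 lm/W; Y is luminance in cd/m^2
    return xyz
```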

3.2. Optimization

To reconstruct the target illuminant colour and spectrum, an optimization procedure is used to find a set of weights a_k to linearly combine a given set of n basis spectra. To guide this optimization, the ΔE94 colour difference (CIE1995) between the CIELab values of the reference illumination and the reconstructed spectrum is minimised:

\[
\{a_k\}^{*} = \arg\min_{a_1,\dots,a_n} \Delta E_{94}\left( Lab_{ref},\; XYZ{\to}Lab\left( \int_{\lambda} \sum_{k=1}^{n} a_k\, S_k(\lambda)\, \overline{xyz}(\lambda)\, \mathrm{d}\lambda \right) \right)
\]

where Lab_ref is the Lab colour of the target illuminant, XYZ→Lab represents a conversion that maps CIE XYZ values to the Lab colour space, S_k is the set of basis spectra and x̄ȳz̄(λ) are the CIE 1931 2° colour matching functions. While the present method is not limited to a specific colour space (e.g. it works with sRGB, Adobe RGB, Rec. 2020 etc.), the colour space in use must be known, in order to convert the illuminant colour to the corresponding CIE XYZ value and then to CIE Lab. Given the one-to-many relationship between trichromatic colours and possible metamers (see Section 2), it is possible for convex optimization techniques to get stuck in local minima. Therefore, to compute the set of weights a_k, a stochastic search strategy based on genetic algorithms is used. A comparison between the approach presented here and interior point convex non-linear optimization is reported in the supplemental material.
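For illustration only, a minimal numerical sketch of this objective is given below, assuming wavelength-sampled NumPy arrays for the basis SPDs and the CIE 1931 2° colour matching functions; the XYZ-to-Lab conversion and the ΔE94 formula follow the standard CIE definitions (with the weighting factors kL = kC = kH = 1), and all function and variable names are illustrative rather than taken from the specification.

```python
import numpy as np

def xyz_to_lab(xyz, white_xyz):
    # CIE 1976 L*a*b* conversion relative to the chosen white point
    def f(t):
        delta = 6.0 / 29.0
        return np.where(t > delta**3, np.cbrt(t), t / (3 * delta**2) + 4.0 / 29.0)
    xr, yr, zr = f(xyz[0] / white_xyz[0]), f(xyz[1] / white_xyz[1]), f(xyz[2] / white_xyz[2])
    return np.array([116 * yr - 16, 500 * (xr - yr), 200 * (yr - zr)])

def delta_e94(lab1, lab2):
    # CIE 1994 colour difference with kL = kC = kH = 1
    dL = lab1[0] - lab2[0]
    C1, C2 = np.hypot(lab1[1], lab1[2]), np.hypot(lab2[1], lab2[2])
    dC = C1 - C2
    da, db = lab1[1] - lab2[1], lab1[2] - lab2[2]
    dH2 = max(da**2 + db**2 - dC**2, 0.0)
    SL, SC, SH = 1.0, 1 + 0.045 * C1, 1 + 0.015 * C1
    return np.sqrt((dL / SL)**2 + (dC / SC)**2 + dH2 / SH**2)

def objective(weights, basis_spds, cmfs, dlambda, lab_ref, white_xyz):
    # weights: (n,), basis_spds: (n, W), cmfs: (W, 3) CIE 1931 2-degree CMFs
    spd = weights @ basis_spds                 # weighted sum of basis SPDs
    xyz = (cmfs.T @ spd) * dlambda             # inner products with x-bar, y-bar, z-bar
    return delta_e94(lab_ref, xyz_to_lab(xyz, white_xyz))
```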

In the implementation, the set of individuals that will contribute to the definition of the next generation is selected by means of pair-wise comparisons in a tournament selection, where pairs are randomly picked. From this set, parents are selected according to panmixia; the crossover of their DNA (i.e. their coordinates) is given by the arithmetic mean. While this solution allows the algorithm complexity to be reduced, it tends to destroy recessive genes within a small number of generations. To mitigate this effect, some of the individuals for a subsequent generation are produced by applying small mutations to the coordinates of a single parent, while the best-performing individuals are carried over into the next generation. The parameter domain is discretized over a regular 6D lattice, constraining each variable to be an integer in the range [0, 255], thus directly corresponding to the activation value of an LED in the LED sphere; the rules to create the initial population and the following ones are modified in order to enforce such constraints (DEEP2009). By restricting the weight range for each element of the basis in the linear combination, it is ensured that the physical LEDs will be capable of reproducing the intensity of the target spectra.
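A minimal sketch of such a genetic search is shown below, assuming the colour-difference objective of the previous sketch; the population size, generation count, mutation step and elite count are illustrative choices, not values stated in the specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def genetic_search(objective, n_basis=6, pop_size=200, generations=300,
                   elite=4, mutation_rate=0.1, init_pop=None):
    # Integer weights in [0, 255] correspond directly to LED drive levels.
    pop = init_pop if init_pop is not None else rng.integers(0, 256, (pop_size, n_basis))
    for _ in range(generations):
        fitness = np.array([objective(ind) for ind in pop])
        order = np.argsort(fitness)
        next_pop = [pop[i] for i in order[:elite]]            # elitism: keep the best individuals
        while len(next_pop) < pop_size:
            # tournament selection: keep the better of two randomly picked individuals
            a, b = rng.integers(0, pop_size, 2)
            p1 = pop[a] if fitness[a] < fitness[b] else pop[b]
            a, b = rng.integers(0, pop_size, 2)
            p2 = pop[a] if fitness[a] < fitness[b] else pop[b]
            child = np.rint((p1 + p2) / 2).astype(int)        # arithmetic-mean crossover
            mutate = rng.random(n_basis) < mutation_rate      # small integer mutations, clipped to range
            child[mutate] = np.clip(child[mutate] + rng.integers(-8, 9, mutate.sum()), 0, 255)
            next_pop.append(child)
        pop = np.array(next_pop)
    fitness = np.array([objective(ind) for ind in pop])
    return pop[np.argmin(fitness)]
```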

While genetic algorithms do not require an initial parameter guess to be specified, the implementation includes the possibility of initializing the population using barycentric interpolation, in which the contribution of each basis spectrum corresponds to the inverse of its distance to the target illuminant colour in the CIE Lab colour space. Each starting weight of one individual is computed as follows:

\[
w_k = \frac{1}{Z} \cdot \frac{1}{\Delta E_{94}(Lab_{ref}, Lab_k)}, \qquad Z = \sum_{j=1}^{n} \frac{1}{\Delta E_{94}(Lab_{ref}, Lab_j)} \tag{4}
\]

where Z is a normalisation factor, Lab_ref is the CIE Lab colour of the target illuminant, Lab_k is the CIE Lab colour of the k-th basis spectrum and ΔE94 is the colour difference between Lab_ref and Lab_k. The other individuals of the initial population are located in the neighbourhood of Eq. 4, with weights randomly chosen in the interval w_k ± 5%, if within the [0, 255] range. The average run-time of the method (single-thread, unoptimized implementation) is s.ogs/color on an AMD Ryzen 7 2700 3.20 GHz processor and 32 GB RAM desktop PC.
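The following sketch illustrates this initialization under the same array conventions as the earlier sketches; scaling the normalised weights to the [0, 255] LED range and the exact size of the ±5% jitter are assumptions made for illustration.

```python
import numpy as np

def barycentric_init(lab_ref, lab_basis, delta_e94, eps=1e-6):
    # lab_basis: (n, 3) CIE Lab colours of the n basis spectra
    inv_d = np.array([1.0 / max(delta_e94(lab_ref, lab_k), eps) for lab_k in lab_basis])
    w = inv_d / inv_d.sum()                                  # Eq. 4: weights inversely proportional to colour distance
    return np.clip(np.rint(w * 255), 0, 255).astype(int)     # assumed scaling to integer LED drive levels

def initial_population(w0, pop_size=200, seed=0):
    # Neighbours of the seed individual, perturbed by roughly +/-5% of the range and clipped.
    rng = np.random.default_rng(seed)
    jitter = rng.integers(-13, 14, (pop_size - 1, w0.size))  # 13 is about 5% of 255
    return np.vstack([w0, np.clip(w0 + jitter, 0, 255)])
```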

4. Evaluation

The choice of basis and approach is evaluated against measured ground truth spectra, and applications of the method are shown for spectral upsampling of existing HDR environments for spectral lighting reproduction. The direct spectral upsampling approach is compared against the approach of LeGendre et al. (LEGENDRE2016), which requires additional measurements of reference colourcharts under the target lighting. Additionally, the method is compared to existing methods for RGB-to-spectral upsampling which focus on reflectance (MALLETT2019; JUNG2019; TODOVA2021), and it is shown that, as discussed in their work, the spectrum smoothness assumption does not hold for many illuminants.

4.1. Recovered spectra evaluation

The present method is evaluated by comparing the results to measured spectra and RGB colour of both outdoor illuminants (HDR environment map regions) and indoor illuminants (Philips Hue LEDs).

4.1.1. Outdoor illuminants

In the case of outdoor illumination, the ground truth comes from the BGU iCVL Hyperspectral Image Dataset (ARAD2016), containing HDR and spectral captures of outdoor scenes. For this evaluation the RGB colour was used as input to the method and the spectral measurements as ground truth. For each environment, the target reference colour was determined by averaging the colour of the dominant illuminant - usually a patch of sky or the sun. For the BGU iCVL Hyperspectral Image Dataset, the spectra of the pixels used to determine the target reference colour were averaged, and this average spectrum was used as the ground truth SPD.

Two examples of the results of the optimization on the spectral database are shown in Figures 5A to 5D. The SPD reconstructed by the optimization method (resulting SPD 15) closely approximates the shape of the spectral data (target SPD 16), as well as the reference RGB colour. The starting SPD 17 (initialization) is also indicated. By using the estimated SPDs to illuminate a colour chart, an average ΔE94 difference of 1.3714 is obtained across the six evaluated RGB colours of outdoor illuminants. In all cases tested, a close metamer and spectral match was found for both cloudy and clear days.

4.1.2. Indoor illuminants

Further, the present methods are evaluated on indoor lighting conditions, as these exhibit very different spectra compared to natural illumination. To do so, HDR images are captured and the spectra of different colours of Philips Hue LED lamps are measured in a dark room. Figures 6A and 6B show results generated for different colours of the LED lamp. As with the outdoor illuminants, the present method matches the ground-truth spectra and colour well. The SPD reconstructed by the optimization (resulting SPD 18) closely approximates the shape of the spectral data (target SPD 19), as well as the reference RGB colour. The starting SPD 20 (initialization) is also indicated. For all results, the present method very closely matches the target colour and its spectrum.

4.2. Multispectral lighting reproduction

The benefit of the proposed spectral upsampling of RGB illumination for lighting reproduction is shown in Figures 7 and 8. Several objects recorded under both indoor and outdoor environmental illumination are shown, for which the corresponding RGB light probe was acquired with HDR imaging. A multispectral LED sphere is driven directly with the recorded RGB illumination, as well as with the proposed spectral upsampling driving all six LED types on the LED sphere for spectral lighting reproduction. Referring in particular to Figures 7A to 7D, qualitatively, the present method allows the original lighting to be reproduced more accurately.

Referring in particular to Figure 8, the spectral lighting reproduction driven by the present spectral upsampling method is qualitatively a closer match to the reference photograph than RGB lighting reproduction. For the spectral lighting case, the optimized weights of the LED basis (RGBW+) are directly used to drive the intensity of each LED type on the LED sphere. As can be seen, the lighting reproduction achieved with the present spectral upsampling step is a better match to the input photograph of the objects than RGB lighting. This is consistent with the findings of LeGendre et al. (LEGENDRE2016), despite the present method not requiring reference colour chart measurements, providing a qualitative validation of the present spectral upsampling method. Figures 3A to 3C show another example of the present method used for driving spectral lighting reproduction. As can be seen in Figures 3A to 3C, the RGB-to-spectral upsampling of the input RGB lightprobe achieves a closer qualitative lighting reproduction on the figurines compared to simply driving the LED sphere with RGB values.

4.3. Spectral upsampling of environment maps

The present method may also be used for spectral upsampling of pre-recorded RGB environment maps. Figures 9A and 9B show results generated for two popular environment maps - Grace Cathedral and Eucalyptus Grove. Here, the RGB pixels of the original environment maps were processed using the present method to obtain weights of the six LED types in the basis described above, which are shown as six weight maps colour-coded with the colour produced by the corresponding LED type. Figure 9 also shows the reconstructed RGB colours produced by the weighted combination of the six LEDs, which are a very close match to the input RGB colours in the environment map. The maximum error in the reconstructed environment illuminations is in the Grace Cathedral example, with ΔE94 = 0.9525. Additional examples of such spectral upsampling of environment maps are provided in the supplemental material.
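As a rough illustration of how such weight maps could be produced, the sketch below runs a per-pixel solve over an environment map and caches solutions for quantized colours so that repeated pixel colours are optimized only once; the caching and quantization step are an assumption made for efficiency, not something described in the specification, and solve_weights stands for any of the optimization routines sketched above.

```python
import numpy as np

def upsample_envmap(env_rgb, solve_weights, n_basis=6, quant=64):
    # env_rgb: (H, W, 3) HDR environment map; solve_weights: RGB triplet -> (n_basis,) weights
    h, w, _ = env_rgb.shape
    weight_maps = np.zeros((h, w, n_basis))
    cache = {}                                   # quantized colour -> solved weights
    for y in range(h):
        for x in range(w):
            key = tuple(np.rint(env_rgb[y, x] * quant).astype(int))
            if key not in cache:
                cache[key] = solve_weights(env_rgb[y, x])
            weight_maps[y, x] = cache[key]
    return weight_maps                           # one map per basis LED, cf. Figure 9
```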

4.4. Colour checker validation

In Figures 11, 12, 14 and 15, a comparison of the result of the present method with the ground truth spectra using colour chart renderings is shown. Each colour chart patch is illuminated with the ground truth and the recovered spectra, and the two colours are visualised in the same image, where for each patch the inner circle corresponds to the colour produced with the present method or previous methods and the outer frame represents the ground truth colour. The RGB values of each patch are generated by convolving the illumination spectrum, the known X-Rite colour chart patch reflectances and the CIE 1931 2-degree colour matching functions, followed by a standard XYZ to RGB conversion. For error computation the CIE Lab space is used, to which XYZ values are converted using the standard XYZ to CIE Lab conversion. The predicted spectrum is an excellent metamer for the target spectrum in almost all of the cases tested. This is particularly interesting for the case of outdoor illumination, which has a flatter spectrum compared to indoor illumination. While the selected basis fits non-smooth indoor illumination spectra with higher accuracy, the optimized spectrum for outdoor illumination is still an excellent metamer.
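A minimal sketch of this validation step is given below, again assuming wavelength-sampled NumPy arrays; the reflectance data, the XYZ-to-RGB matrix for the chosen RGB space, and the normalisation by the illuminant's Y are assumptions of the sketch rather than details taken from the specification.

```python
import numpy as np

def render_chart_rgb(illum_spd, patch_reflectances, cmfs, xyz_to_rgb, dlambda):
    # illum_spd: (W,) illumination SPD; patch_reflectances: (24, W) colour chart reflectances
    # cmfs: (W, 3) CIE 1931 2-degree colour matching functions; xyz_to_rgb: (3, 3) matrix
    xyz = (patch_reflectances * illum_spd) @ cmfs * dlambda   # per-patch tristimulus values
    white_xyz = (cmfs.T @ illum_spd) * dlambda                # illuminant white point
    rgb = xyz @ xyz_to_rgb.T / white_xyz[1]                   # normalise so the illuminant has Y = 1
    return np.clip(rgb, 0.0, None)
```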

4.5. Basis evaluation

While the present basis was chosen to enable environment lighting reproduction in a specific LED sphere, other spectral bases can also be used with the present method. The present approach is also evaluated on other six-illuminant bases, as previous works (LEGENDRE2016; GITLINA2020) have shown that six illuminants lead to good spectral and colourimetric accuracy while keeping the basis compact, with additional LEDs leading to only marginal improvements. The results of the present method using the LED basis employed by LeGendre et al. (LEGENDRE2016) are shown, and a theoretical basis based on narrow and broad Gaussians, which can be used for RGB-to-spectral illumination upsampling, is discussed.

4.5.1. RGBCAW LED Basis

An alternate LED basis, installed on USC ICT's Lightstage X and shown in Figure 10, includes RGBW LEDs similar to the selected basis, but contains only one broadband white 23 instead of three types (warm 2700K 8, neutral 4000K 9, and cool 5700K 10 - see Figure 4). This basis instead includes two additional narrowband illuminants covering the amber 27 and cyan 28 parts of the spectrum.

Referring to Figure 11, a comparison of rendered patches of a colour chart under various illumination spectra recovered using the selected basis (RGBW+) against the RGBCAW basis, as described in Section 4.4, is shown. The selected basis (RGBW+) represents indoor illuminants slightly better while the RGBCAW basis works slightly better for outdoor illuminants. When comparing the average ΔE94 error over all colourchecker patches across both outdoor and indoor evaluated illuminant colours, both bases are relatively close with ΔE94 = 1.8658 for RGBW+ and ΔE94 = 2.7350 for RGBCAW. Both bases produce very comparable overall results, with the RGBW+ basis producing slightly better results for indoor illumination (c, d).

4.5.2. Gaussian basis

While the primary basis selection was based on practical reasons, such as using commonly available LEDs for lighting reproduction, analysis was also performed on synthetically computed spectral bases consisting of a set of narrow and broad Gaussian functions. Gaussians were chosen for analysis as they are compact in representation (requiring only two values), have shapes that are similar to LED responses, and have been previously employed in realistic rendering pipelines (e.g. the Gaussian Spectrum node for texture reflectance and textured emission in Octane Render, https://docs.otoy.com/Portal/Home.htm). To find the best Gaussian bases, different sets of narrow and broadband spectra were created using Gaussian functions with specified widths (standard deviations) and peak locations (means). For the narrowband set of Gaussians, the peaks were placed at 25 nm intervals in the visible wavelength range (400-700 nm) and a standard deviation of 15 nm was used. For the broadband set, the standard deviation was increased to 40 nm. The optimization described above was run on all possible combinations of this initial selection of Gaussians (see supplemental material), specifying the number of broad and narrow bands to include. Finally, for each distribution of broad and narrow bands, the set of Gaussians producing the smallest total ΔE94 Lab error between the target and the reconstructed colour across all evaluated indoor and outdoor illuminants was chosen.
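The candidate Gaussian sets described above can be generated along the lines of the sketch below; the 1 nm sampling grid and the assumption that the broad Gaussians share the same 25 nm peak spacing as the narrow ones are illustrative choices, not details confirmed by the specification.

```python
import numpy as np
from itertools import combinations

wavelengths = np.arange(400, 701, 1.0)              # visible range, assumed 1 nm sampling

def gaussian_spd(mean, sigma):
    # Unnormalised Gaussian spectral power distribution peaking at `mean` nm.
    return np.exp(-0.5 * ((wavelengths - mean) / sigma) ** 2)

peaks = np.arange(400, 701, 25)                     # peaks every 25 nm, as described above
narrow = [gaussian_spd(mu, 15.0) for mu in peaks]   # narrowband set, sigma = 15 nm
broad = [gaussian_spd(mu, 40.0) for mu in peaks]    # broadband set, sigma = 40 nm

def candidate_bases(n_narrow, n_broad):
    # Enumerate every basis with the requested split of narrow and broad bands.
    for ns in combinations(narrow, n_narrow):
        for bs in combinations(broad, n_broad):
            yield np.vstack(ns + bs)                # (n_narrow + n_broad, W) basis matrix
```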

Referring to Figure 12, the results of this search with three narrowband spectra 30, 31, 32 and three broadband spectra 33, 34, 35 show a slight improvement in matching the target illumination spectrum compared to the selected LED basis (RGBW+). The resulting spectral power distribution 36 and the target spectral power distribution 37 are plotted for each of Pink, Clear day, Cyan, and Yellow. Interestingly, the set of optimized Gaussians bears a strong similarity to the employed RGBW+ LED basis, with the three narrow bands corresponding to the RGB colour primaries and the three broad bands being slightly shifted between warmer and cooler spectra. In particular, with this theoretical basis the pink spectrum is matched, which proved to be difficult with the physical selected basis (RGBW+), as described in Section 4.7. The selected basis is the optimal basis producing the least total Lab error for the twelve tested illuminants. While similar to the present RGBW+ LED basis, the Gaussian basis allows illuminants to be better reconstructed than the RGBW+ and RGBCAW LED bases. See Table 1 for numerical comparisons. Taking a cue from the RGBCAW LED basis, further experiments were conducted using Gaussian bases containing five narrowband spectra and one broadband spectrum.

However, the average ΔE94 of the best such basis (ΔE94 = 1.8545) was slightly worse than that with three narrowband spectra 30, 31, 32 and three broadband spectra 33, 34, 35 (ΔE94 = 1.3970). Table 1 shows the average ΔE94 for each basis on colour checker illumination, as described in Section 4.4. Both sets of Gaussian bases result in ΔE94 < 2, typically considered a good match and significantly better than the acceptability threshold in graphic arts and display technology. The Gaussian bases analysis was consistent with the relative performance of the RGBW+ and RGBCAW LED bases.

Table 1

Given the slightly higher accuracy of the optimized Gaussian basis compared to the LED bases, it can be a good choice for spectral upsampling and representation of illumination for spectral rendering applications. Figure 13 shows rendering comparisons of a scene consisting of a banana on a ceramic bowl that is rendered with a D65 illuminant with full spectral rendering (sampling at every 2 nm interval, middle row) using Mitsuba 2. The corresponding rendering with RGB illumination, with RGB values set to (1, 1, 1), results in noticeable differences (top row) compared to the reference spectral rendering, particularly on the white ceramic 40 and the colour of the banana 41. Finally, the RGB-to-spectral upsampling using the optimized Gaussian basis results in a rendering that is nearly indistinguishable (bottom row) from the full spectral rendering. A similar rendering comparison for a narrowband illuminant is provided in the supplemental material. The result of the present spectral upsampling is indistinguishable from the reference rendering, while colour differences in the RGB rendering are clearly visible.

4.6. Comparisons

4.6.1. Colourchart-based optimization

The spectral upsampling technique described here was compared to a reference colourchart-based optimization similar to that employed by LeGendre et al. (LEGENDRE2016) for spectral lighting reproduction. Instead of solving for an entire environment map using five oriented colourcharts as done by LeGendre et al. (LEGENDRE2016), a simpler comparison was performed using a single reference colourchart illuminated with mostly frontal illumination emitted from a Philips Hue LED bulb. The colourchart-based optimization then searches for weights of the six colourchart responses due to the RGBW+ LEDs in the multispectral lightstage whose linear combination minimizes the ΔE94 colour differences over the set of 24 colours measured on the reference colourchart under a target illumination. The six optimized LED weights obtained using the reference colourchart construct a predicted spectrum for the illumination through their linear combination. This is repeated for four different hues emitted by the LED bulb (yellow, green, orange, cyan - see Figures 14A and 14B). The target ground truth spectrum (Target SPD 45) is measured for each illumination condition using a spectrometer. Finally, the spectra for each of the illumination conditions are predicted using the present method directly from the RGB values of the illumination recorded on a mirror ball ("Ours" 46). As can be seen in Figures 14A and 14B, the spectral upsampling method predicts spectra of very comparable quality, and in some cases even better quality, compared to the method of LeGendre et al. (LEGENDRE2016), without requiring the measurement of a reference colourchart. Thus, the present method removes the requirement of specialized measurement of oriented colourcharts for environmental illumination, thereby enabling direct spectral upsampling of legacy RGB light probes for spectral lighting reproduction. Both the present method and the method of LeGendre et al. result in comparably high-quality estimation of illumination spectra, with the present method having the advantage of not requiring any reference colourchart measurement.

4.6.2. Reflectance spectrum upsampling
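For completeness, a rough sketch of such a colourchart-based baseline is shown below; it finds LED weights whose mixture of per-patch responses minimises the mean ΔE94 over the 24 patches. The use of SciPy's bounded L-BFGS-B optimizer, the normalised [0, 1] weight range and the RGB-to-Lab helper are assumptions of the sketch, not details of LeGendre et al.'s method or of the present one.

```python
import numpy as np
from scipy.optimize import minimize

def colourchart_weights(chart_under_leds, chart_under_target, rgb_to_lab, delta_e94):
    # chart_under_leds: (6, 24, 3) patch colours with each RGBW+ LED type at full intensity
    # chart_under_target: (24, 3) patch colours under the target illumination
    target_lab = np.array([rgb_to_lab(c) for c in chart_under_target])

    def cost(w):
        mixed = np.tensordot(w, chart_under_leds, axes=1)      # (24, 3) mixed patch colours
        mixed_lab = np.array([rgb_to_lab(c) for c in mixed])
        return np.mean([delta_e94(a, b) for a, b in zip(target_lab, mixed_lab)])

    result = minimize(cost, x0=np.full(6, 0.5), bounds=[(0.0, 1.0)] * 6, method="L-BFGS-B")
    return result.x                                            # weights driving the six LED types
```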

Referring to Figures 15A and 15B, the present spectral upsampling method 50 is compared to the recent RGB-to-spectral upsampling techniques of Mallett & Yuksel (MALLETT2019) and Jakob & Hanika (JAKOB2019), which are designed for smooth reflectance spectra, against the target SPD 53. In Figures 15A and 15B, comparisons to the present method on illuminants (Clear evening, Green and White) and a reflectance (Tiles) are shown. For each spectrum, the target and the spectra recovered with each method are shown, as well as a colour chart comparison, as described in Section 4.4. This comparison illustrates the limitations of the smoothness assumption when recovering illuminants, as described in previous work (MALLETT2019; JAKOB2019). Additionally, while not its primary purpose, the present method is also capable of recovering reasonable metamer spectra for reflectance (Tiles). This confirms that the present method can upsample entire RGB environment maps, which typically consist not only of illuminants, but also of background pixels representing reflectances convolved with illuminants. Figures 15A and 15B also include a comparison to an unconstrained variant of the recently proposed method of Todova et al. (TODOVA2021). Note that Todova et al.'s constrained method is designed to be used with user-provided bases, but is limited to the sRGB gamut, which does not apply to illumination spectra. The spectrum smoothness assumption described by Mallett & Yuksel (MALLETT2019) and Jakob & Hanika (JAKOB2019) for reflectance spectra does not hold for illuminants. The Tiles result is used as an illuminant spectrum to simulate its appearance in an environment map background pixel, showing that for such a use case the present method recovers spectra with similar quality to previous work (on average, the present method results in ΔE94 = 1.2332 against ΔE94 = 1.7670 for Mallett & Yuksel (MALLETT2019) and ΔE94 = 1.7861 for Jakob & Hanika (JAKOB2019)).

5. Conclusion

In summary, the present method is a practical method for high-fidelity spectral upsampling of recorded RGB illumination using a chosen set of illumination bases and optimizing their convex combination to best match an input RGB colour. The present method does not require any additional information, such as reference colour chart measurements, making it suitable for direct application to the upsampling of legacy RGB lighting environments for improved spectral rendering as well as lighting reproduction. Good qualitative matches to ground truth spectra for a range of measured outdoor and indoor illumination are demonstrated, as is the benefit of the approach over existing spectral upsampling techniques designed for smooth reflectance spectra.

6. Additional results

Visualised results of spectral upsampling using the present method on two more environment maps are shown in Figures 16A and 16B, together with additional lighting reproduction examples in a multispectral LED sphere. Referring to Figure 17, RGB and spectral illumination reproduced with the present method are compared. Still referring to Figure 17, the top row presents mirror ball photographs of the target environmental illumination. The spectral lighting reproduction (bottom row) is a closer match to the reference photograph than RGB lighting reproduction, while still being driven by an RGB lightprobe as input. Referring to Figure 18, more results for the present method using the RGBW+ basis are shown (Figure 18 - Resulting SPD 58, Target SPD 59, Starting SPD 60). The optimal Gaussian bases, as well as the corresponding spectral and colour checker reconstructions using these bases in the optimization procedure, are shown in Figures 19 and 20. Figure 19A shows the initial set of Gaussians used to determine the optimal bases with different numbers of narrow and broad bands. Referring to Figure 19B, on the left the optimal basis consisting of three broad bands 61, 62, 63 and three narrow bands 64, 65, 66 is shown, whereas on the right the optimal basis consisting of one broad band 67 and five narrow bands 68, 69, 70, 71, 72 is reported. Figures 20A to 20D show additional spectral and colour chart results for the selected optimal sets of Gaussians (Figure 20A - Cloudy day; Figure 20B - Cyan; Figure 20C - Pink; Figure 20D - Purple). The comparison shows the target data 75 against the results of the method for the three narrow and three broadband 76 as well as the five narrow and one broadband 77 Gaussian sets.

Referring to Figure 20, each colour chart patch is illuminated with the ground truth and the recovered spectra, and the two colours are visualised in the same image: for each patch, the inner circle corresponds to the colour produced with the method and the outer frame represents the ground truth colour.

Referring to Figures 21A to 21D, the genetic algorithm based optimization 80 is compared with standard convex non-linear optimization 81 for spectral upsampling, showing that the genetic algorithm converges to spectra that are closer to the ground truth (Target SPD 82 and Starting SPD 83) (Figure 21A - White; Figure 21B - Green; Figure 21C - Clear day; Figure 21D - Cyan). While convex optimization tends to produce optimized spectra that still preserve some features of the spectrum used for the initialisation (barycentric interpolation, see Section 3.2), on average the stochastic nature of the GA allows better solutions to be produced, less affected by local minima (i.e. with lower colourimetric error, often better matching the ground truth spectra, despite these not being provided as input to the algorithms). In all cases, both algorithms have been compared on the same input, using the same initialization. Finally, Figure 22 shows another example of application of the RGB-to-spectral upsampling using the proposed Gaussian basis for improved colour accuracy of renderings. The result of applying the present spectral upsampling method is indistinguishable from the reference rendering, while colour differences in the RGB rendering are clearly visible.

The method of minimising a colour difference between the target illuminant colour and an estimated illumination colour may be performed using any suitable method, for example a stochastic optimization method, a genetic algorithm, or a neural algorithm, for example a neural network algorithm, and the like.
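As one hedged example of such an alternative, the sketch below plugs the same colour-difference objective into SciPy's differential evolution, a general-purpose stochastic global optimizer; it is offered purely as an illustration of a drop-in substitute and is not the optimizer described in the embodiments above.

```python
import numpy as np
from scipy.optimize import differential_evolution

def solve_weights_stochastic(objective, n_basis=6, seed=0):
    # objective: callable mapping an (n_basis,) weight vector to a colour difference (e.g. delta E 94)
    result = differential_evolution(objective, bounds=[(0, 255)] * n_basis,
                                    seed=seed, maxiter=300, polish=False)
    return np.rint(result.x).astype(int)     # round to integer LED drive levels
```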

Modifications

It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the design, manufacture and use of spectral upsampling methods, and methods and apparatuses implementing spectrally upsampled information, and component parts thereof, and which may be used instead of or in addition to features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.

Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.