Title:
METHODS AND APPARATUS FOR DETERMINING A COLOUR VALUE OF TEETH
Document Type and Number:
WIPO Patent Application WO/2023/072743
Kind Code:
A1
Abstract:
The invention relates to methods and corresponding apparatus for determining a colour value of one or more teeth. The computer-implemented method comprises receiving an image of teeth and a calibration pattern; identifying one or more teeth from the image using a segmenting model, wherein the segmenting model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining an observed colour of each of the one or more teeth from the image; identifying a plurality of coloured areas of the calibration pattern from the image; determining an observed colour of each of the coloured areas of the calibration pattern; determining a correction model by comparing the observed colour of each of the coloured areas of the calibration pattern with a respective known colour of a corresponding known pattern; and applying the correction model to the observed colour of each of the one or more teeth to determine a colour value of the one or more teeth.

Inventors:
HU WENCHAO (NL)
SUN YIWEN (NL)
WANG SHAN (NL)
YANG HUI (NL)
Application Number:
PCT/EP2022/079331
Publication Date:
May 04, 2023
Filing Date:
October 21, 2022
Assignee:
UNILEVER IP HOLDINGS B V (NL)
UNILEVER GLOBAL IP LTD (GB)
CONOPCO INC DBA UNILEVER (US)
International Classes:
G16H30/40; G16H50/20
Domestic Patent References:
WO2020201623A1 2020-10-08
Foreign References:
US20200246121A1 2020-08-06
CN111462114A 2020-07-28
JP2008149117A 2008-07-03
US20020064751A1 2002-05-30
CN113436734A 2021-09-24
Other References:
HE, KAIMING; GKIOXARI, GEORGIA; DOLLAR, PIOTR; GIRSHICK, ROSS: "Mask R-CNN", ARXIV:1703.06870 [CS], 20 March 2017 (2017-03-20), Retrieved from the Internet
REDMON, JOSEPH; FARHADI, ALI: "YOLOv3: An Incremental Improvement", ARXIV:1804.02767 [CS], 8 April 2018 (2018-04-08), Retrieved from the Internet
FINLAYSON, GRAHAM D.; MACKIEWICZ, MICHAL; HURLBERT, ANYA: "Colour correction using root-polynomial regression", IEEE TRANSACTIONS ON IMAGE PROCESSING, vol. 24, no. 5, 2015, pages 1460-1470, XP011574917, DOI: 10.1109/TIP.2015.2405336
LUO, W.; WESTLAND, S.; BRUNTON, P.; ELLWOOD, R.; PRETTY, I.A.; MOHAN, N.: "Comparison of the ability of different colour indices to assess changes in tooth whiteness", J. DENT., vol. 35, 2007, pages 109-116, XP005824683, DOI: 10.1016/j.jdent.2006.06.006
DEL MAR PEREZ, M.; GHINEA, R.; RIVAS, M.J.; YEBRA, A.; IONESCU, A.M.; PARAVINA, R.D.; HERRERA, L.J.: "Development of a customized whiteness index for dentistry based on CIELAB colour space", DENT. MATER., vol. 32, 2016, pages 461-467, XP029423422, DOI: 10.1016/j.dental.2015.12.008
Attorney, Agent or Firm:
TANSLEY, Sally, Elizabeth (NL)
Claims:
CLAIMS

1. A computer-implemented method (200) for determining a colour value of one or more teeth comprising: receiving (202) an image of teeth and a calibration pattern; identifying (204) one or more teeth from the image using a segmenting model, wherein the segmenting model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining (206) an observed colour of each of the one or more teeth from the image; identifying (208) a plurality of coloured areas of the calibration pattern from the image; determining (210) an observed colour of each of the coloured areas of the calibration pattern; determining (212) a correction model by comparing the observed colour of each of the coloured areas of the calibration pattern with a respective known colour of a corresponding known pattern; and applying the correction model to the observed colour of each of the one or more teeth to determine (214) a colour value of the one or more teeth.

2. The method (200) according to claim 1, wherein the colour value provides an indication of a level of whiteness of the one or more teeth.

3. The method (200) according to claim 1 or claim 2, wherein the colour value provides an indication of a median colour for all of the one or more teeth.

4. The method (200) according to any of the preceding claims, wherein the method comprises determining a chemical treatment in accordance with the colour value of the one or more teeth.

5. The method (200) according to claim 4, wherein the determined chemical treatment relates to recommending a tooth whitening product to teeth in accordance with the colour value of the one or more teeth.

6. The method (200) according to any of the preceding claims, wherein the calibration pattern is positioned adjacent to a mouth containing the one or more teeth in the image.

7. The method (200) according to any of the preceding claims, wherein the image comprises at least 8 teeth.

8. The method (200) according to any of the preceding claims, wherein the images are received from a camera of a user device.

9. The method (200) according to any of the preceding claims, wherein the method is provided by a software application, preferably the software application is a WeChat Mini Program, a Taobao Mini Program or an App for a mobile device.

10. A method performed by a user to determine a colour value of one or more teeth using a user device, wherein the user: holds a calibration pattern adjacent to the user's mouth; bares one or more teeth to a camera of the user device; and operates the user device to perform the method (200) according to any of the preceding claims.

11. A computer-implemented method for training a segmenting model to identify teeth, wherein the segmenting model is implemented by a machine learning algorithm, the method comprising providing the segmenting model with training data comprising a plurality of annotated images of teeth in which the teeth have been manually identified.

12. A computer-readable medium comprising non-transitory computer program code configured to cause a processor to execute the method (200) according to any of the preceding claims.

13. A mobile computing device comprising: the computer readable medium according to claim 12; a processor; and a camera for obtaining the image of teeth and the calibration pattern.

14. A tooth whitening kit comprising: a tooth whitening product; and a calibration pattern for use in the method according to any one of claims 1 to 10.

Description:
METHODS AND APPARATUS FOR DETERMINING A COLOUR VALUE OF TEETH

Field of the Invention

This invention relates to methods and apparatus for determining a colour value of one or more teeth and in particular, for user self-assessment of tooth whiteness.

Background of the Invention

Consumers have always had a strong desire for healthy, white teeth. Though many people are concerned about their tooth whiteness (which may also be referred to as teeth whiteness), it is difficult to assess the level of tooth whiteness by the naked eye. Traditionally, consumers needed to visit a dentist to learn their tooth whiteness level. A professional dentist uses a set of tooth samples in a tooth whiteness evaluation method known as the VITA Bleachedguide 3D-MASTER®. The dentist may hold individual samples from the set of dental samples adjacent to a patient's tooth in order to find a sample that appears to be of a similar colour to the patient's tooth. Advantages of such a tooth categorization scheme include that the evaluation of tooth colour is performed by a trained specialist and that the scheme uses a validated tool. Disadvantages include that it needs a standard environment and requires consumers to visit a dental clinic in person. In addition, the desire for whiter teeth has given rise to a growing trend in the use of tooth whitening products, which range from toothpastes to mouthwashes and chewing gums. However, it is not easy to track changes in one's tooth colour after using such products.

Some tooth whitening products provide a printed colour calibration card for consumers to measure their tooth whiteness. Conventional colour calibration cards usually contain images of teeth of various shades. A user may hold the colour calibration card next to their mouth to allow another person to assess which image appears to be most similar to the teeth of the user, or the user may view their own teeth at the same time as the images in a mirror. Such colour calibration cards are not validated standard tools. Also, the evaluation accuracy could be compromised due to the printing quality of the card and the accuracy of the user’s visual evaluation.

Various aspects of the invention address problems encountered in prior art methods of tooth colour evaluation.

Summary of the Invention

According to a first aspect of the invention there is provided a computer-implemented method for determining a colour value of one or more teeth, the method comprising: receiving an image of teeth and a calibration pattern; identifying one or more teeth from the image using a segmenting model, wherein the segmenting model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining an observed colour of each of the one or more teeth from the image; identifying a plurality of coloured areas of the calibration pattern from the image; determining an observed colour of each of the coloured areas of the calibration pattern; determining a correction model by comparing the observed colour of each of the coloured areas of the calibration pattern with a respective known colour of a corresponding known pattern; and applying the correction model to the observed colour of each of the one or more teeth to determine a colour value of the one or more teeth.

The colour value may provide an indication of a level of whiteness of the one or more teeth. The colour value of the one or more teeth may be a single value associated with a plurality of teeth. The method may comprise determining a colour value of each of the one or more teeth by applying the correction model to the observed colour of each of the one or more teeth.

The observed colour of the teeth or the colour value may comprise an indication of a colour, tone or shade of the one or more teeth.

The coloured areas of the calibration pattern may also be referred to as areas of colour, or swatches. The areas of the calibration pattern may be respective homogeneous areas of a single colour, including areas of a particular tone or shade.

The method may comprise selecting a central portion of each of the one or more teeth. Each central portion may be surrounded by a peripheral portion. Each central portion may comprise at least 60%, preferably from 65% to 95%, more preferably from 70% to 90% and even more preferably from 75% to 85% of the visible area of a respective tooth within the image. A colour value of each of the one or more teeth may be associated with a respective central portion.

The method may comprise determining a chemical treatment in accordance with the colour value of the one or more teeth. The chemical treatment may be determined by entering the colour value in a look-up table of chemical treatments. The determined chemical treatment may relate to applying a tooth whitening product to teeth in accordance with the colour value. Preferably, the tooth whitening product is applied to the teeth for a period of time. The determined chemical treatment may relate to recommending a tooth whitening product for teeth in accordance with the colour value. Preferably, the tooth whitening product is for producing a change in colour value of the teeth.

The method may comprise storing the colour value with an associated date-stamp or timestamp.

The images may be received from a camera of a user device. The method may be performed by a processor of the user device. The method may be performed remotely from the user device. The method may be performed by a computer server.

According to a second aspect there is provided a method performed by a user to determine a colour value of one or more teeth using a user device, wherein the user: holds a calibration pattern adjacent to the user’s mouth; bares one or more teeth to a camera of the user device; and operates the user device to perform the method of the first aspect.

According to a third aspect there is provided a computer-implemented method for training a segmenting model to identify teeth. The segmenting model may be trained to identify individual teeth, for example, on a tooth-by-tooth basis. The method may comprise providing the segmenting model with training data. The training data may comprise a plurality of annotated images of teeth in which the teeth have been manually identified.

According to a fourth aspect there is provided a computer-readable medium comprising non-transitory computer program code configured to cause a processor to execute any of the above methods.

According to a fifth aspect there is provided a mobile computing device comprising: a processor; a camera for obtaining an image of teeth and a calibration pattern; and the computer readable medium according to the fourth aspect. The mobile computing device may be a user’s portable computing device, such as a laptop computer, tablet (e.g. Apple® iPad®) or smart phone.

According to a sixth aspect, there is provided a tooth whitening kit comprising: a tooth whitening product; and a calibration pattern for use in the method according to the first and/or second aspects described above.

The tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method according to the first and/or second aspects described above, such as by providing a quick response (QR) code, a uniform resource locator (URL) or details of the program's name in an App Store.

According to a seventh aspect of the invention there is provided a data processing unit configured to perform any method described herein as computer-implementable. The data processing unit may comprise one or more processors and memory, the memory comprising computer program code configured to cause the processor to perform any method described herein.

There may be provided a computer program, which when run on a computer, causes the computer to configure any apparatus, including a circuit, unit, device or system disclosed herein to perform any method disclosed herein. The computer program may be a software implementation. The computer may comprise appropriate hardware, including one or more processors and memory that are configured to perform the method defined by the computer program.

The computer program may be provided on a computer readable medium, which may be a physical computer readable medium such as a disc or a memory device, or may be embodied as a transient signal. Such a transient signal may be a network download, including an internet download. The computer readable medium may be a computer readable storage medium or non-transitory computer readable medium.

The skilled person will appreciate that except where mutually exclusive, a feature or parameter described in relation to any one of the above aspects may be applied to any other aspect. Furthermore, except where mutually exclusive, any feature or parameter described herein may be applied to any aspect and/or combined with any other feature or parameter described herein.

Brief Description of the Figures

Figure 1 illustrates a schematic block diagram of a computer system;

Figure 2 illustrates a flow chart of a method for determining the colour value of one or more teeth;

Figure 3 illustrates an image of a user’s mouth with areas of individual teeth identified using a tooth segmentation model.

Detailed Description of the Invention

Except in the examples, or where otherwise explicitly indicated, all numbers in this description indicating amounts of material or conditions of reaction, physical properties of materials and/or use may optionally be understood as modified by the word “about”.

It should be noted that in specifying any ranges of values, any particular upper value can be associated with any particular lower value.

For the avoidance of doubt, the word “comprising” is intended to mean “including” but not necessarily “consisting of” or “composed of”. In other words, the listed steps or options need not be exhaustive.

The disclosure of the invention as found herein is to be considered to cover all embodiments as found in the claims as being multiply dependent upon each other irrespective of the fact that claims may be found without multiple dependency or redundancy.

Where a feature is disclosed with respect to a particular aspect of the invention (for example a composition of the invention), such disclosure is also to be considered to apply to any other aspect of the invention (for example a method of the invention) mutatis mutandis.

Figure 1 illustrates a schematic block diagram of a computer system 100 which may be used to implement the methods described herein. The system may typically be provided by a user device, such as a laptop computer, tablet computer or smart phone.

The system 100 comprises one or more processors 102 in communication with memory 104. The memory 104 is an example of a non-transitory computer readable storage medium. The one or more processors 102 are also in communication with one or more input devices 106 and one or more output devices 108. In addition to the one or more input devices 106, the processor is in communication with a camera 110 for obtaining one or more images. The various components of the system 100 may be implemented using generic computing means known in the art. For example, the input devices 106 may comprise a keyboard or mouse, or a touch screen interface, and the output devices 108 may comprise a monitor or display, or an audio output device such as a speaker.

Figure 2 illustrates a method 200 for determining a colour value of one or more teeth. The method 200 may be implemented by computing means. The method 200 comprises steps that may be performed by a processor, either locally at a user device or remotely at a server. Preferably, the computer-implemented method is provided by a software application such as a WeChat Mini Program, Taobao Mini Program or App for a mobile device.

Various steps of the method 200 may be performed in a variety of orders and are not necessarily performed in the sequence given below. In particular, although some steps in Figure 2 are illustrated as being performed in parallel, alternatively the steps may be performed in series, and vice versa.

The method 200 comprises receiving 202 an image of teeth and a calibration pattern. Preferably, the image comprises at least 8 teeth. One or more teeth are identified 204 from the image using a segmenting model. Preferably, the segmentation model is configured to recognize individual teeth and associate a region of the image with each individual tooth. The identification of individual teeth, as opposed to groups of teeth, has been found to improve the accuracy of the colour determination. The segmentation model may be implemented using a trained machine learning system, for example.

Figure 3 illustrates an image 300 of a user’s mouth with areas of individual teeth identified using a tooth segmentation model. The respective identified teeth have been marked by homogeneous masked regions 301-312 in this image 300.

Returning to Figure 2, an observed colour of each of the one or more teeth is determined 206 from the image. The observed colour may be taken to be the raw colour of a pixel or a plurality of pixels associated with a particular tooth. For example, one or more pixels at a central portion of an image of the tooth may be used to avoid shadow effects towards the edges of the teeth. In one example, at least 10% of the segmentation result for a single tooth (its peripheral region) is removed from the overall tooth colour calculation. In such an example, the central region may occupy up to 85% or 90% of the region of the tooth, and may exclude the peripheral region.
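By way of illustration only, the central-portion selection described above might be implemented as in the following minimal sketch, assuming a binary mask per tooth from the segmentation model; the function name and the iterative-erosion strategy are illustrative assumptions, not code disclosed in the application.

```python
import cv2
import numpy as np

def central_tooth_colour(image_bgr, tooth_mask, keep_fraction=0.85):
    """Median colour of the central portion of one segmented tooth.

    image_bgr:     H x W x 3 uint8 image.
    tooth_mask:    H x W uint8 binary mask (255 inside the tooth).
    keep_fraction: fraction of the tooth area retained after eroding the
                   mask towards its centre (e.g. 0.85 to 0.90, per the text).
    """
    area = cv2.countNonZero(tooth_mask)
    if area == 0:
        return None
    kernel = np.ones((3, 3), np.uint8)
    central = tooth_mask.copy()
    # Peel off the shadow-prone peripheral band until only the
    # central keep_fraction of the original tooth area remains.
    while cv2.countNonZero(central) > keep_fraction * area:
        eroded = cv2.erode(central, kernel)
        if cv2.countNonZero(eroded) == 0:
            break
        central = eroded
    pixels = image_bgr[central > 0]       # N x 3 BGR values
    return np.median(pixels, axis=0)      # per-channel median colour
```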

Using the image of the calibration pattern, a plurality of observed areas of colour of the calibration pattern are identified 208 and a colour of each of the areas of colour is determined 210. The calibration pattern corresponds to a known pattern and comprises a plurality of areas of different colours. Each area of colour on the calibration pattern sheet (and the known calibration pattern) may be a homogeneous area of a single colour, tone or shade, although due to the effects of lighting the areas of colour will not necessarily appear to be homogeneous in the image.

A correction model is determined 212 by comparing the observed colour of each of the areas of colour of the calibration pattern with a respective known colour of a corresponding known pattern. The known pattern contains the colours that would be expected to be observed in the calibration pattern if it were viewed under specific lighting conditions by a known device. For example, by comparing an observed red with a corresponding known red, and doing the same for a green area and a blue area, a substantial amount of information is available on the difference between the observed colour in the image and the actual known colour. A difference value for each of the areas of colour can be obtained. The difference values may be used to obtain a correction model using conventional colour filtering algorithms. The correction model may be configured to provide a mapping between observed colours and corrected colours (thereby accounting for the ambient lighting conditions or camera settings, for example), which may be applied to other parts of the same image. In this way, a colour value of the one or more teeth may be determined 214 based on the observed colour of each of the one or more teeth after adjustment using the correction model. The colour value may provide an indication of a level of whiteness of a single tooth or a plurality of teeth. For example, the colour value may provide an indication of an average (e.g. mean or median) colour for all of the one or more teeth that are visible. Preferably the colour value provides an indication of a median colour for all of the one or more teeth. In some examples, the colour value may be a definition of a colour in a recognised colour space, such as CIELAB. In other examples, the colour value may provide a whiteness score or a whiteness rating, which may be on an arbitrary scale.
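As an illustrative sketch of determining and applying such a correction model, the degree-2 root-polynomial regression of the cited Finlayson et al. (2015) paper can be fitted by ordinary least squares from the observed and known swatch colours. The function names and the N x 3 array layout below are assumptions made for illustration; at least six swatches are needed for the degree-2 fit.

```python
import numpy as np

def root_poly_expand(rgb):
    """Degree-2 root-polynomial expansion of N x 3 non-negative RGB values
    (Finlayson et al., 2015): [R, G, B, sqrt(RG), sqrt(GB), sqrt(RB)]."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack([r, g, b,
                     np.sqrt(r * g), np.sqrt(g * b), np.sqrt(r * b)], axis=1)

def fit_correction_model(observed_swatches, known_swatches):
    """Least-squares 6 x 3 matrix M mapping expanded observed swatch
    colours onto their known reference colours."""
    X = root_poly_expand(np.asarray(observed_swatches, dtype=float))
    Y = np.asarray(known_swatches, dtype=float)
    M, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return M

def correct_colours(observed, M):
    """Apply the fitted correction model to N x 3 observed colours."""
    return root_poly_expand(np.asarray(observed, dtype=float)) @ M
```

The same matrix M, fitted on the swatches, is then applied to the observed tooth colours from the same image.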

A user may determine a colour value of one or more teeth using a user device by holding a calibration pattern adjacent to the user's mouth, baring one or more teeth to a camera of the user device, and operating the user device to perform the method 200. That is, the user may use their own photo-taking device, such as a smartphone, to take a selfie, then process the image using a program to get a tooth whiteness result instantly. In everyday scenarios, photos of teeth are not taken under consistent, standard lighting conditions, and the images obtained from various smart devices are not normalized for colour identification purposes. In order to overcome these limitations, the calibration pattern allows colour calibration and white balancing on the teeth regions.

In some examples, the invention may replace the traditional tooth whiteness evaluation method (dentist scoring) with detecting tooth whiteness from photos taken by a mobile device, for example. Consumers can use the invention to know their tooth whiteness at any time and place. Time can be saved by avoiding the need to visit a dentist for the purpose of tooth colour evaluation. Consumers may use the method at home for tracking the tooth whitening effects of products like whitening strips and whitening emulsions. The resulting score may be available immediately.

The above method is configured to allow a user to determine their own tooth whiteness, for example. As such, it may be convenient to provide access to the method 200 alongside a tooth whitening product, so that the method 200 can be applied to determine the result of using the product.

In one example, there is provided a tooth whitening kit comprising a tooth whitening product and a calibration pattern for use in the method 200. The tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method 200. For example, the box of the kit may carry a code such as a QR code, a URL for accessing or obtaining the computer program, or details of the program's name in an App Store. The code or information for accessing the computer program may also be provided on the tooth whitening product container, the instructions, or as part of the calibration pattern.

In some examples, the method 200 may further comprise providing a personalized product recommendation based on a user's colour value. In such examples, an appropriate chemical treatment is determined in accordance with the colour value determined for the one or more teeth. An appropriate treatment may be determined by entering the colour value in a look-up table of chemical treatments, as sketched below. The determined chemical treatment may relate to applying a tooth whitening product, such as a specific formulation of a tooth whitening agent, to teeth in accordance with the colour value. Preferably, the tooth whitening product is applied to the teeth for a period of time. Alternatively, the determined chemical treatment may relate to recommending a tooth whitening product in accordance with the colour value. Preferably, the tooth whitening product is for producing a change in colour value of the teeth.
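A minimal sketch of such a look-up table follows; the score bands and treatment descriptions are invented for illustration and are not taken from the application.

```python
# Hypothetical look-up table mapping whiteness-score bands to treatments.
# Thresholds and product categories below are illustrative only.
TREATMENT_TABLE = [
    (25.0, "maintenance whitening toothpaste"),
    (15.0, "whitening toothpaste plus periodic whitening strips"),
    (float("-inf"), "a course of whitening strips or whitening emulsion"),
]

def recommend_treatment(whiteness_score):
    """Return the treatment for the first band containing whiteness_score."""
    for threshold, treatment in TREATMENT_TABLE:
        if whiteness_score >= threshold:
            return treatment
```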

The method may be used to track the progress of a tooth whitening treatment over time. For example, a user may wish to see how a particular product has affected their teeth over a period of use, such as a number of days or weeks. To facilitate such comparisons, the method 200 may further comprise storing the colour value with an associated date-stamp or time-stamp. The aggregated data from a period of use may be stored in a database and the software may be configured to display the data to the user in the form of a table or graph, for example.
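By way of illustration, storing the date-stamped colour values might look like the sketch below, assuming a local SQLite database; the table schema is hypothetical.

```python
import sqlite3
from datetime import datetime, timezone

def store_colour_value(db_path, user_id, whiteness_score):
    """Append a time-stamped whiteness score so progress can be tabulated
    or graphed over a period of use."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS whiteness_history "
        "(user_id TEXT, recorded_at TEXT, score REAL)"
    )
    con.execute(
        "INSERT INTO whiteness_history VALUES (?, ?, ?)",
        (user_id, datetime.now(timezone.utc).isoformat(), whiteness_score),
    )
    con.commit()
    con.close()
```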

There is also provided a computer-implemented method for training a segmenting model to identify teeth, wherein the segmenting model is implemented by a machine learning algorithm, the method comprising providing the segmenting model with training data comprising a plurality of annotated images of teeth in which the teeth have been manually identified. Preferably, the segmenting model is trained to identify individual teeth, for example, on a tooth-by-tooth basis.

A teeth segmentation model of a tooth whiteness detection algorithm based on a deep neural network may be trained by: a. taking sample photos of teeth; b. labelling each individual tooth in the sample photos (that is, the areas of teeth may be identified manually); and c. training the deep-learning tooth-by-tooth segmentation model on the sample photos with teeth labelling.

Preferably, the sample photos of teeth are taken with an abundant and uniformly distributed range of tooth colours. Preferably, the labelling is polygonal labelling.
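One possible realisation of this training procedure, assuming PyTorch and torchvision and following the standard torchvision fine-tuning recipe with two classes (background and tooth), is sketched below. The data loader of polygon-annotated photos is assumed to exist, and the hyperparameters are illustrative.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

def build_tooth_segmentation_model(num_classes=2):
    """Mask R-CNN with heads re-sized for two classes: background and tooth."""
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_feat = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feat, num_classes)
    in_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_mask, 256, num_classes)
    return model

def train(model, train_loader, epochs=10, lr=1e-4, device="cuda"):
    """Fine-tune on (images, targets) pairs of manually annotated tooth photos."""
    model.to(device).train()
    optimiser = torch.optim.AdamW(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, targets in train_loader:
            images = [img.to(device) for img in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
            loss = sum(model(images, targets).values())  # sum of loss components
            optimiser.zero_grad()
            loss.backward()
            optimiser.step()
```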

A colour calibration card detection model of a tooth whiteness detection algorithm based on a deep neural network may be trained by: a. taking sample photos of the colour calibration card; b. labelling the areas of the colour calibration card in the taken photos; and c. training a deep-learning object detection model on the sample photos with labelling. Preferably, the sample photos of the colour calibration card are taken at different brightnesses, colour temperatures, shooting angles and distances. More preferably, the sample photos are taken at a resolution of more than 800p.

Preferably, the areas of colour calibration card in the taken photos are labelled with rectangular boxes.

Examples

The following example is to facilitate an understanding of the invention. The example is not intended to limit the scope of the claims.

An implementation of a method in accordance with that described previously with reference to Figure 2 is provided below. The inventors have identified that in order to ensure the reliability of a tooth whiteness evaluation, various technical challenges may be overcome. Tooth-by-tooth segmentation may be used to precisely identify a region of interest and reduce distraction from other areas of the mouth.

In one example, the following steps are taken by a user:

i) The user takes a photo of their teeth and of a colour calibration card held by the user, through an application on the user's personal device, such as a smartphone or tablet. The colour calibration card comprises a plurality of swatches.

ii) An image of all visible teeth is obtained by segmenting each individual tooth through a pretrained tooth-by-tooth segmentation model.

iii) The colour calibration card and the colour of each swatch are detected through a pretrained colour calibration card (and colour swatch) detection model.

iv) The colour of each individual tooth in the photo is calibrated from the change in colour and brightness of the swatch matrix in the colour calibration card, and the colour (converted to the CIELAB colour space for comparison with Vita shades) is converted into a dental standard whiteness score.
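A high-level skeleton of how an application might chain these four steps is sketched below. Each callable passed in (segment_teeth, detect_swatches, fit_correction, tooth_colour, whiteness_score) is an assumed interface for illustration, not code disclosed in the application.

```python
def evaluate_photo(image, known_swatch_colours, segment_teeth, detect_swatches,
                   fit_correction, tooth_colour, whiteness_score):
    """Chain the four steps; the implementation of each step is injected.

    Returns one whiteness score per detected tooth.
    """
    tooth_masks = segment_teeth(image)                 # step ii: per-tooth masks
    observed_swatches = detect_swatches(image)         # step iii: swatch colours
    correct = fit_correction(observed_swatches,        # step iv: calibration model
                             known_swatch_colours)
    scores = []
    for mask in tooth_masks:
        observed = tooth_colour(image, mask)           # observed tooth colour
        calibrated = correct(observed)                 # corrected colour
        scores.append(whiteness_score(calibrated))     # e.g. WID from CIELAB
    return scores
```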

Various aspects of the above four steps are described in further detail below.

In the first step, when a user takes a photo of their teeth from a front angle, the user should show their teeth to the camera, bring the colour calibration card close to the teeth, reduce shadows, and ensure that the effect of light on the teeth and the colour calibration card is consistent (for example, the same light intensity and angle), to reduce the error caused by relative colour changes between the teeth and the colour calibration card. Typically, the image will comprise at least 8 teeth. The application can import an image or directly take a photo of the user while the user exposes their teeth along with a standard colour calibration card.

In the second step, after the image is obtained, masked regions of teeth are extracted from the image. This step does not necessarily use information regarding facial details, nor does it include gum regions.

A Mask R-CNN model may be used to train the tooth-by-tooth segmentation model used in the second step. Details regarding the implementation of such a model can be found in He, Kaiming, Georgia Gkioxari, Piotr Dollar, and Ross Girshick, "Mask R-CNN", arXiv:1703.06870 [cs], March 20, 2017, http://arxiv.org/abs/1703.06870.

In one example, the dataset used for training the model included 1,500 teeth photos taken from 100 users to cover different colour intervals of teeth. Users may choose to train the model using more or fewer images, or select deep neural network models other than Mask R-CNN.

In the third step, the colour calibration card and all colour swatches in the card are detected, to obtain the area of the colour calibration card and the colour of each swatch, through the pretrained colour calibration card (and colour swatch) detection model.

A YOLOv3 model may be used to label photos of the colour calibration card and to train the colour calibration card (and colour swatch) detection model for use in the third step. Details regarding the implementation of such a model can be found in Redmon, Joseph, and Ali Farhadi, "YOLOv3: An Incremental Improvement", arXiv:1804.02767 [cs], April 8, 2018, http://arxiv.org/abs/1804.02767.

1000 photos of the calibration card were used as a training dataset in one example. After the colour calibration card area is extracted, an OpenCV (https://opencv.org/) based model may be used to extract a numerical colour matrix of swatch colour values according to the relative arrangement of the swatches. Users may choose other deep neural network models, such as Mask R-CNN, to extract the colour patch matrix on the colour calibration card directly or indirectly, with or without OpenCV. A colour calibration and white balancing model is obtained in a conventional way using the obtained numerical colour matrix with known, true values for the colour matrix. The invention does not necessarily impose limitations on the type of calibration card or the number of colour patches on the card, as long as the colour range covers a wide range of, or all, colours and whiteness.
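As an illustration of this extraction step, the sketch below samples the mean colour of an inner region of each detected swatch rectangle to build the numerical colour matrix; the (x, y, w, h) box format and the pre-sorted card order are assumptions.

```python
import numpy as np

def swatch_colour_matrix(image_bgr, boxes):
    """Mean colour of each detected swatch, in the card's row-major order.

    boxes: list of (x, y, w, h) rectangles from the swatch detector,
           assumed pre-sorted into the card's layout order.
    """
    colours = []
    for (x, y, w, h) in boxes:
        # Sample the inner half of each swatch to avoid border bleed.
        patch = image_bgr[y + h // 4 : y + 3 * h // 4,
                          x + w // 4 : x + 3 * w // 4]
        colours.append(patch.reshape(-1, 3).mean(axis=0))
    return np.array(colours)   # N x 3 observed colour matrix
```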

In the fourth step, the colour calibration and white balancing model is applied to each individual tooth area identified in the second step to obtain the true tooth colour after colour rendition, and the colour is converted into a whiteness score for each tooth according to dental standards. For example, the Finlayson 2015 colour calibration method (Finlayson, Graham D., Michal Mackiewicz, and Anya Hurlbert, "Colour correction using root-polynomial regression", IEEE Transactions on Image Processing 24.5 (2015): 1460-1470) may be applied to each tooth, then the RGB colour values may be extracted and converted to CIELAB colour values for all pixels.

To finally obtain tooth colour values, the lighting conditions on different teeth are accounted for. Dark shading on lateral teeth and white reflections on front teeth are noise sources that should be removed from the colour calculation. To increase the overall colour extraction accuracy and consistency, teeth regions were sliced from the image and eroded by 15% of their individual area close to their own contours, because the lateral sides of teeth tend to be relatively underexposed. That is, the outer 15% of each tooth region was removed. In a stricter case, the darkest and brightest 10% of values from the CIELAB colour ranges were also removed. The median of each tooth's resulting colour values may be used as representative, as in the sketch below.
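A minimal sketch of this noise-removal step follows, assuming an already-eroded central mask per tooth (as in the earlier sketch) and OpenCV's 8-bit CIELAB encoding; the function name is illustrative.

```python
import cv2
import numpy as np

def tooth_lab_value(image_bgr, central_mask, trim_frac=0.10):
    """CIELAB median of a tooth's eroded central region, discarding the
    darkest and brightest trim_frac of pixels by lightness as noise."""
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB).astype(np.float64)
    pixels = lab[central_mask > 0]
    lo, hi = np.quantile(pixels[:, 0], [trim_frac, 1.0 - trim_frac])
    kept = pixels[(pixels[:, 0] >= lo) & (pixels[:, 0] <= hi)]
    L, a, b = np.median(kept, axis=0)
    # Undo OpenCV's 8-bit scaling: L in [0, 255] maps to L* in [0, 100];
    # a and b are offset by 128.
    return L * 100.0 / 255.0, a - 128.0, b - 128.0
```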

The colour value may provide a CIE whiteness index (WIO: W. Luo, S. Westland, P. Brunton, R. Ellwood, I.A. Pretty, N. Mohan, "Comparison of the ability of different colour indices to assess changes in tooth whiteness", J. Dent. 35 (2007) 109-116) and a CIELAB-based whiteness index (WID: M. del Mar Perez, R. Ghinea, M.J. Rivas, A. Yebra, A.M. Ionescu, R.D. Paravina, L.J. Herrera, "Development of a customized whiteness index for dentistry based on CIELAB colour space", Dent. Mater. 32 (2016) 461-467) based on their corresponding formulae, provided below. The closest colour on the Vita Shade Guide by Euclidean distance of LAB representatives (Delta E) may also be provided as the colour value.

WIO = Y + 1075.012(x_n − x) + 145.516(y_n − y)    (1)

where (x, y) and (x_n, y_n) are the chromaticity coordinates of the sample and the reference white respectively.

WID = 0.511L* − 2.324a* − 1.100b*    (2)

where L*, a* and b* are the CIELAB colour coordinates for lightness, green-red and blue-yellow respectively.
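The two indices can be computed directly from formulae (1) and (2). In the sketch below the D65 reference white chromaticity (0.3127, 0.3290) is assumed for (x_n, y_n); substitute the reference white actually used if it differs.

```python
def wio(Y, x, y, x_n=0.3127, y_n=0.3290):
    """CIE whiteness index for teeth, formula (1) (Luo et al., 2007)."""
    return Y + 1075.012 * (x_n - x) + 145.516 * (y_n - y)

def wid(L_star, a_star, b_star):
    """CIELAB-based whiteness index for dentistry, formula (2)
    (Perez et al., 2016)."""
    return 0.511 * L_star - 2.324 * a_star - 1.100 * b_star
```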

The colour value may also provide a whiteness score or a whiteness rating, which may be on an arbitrary scale.