

Title:
METHOD FOR DETERMINING MAKE-UP REMOVAL EFFICIENCY
Document Type and Number:
WIPO Patent Application WO/2014/102567
Kind Code:
A1
Abstract:
The invention relates to a method for evaluating make-up removal efficiency comprising the steps of: - Capturing a first image of a given made-up keratinous portion before make-up removal; - Capturing a second image of said keratinous portion after make-up removal; - Dividing each image into a plurality of evaluation areas; - Determining a pixel value for each evaluation area; - Determining a cleaning efficiency (E) by calculating an average of pixel value differences between similar evaluation areas of the first and the second captured images; - Determining a uniformity index (U) by calculating an average of pixel value differences between neighboring evaluation areas within the second captured image. The method of the invention provides reliable and complete information about the make-up removal means.

Inventors:
AGARWAL GAURAV (IN)
Application Number:
PCT/IB2012/057740
Publication Date:
July 03, 2014
Filing Date:
December 27, 2012
Assignee:
L'ORÉAL (FR)
International Classes:
G06T7/00; A45D44/00
Foreign References:
JP2011080915A2011-04-21
DE19846530A12000-04-13
US20110088711A12011-04-21
FR2909871A12008-06-20
Other References:
MARAGOS P: "Morphological correlation and mean absolute error criteria", 19890523; 19890523 - 19890526, 23 May 1989 (1989-05-23), pages 1568 - 1571, XP010082648
LEI JI ET AL: "An Agreement Coefficient for Image Comparison", PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING, vol. 72, no. 7, 1 July 2006 (2006-07-01), pages 823 - 833, XP055076533
Claims:
CLAIMS

1. A method for evaluating make-up removal efficiency comprising the steps of:

- Capturing an image of a keratinous portion after make-up removal; - Dividing said image into a plurality of evaluation areas;

- Determining a pixel value for each evaluation area;

- Determining a uniformity index (U) by calculating an average of pixel value differences between neighboring evaluation areas.

2. A method for evaluating make-up removal efficiency comprising the steps of:

- Capturing a first image of a given made-up keratinous portion before make-up removal;

- Capturing a second image of said given keratinous portion after make-up removal;

- Dividing each image into a plurality of evaluation areas;

- Determining a pixel value for each evaluation area;

- Determining a cleaning efficiency (E) by calculating an average of pixel value differences between similar evaluation areas of the first and the second captured images.

3. The method of claim 2, further comprising the steps of:

- Capturing a reference image of the given keratinous portion before make-up application;

- Dividing the reference image into a plurality of evaluation areas;

- Calculating an average (Emax) of pixel value differences between similar evaluation areas of the reference and the first captured images;

- Determining a percentage of cleaning efficiency (Ε') as a percentage of the average of differences between the first and the second captured images over the average of differences between the reference and the first captured images.

4. The method of any one of the preceding claims, wherein the pixel value for each evaluation area lies between 0 and 255, where value 0 represents a black area and value 255 represents a white area.

5. The method of any one of claims 1 to 4, wherein the images are captured in vitro on dead or reconstructed keratinous material, and wherein the evaluation areas are defined as image pixels of the captured images.

6. The method of any one of claims 1 or 2, wherein the images are captured in vivo on living human keratinous material, and wherein the evaluation areas are defined as quadrants on the captured images.

7. The method of any one of claims 1 to 6, wherein the make-up is applied on skin.

8. The method of any one of claims 1 to 6, wherein the make-up is dye applied on hair.

9. The method of any one of claims 1 to 6, wherein the make-up is polish applied on nails.

Description:
"METHOD FOR DETERMINING MAKE-UP REMOVAL EFFICIENCY"

Field of the invention

The present invention relates to a method for assessing the efficiency of make-up removal.

Background of the invention

Make-up can be removed by manual cleansing or by different electronic devices such as cleansing brushes or by impregnated or dry wipes or pads.

In the context of the present invention, make-up is understood to be facial make-up applied over skin, or hair dye, or nail polish. The present invention addresses the evaluation of the efficiency of removal of such make-up.

One known scientific approach to quantifying the amount of make-up removed by any of the above means is colorimetry. For instance, US 2011/0088711 or FR 2 909 871 disclose the use of colorimetry to check the efficiency of make-up removal compositions. Notably, a test protocol is disclosed to measure colorimetry differences before make-up application and after make-up removal.

The colorimetry test protocol requires that several areas be marked on the skin (typically four areas of 4×4 cm²) and that each area be subjected to three L*a*b* measurements using a colorimeter (for instance a CR300 colorimeter). The made-up skin colorimetry is compared with the bare skin colorimetry to provide a ΔEmax for each measured area, and the cleaned-up skin colorimetry is compared with the made-up skin colorimetry to provide a ΔE for each measured area. The efficiency of make-up removal is then provided as a percentage corresponding to the average of the values (ΔE/ΔEmax)×100. The colorimetry measurement protocol is widely used to quantify the make-up removal efficiency of a composition or of a device such as a wipe or brush. However, this colorimetry protocol does not provide very accurate results.

Moreover, uniformity of make-up removal cannot be quantified using the above-mentioned test protocol.

Summary of the invention

There is a need for a new approach to evaluating the performance of make-up removal means.

In one aspect, the invention relates to an image processing method wherein images of human skin or artificial bio skin are compared before and after make-up removal. More specifically, the invention relates to a method for evaluating make-up removal efficiency comprising the steps of:

- Capturing an image of a keratinous portion after make-up removal;

- Dividing said image into a plurality of evaluation areas;

- Determining a pixel value for each evaluation area;

- Determining a uniformity index by calculating an average of pixel value differences between neighboring evaluation areas.

The invention also relates to a method for evaluating make-up removal efficiency comprising the steps of:

- Capturing a first image of a given made-up keratinous portion before make-up removal;

- Capturing a second image of said given keratinous portion after make-up removal;

- Dividing each image into a plurality of evaluation areas;

- Determining a pixel value for each evaluation area;

- Determining a cleaning efficiency by calculating an average of pixel value differences between similar evaluation areas of the first and the second captured images.

According to an embodiment, the method further comprises the steps of:

- Capturing a reference image of the given keratinous portion before make-up application;

- Dividing the reference image into a plurality of evaluation areas;

- Calculating an average of pixel value differences between similar evaluation areas of the reference and the first captured images;

- Determining a percentage of cleaning efficiency as a percentage of the average of differences between the first and the second captured images over the average of differences between the reference and the first captured images.

According to an embodiment, the pixel value for each evaluation area lies between 0 and 255, where value 0 represents a black area and value 255 represents a white area.

According to an embodiment, the images are captured in vitro on dead or reconstructed keratinous material, and the evaluation areas are defined as image pixels of the captured images.

According to another embodiment, the images are captured in vivo on living human keratinous material, and the evaluation areas are defined as quadrants on the captured images.

According to the invention, the make-up can be applied on skin, or the make-up can be dye applied on hair, or the make-up can be polish applied on nails.

Brief description of the figures

Further features and advantages of the invention will become apparent upon reading the following description and with reference to the appended figures, which show:

- Figure 1 illustrates the image of a skin portion after applying makeup;

- Figure 2 illustrates the image of said skin portion after removing makeup;

- Figure 3 illustrates the image of said skin portion before applying makeup;

- Figure 4 illustrates division of a captured image into quadrants;

- Figure 5 illustrates a quadrant divided image of a skin portion after applying makeup;

- Figure 6 illustrates a quadrant divided image of said skin portion after removing makeup;

- Figure 7 illustrates a quadrant divided image of said skin portion for make-up removal uniformity determination.

Detailed description of the invention

The method for evaluating make-up removal according to the invention relies on image processing. Figures 1 and 2 show images of a skin portion after applying makeup and after removing makeup respectively. Figure 1 is referred to as a first image of a given made-up skin portion before make-up removal and figure 2 is referred to as a second image of said given skin portion after make-up removal.

Each image is a digital image and can be divided into a two-dimensional array of evaluation areas. Each area can be a unique pixel or a group of pixels, depending on the resolution of the image capturing device. In figures 1 and 2, 16 evaluation areas have been defined, but it is understood that more or fewer than 16 areas can be defined while implementing the present invention.

Each area, in each image, is assigned a pixel number (i = 1, ..., n), with n = 16 (n = a × b) in the illustrated example, where a and b are the numbers of rows and columns. Each area is also assigned a pixel value, ai in figure 1 and bi in figure 2 respectively, over a gray scale representing the intensity of light over the given area. In a preferred embodiment, the pixel value lies between 0 and 255, where value 0 represents a black area and value 255 represents a white area.

The make-up removal efficiency of a device or substrate can be quantified by two parameters: a Cleaning efficiency (E), defined as the average of the absolute value of the difference between corresponding pixel values in the first image, after applying makeup, and the second image, after removing the makeup; and a Uniformity Index (U), defined as the average of the absolute value of the difference between neighboring pixels in the image after removing the makeup. A higher value of E represents better performance of the make-up removal means, and a lower value of U represents more uniform removal of makeup.

The Cleaning efficiency (E) may be calculated as follows:

(E) = Σ(i=1..n) |ai - bi| / n (1)

with ai and bi the pixel values of each area (pixel) in the first and second images and n the number of areas (pixels) considered in each image.
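The patent gives no reference implementation, but the cleaning efficiency (the mean absolute difference between corresponding pixel values) maps directly onto array code. A minimal sketch in Python with NumPy; the function name and array layout are my own assumptions:

```python
import numpy as np

def cleaning_efficiency(before, after):
    """Cleaning efficiency E: mean absolute difference between
    corresponding evaluation areas of the before-removal and
    after-removal images (gray-scale values, 0 = black, 255 = white)."""
    a = np.asarray(before, dtype=float)
    b = np.asarray(after, dtype=float)
    return float(np.abs(a - b).mean())
```

A dark made-up area (low pixel values) that becomes light after cleaning (high values) yields a high E, matching the convention that a higher E means better removal.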

The Uniformity Index (U) may be calculated as follows:

(U) = Σ(i=1..(n-1)) |bi - b(i+1)| / ((a - 1) × b) (2)

with bi the pixel values of each area (pixel) in the second image, n the number of areas (pixels) considered in the image, and a and b the numbers of rows and columns.
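Equation (2) can be sketched the same way. Here the neighboring-area differences are taken along rows, matching the horizontal-direction variant discussed later in the description; using the plain mean over all neighbor pairs as the normalization is my assumption:

```python
import numpy as np

def uniformity_index(after):
    """Uniformity Index U: average absolute difference between
    horizontally neighboring evaluation areas of the after-removal
    image. Lower values mean more uniform removal."""
    b = np.asarray(after, dtype=float)
    # differences between each area and its right-hand neighbor
    pair_diffs = np.abs(np.diff(b, axis=1))
    return float(pair_diffs.mean())
```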

In an embodiment, the cleaning efficiency can be further expressed as a percentage of cleaning with respect to bare skin before make-up is applied.

To that purpose, a reference image can be captured, as illustrated in figure 3. The reference image represents the same skin portion as figures 1 and 2 and was taken before make-up was applied, preferably once the skin portion had been cleaned with soap. Similarly to the first and second images (figures 1 and 2), evaluation areas are defined in the reference image (figure 3) and are assigned a pixel number (i = 1, ..., n), with n = 16 in the illustrated example, and a pixel value ci.

An average of pixel value differences between similar evaluation areas of the reference and the first images can thus be calculated as:

(Emax) = Σ(i=1..n) |ci - ai| / n (3)

Cleaning efficiency E' can then be defined as a percentage of the first calculated average over the second calculated average:

E' = (E / Emax) × 100 (4)
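The percentage E' = (E / Emax) × 100 can be sketched under the same assumptions (function and argument names are mine):

```python
import numpy as np

def cleaning_percentage(reference, before, after):
    """E' = (E / Emax) * 100: cleaning efficiency relative to the
    difference between bare skin (reference) and made-up skin."""
    ref = np.asarray(reference, dtype=float)
    a = np.asarray(before, dtype=float)
    b = np.asarray(after, dtype=float)
    e = np.abs(a - b).mean()        # average first/second difference
    e_max = np.abs(ref - a).mean()  # average reference/first difference
    return float(e / e_max * 100.0)
```

With a bare-skin value of 200, made-up value of 50 and cleaned value of 170 everywhere, E = 120 and Emax = 150, giving E' = 80%.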

The images may be captured in vitro on reconstructed skin, in which case the evaluation areas are defined as image pixels of the captured images as described above. The reconstructed skin can be bioskin® or any other artificial skin for evaluation purposes.

When the method of the invention is to be applied, one needs to make sure that the images of the skin portion before and after removing the makeup are captured at exactly the same position, to be able to establish the correspondence between pixels when calculating the Cleaning efficiency (E) and the Uniformity Index (U) as defined above. In other words, the corresponding pixel positions should not change between the first and second images, nor between the first and reference images. This can be quite easily achieved when performing an in vitro test.

The images may also be captured in vivo on living human skin, in which case the evaluation areas are defined as quadrants on the captured image. Indeed, when performing an in vivo test using instruments such as a Chromasphere, it is difficult to capture two images maintaining the exact same position of corresponding pixels.

Therefore, each image can be divided into a number of windows (4, 9, 16, ..., k²), as illustrated in figure 4, to define quadrants that can be substantially equivalent from one image to the other.
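The windowing step can be sketched as cutting the image into a q × q grid of quadrants and reducing each quadrant to the sum of its pixel values, which is the per-quadrant quantity the description later uses for the uniformity calculation. The function name and the even-division requirement are my assumptions:

```python
import numpy as np

def quadrant_sums(image, q):
    """Split a gray-scale image into a q x q grid of quadrants and
    return the sum of pixel values of each quadrant."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    if h % q or w % q:
        raise ValueError("image dimensions must divide evenly into q quadrants")
    # group rows and columns into q blocks, then sum inside each block
    blocks = img.reshape(q, h // q, q, w // q)
    return blocks.sum(axis=(1, 3))
```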

Figures 5 and 6 illustrate the calculation of the two parameters of Cleaning efficiency (E) and Uniformity Index (U) when using the quadrant decomposition of the captured images.

Each image is a digital image and is divided into a two-dimensional array of evaluation areas. Each evaluation area is a quadrant defined on the captured image, and each quadrant includes a group of pixels. In figures 5 and 6, four evaluation areas have been defined, each having 16 pixels, but it is understood that more or fewer than 4 areas can be defined, including more or fewer than 16 pixels, while implementing the present invention.

In each image, each pixel is assigned a pixel number (i = 1, ..., n) and each quadrant is assigned a quadrant number (j = 1, ..., k). Each pixel in each quadrant is also assigned a pixel value (pi, qi, ri, si in the first image and Pi, Qi, Ri, Si in the second image), as defined above with respect to ai and bi.

The Cleaning efficiency (E) may then be calculated as follows:

(E) = Σ(j=1..k) Σ(i=1..n) |pi - Pi| / (n × k) (5)

with pi and Pi the pixel values of each pixel in a given quadrant of the first and second images, n the number of pixels in each quadrant, and k the number of quadrants considered in each image.

When considering the quadrant decomposition of the captured images, the Uniformity Index (U) of the makeup removal can be defined as the average of the differences between the sums of pixel values of neighboring quadrants in the image of the skin captured after removing the makeup, as illustrated in figure 7.

The Uniformity Index (U) may then be calculated as follows:

(U) = Σ(N=1..(k-1)) |BN - B(N+1)| / (k - 1) (6)

with BN the pixel value of each quadrant (defined as the sum of the pixel values of each pixel in the given quadrant) in the second image and k the number of areas (quadrants) considered in the image.
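The quadrant-level uniformity amounts to averaging the absolute differences between neighboring quadrant sums BN; scanning the quadrants in one sequence, and the choice of neighbor-pair count as the normalization, are my assumptions:

```python
import numpy as np

def quadrant_uniformity(b_sums):
    """Uniformity Index over quadrants: average absolute difference
    between neighboring quadrant sums B_N and B_(N+1)."""
    b = np.asarray(b_sums, dtype=float).ravel()
    return float(np.abs(np.diff(b)).mean())
```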

In the above illustrated examples, the uniformity index was calculated in the horizontal direction only (average of the absolute value of the difference between horizontally neighboring pixels). In another embodiment of the method, the uniformity index can also be calculated in the vertical direction (average of the absolute value of the difference between vertically neighboring pixels). In yet another embodiment, an overall uniformity index can be defined as the average of the uniformity indices in the horizontal and vertical directions.
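The overall index described above, the average of the horizontal and vertical uniformity indices, can be sketched as (function name is mine):

```python
import numpy as np

def overall_uniformity(after):
    """Average of the horizontal and vertical uniformity indices of
    the after-removal image (lower = more uniform removal)."""
    b = np.asarray(after, dtype=float)
    u_h = np.abs(np.diff(b, axis=1)).mean()  # horizontal neighbors
    u_v = np.abs(np.diff(b, axis=0)).mean()  # vertical neighbors
    return float((u_h + u_v) / 2.0)
```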

An in vivo example of implementation of the invention is given below, where images of a human cheek were captured and a corresponding area of 200 × 200 pixels was selected on each image for the evaluation.

- an image is taken of a portion of the left cheek of the human before applying make-up;

- an image is taken of a portion of the left cheek of the human after applying make-up;

- an image is taken of a portion of the left cheek of the human after removing make-up with make-up removal wipes;

- an image is taken of a portion of the right cheek of the human before applying make-up;

- an image is taken of a portion of the right cheek of the human after applying make-up;

- an image is taken of a portion of the right cheek of the human after removing make-up with cleansing brushes.

The images were captured using a Chromasphere instrument, which captures digital images of skin under standardized light. A standard foundation and blusher were applied on both the left and right cheeks of the consumer. On the left side, the make-up was removed using a make-up remover wipe comprising a nonwoven substrate and a formulation for removing the make-up. On the right cheek, the make-up was removed with an electronic oscillating cleansing brush and the same formulation as used for the wipes.

The following results were obtained using the quadrant method described in this invention:

The method of the invention allows drawing the following conclusions:

- In the above example, the cleansing efficiency of the cleansing brush is better than that of the cleansing wipes;

- Cleansing brushes remove the make-up more uniformly in the horizontal direction (lower value of the uniformity index);

- Wipes remove the make-up more uniformly in the vertical direction (lower value of the uniformity index);

- Overall, cleansing brushes remove make-up more uniformly than wipes.

The method of the invention considers each micro evaluation area (pixel) and is therefore more reliable when compared to classical colorimetry methods; it also provides additional information about the uniformity of performance of a make-up removal means, which information is not available when using colorimetry.

The method of the invention is also easy to operate and little time consuming. When implementing the method of the invention, one can determine either both parameters, Cleaning efficiency (E) and Uniformity Index (U), or only one of the two. While the invention was described in conjunction with the detailed description above and the illustrated figures, the foregoing description is only intended to illustrate and not to limit the scope of the invention, which is defined by the scope of the appended claims. Other aspects, advantages and modifications are within the claims. Notably, the description was mainly made with respect to skin make-up removal efficiency, but the same approach can be used for any other keratinous portion to which color is applied, such as hair dye or nail polish.