


Title:
SYSTEM FOR MATCHING COLOR AND APPEARANCE OF COATINGS CONTAINING EFFECT PIGMENTS
Document Type and Number:
WIPO Patent Application WO/2013/049796
Kind Code:
A1
Abstract:
The present invention is directed to a system for matching color and appearance of a target coating of an article. The system comprises a color measuring device; a sparkle measuring device; a color database; a computing device; and a computer program product that causes the computing device to perform a computing process that uses sparkle values of the target coating, color data of the target coating, and flop values based on the color data to identify and select matching formulas based on sparkle differences (ΔSg), flop value differences (ΔF), and color difference indexes (CDI). The system can be used for matching color and appearance of target coatings having effect pigments. The system can be particularly useful for vehicle refinish repairs.

Inventors:
PRAKASH ARUN (US)
STEENHOEK LARRY EUGENE (US)
MOHAMMADI MAHNAZ (US)
RODRIGUES ALLAN BLASE JOSEPH (US)
OBETZ JUDITH ELAINE (US)
Application Number:
PCT/US2012/058258
Publication Date:
April 04, 2013
Filing Date:
October 01, 2012
Assignee:
DU PONT (US)
International Classes:
G06F19/00; B60S5/00; G06F17/30
Foreign References:
US8065314B22011-11-22
US20040252308A12004-12-16
US20050128484A12005-06-16
Attorney, Agent or Firm:
XU, Gann (Legal Patent Records Center, 4417 Lancaster Pike, Wilmington, Delaware, US)
Claims:
CLAIMS

What is claimed is:

1. A system for matching color and appearance of a target coating of an article, said system comprising:

a) a color measuring device;

b) a sparkle measuring device;

c) a color database comprising formulas for coating compositions and interrelated sparkle characteristics, color characteristics, and one or more identifiers of articles;

d) a computing device comprising an input device and a display device, said computing device is functionally coupled to said color measuring device, said sparkle measuring device, and said color database; and

e) a computer program product residing in a storage media functionally coupled to said computing device, said computer program product causes said computing device to perform a computing process comprising the steps of:

C1) receiving specimen sparkle values of the target coating from said sparkle measuring device, said specimen sparkle values are measured at one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof;

C2) receiving specimen color data of the target coating from said color measuring device, said specimen color data are measured at two or more color viewing angles, one or more illumination angles, or a combination thereof;

C3) receiving an identifier of said article from said input device;

C4) generating specimen flop values based on said specimen color data;

C5) retrieving from said color database one or more preliminary matching formulas based on said specimen color data, said identifier of said article, or a combination thereof;

C6) generating one or more sparkle differences (ΔSg) between sparkle characteristics of each of said preliminary matching formulas and said specimen sparkle values at each of said one or more sparkle viewing angles;

C7) generating one or more flop differences (ΔF) between flop characteristics derived from color characteristics of each of said preliminary matching formulas and said specimen flop values;

C8) generating one or more color difference indexes (CDI) between said specimen color data and color characteristics of each of said preliminary matching formulas; and

C9) producing a ranking list of said preliminary matching formulas based on said sparkle differences (ΔSg), said flop differences (ΔF), and said color difference indexes (CDI).

2. The system of claim 1, wherein said computing process further comprises a ranking process for producing said ranking list, said ranking process comprises the steps of:

B1) grouping said one or more preliminary matching formulas into one or more category groups based on said sparkle differences (ΔSg) and said flop differences (ΔF) according to predetermined ranges of ΔSg values and ΔF values; and

B2) ranking the preliminary matching formulas in each of the category groups based on said color difference index (CDI).

3. The system of claim 1, wherein said computing process further comprises the steps of:

C10) displaying on said display device said ranking list, one or more preliminary matching formulas based on predetermined values of sparkle differences, flop differences, or color difference indexes, said sparkle differences (ΔSg), said flop differences (ΔF), said color difference indexes (CDI), or a combination thereof;

C11) receiving a selection input from said input device to select one or more matching formulas from said ranking list; and

C12) displaying said one or more matching formulas on said display device.

4. The system of claim 1, wherein said computing process further comprises the steps of:

C13) generating matching images having matching display values based on appearance characteristics and the color characteristics of at least one of said preliminary matching formulas at at least one of said one or more color viewing angles, and generating at least one specimen image having specimen display values based on specimen appearance data and said specimen color data;

C14) displaying said matching images and said at least one specimen image on said display device;

C15) receiving a selection input from said input device to select one or more matching formulas; and

C16) displaying said one or more matching formulas on said display device.

5. The system of claim 4, wherein said appearance characteristics are stored in said color database and comprise the sparkle characteristics associated with each of said preliminary matching formulas, matching texture functions associated with each of said preliminary matching formulas, or a combination thereof, said matching texture functions being selected from measured matching texture function, predicted matching texture function, or a combination thereof.

6. The system of claim 4, wherein said specimen appearance data comprise the specimen sparkle data, a specimen texture function, or a combination thereof, said specimen texture function being selected from measured specimen texture function, derived specimen texture function based on the specimen sparkle data and specimen color data, or a combination thereof.

7. The system of claim 1 further comprising a mixing system functionally coupled to said computing device, and said computing process further comprises the step of outputting one of said one or more matching formulas to said mixing system to produce a matching coating composition based on said matching formula.

8. The system of claim 1 further comprising a coating application device for applying said matching coating composition over a damaged coating area of said target coating to form a repair coating.

9. The system of claim 1, wherein said display device is a video display device.

Description:
TITLE

SYSTEM FOR MATCHING COLOR AND APPEARANCE OF COATINGS CONTAINING EFFECT PIGMENTS

FIELD OF DISCLOSURE

[01] The present disclosure is directed to a method for matching color and appearance of a target coating of an article, particularly a target coating comprising one or more effect pigments. The present invention is also directed to a system for matching color and appearance of the target coating.

BACKGROUND OF DISCLOSURE

[02] Surface coatings containing effect pigments, such as light absorbing pigments, light scattering pigments, light interference pigments, and light reflecting pigments are well known. Metallic flake pigments, for example aluminum flakes, are examples of such effect pigments and are especially favored for the protection and decoration of automobile bodies. The effect pigments can produce visual appearance effects, such as differential light reflection effects, usually referred to as "flop"; flake appearance effects, which include flake size distribution and the sparkle imparted by the flake; and also the effects of enhancement of depth perception in coatings. The flop effect is dependent upon the angle from which the coating is illuminated and viewed. The flop effect can be a function of the orientation of the metallic flakes with respect to the outer surface of the coating and the surface smoothness of the flake. The sparkle can be a function of the flake size, surface smoothness, orientation, and uniformity of the edges. The flop and sparkle effects produced by flakes can further be affected by other pigments in the coating, such as light absorbing pigments, light scattering pigments, or flop control agents. Any light scatter from the pigments or the flakes themselves, e.g., from the flake edges, can diminish both the flop and the sparkle of the coating.

[03] For repairing a previously coated substrate, for example, of an automotive body, it is necessary to choose the correct pigments to match the color of the coated substrate as well as the correct effect pigments, such as flakes, to match the color and appearance of the coated substrate. Many coating formulas are made available by paint suppliers to match various vehicles and objects to be coated. Often there are multiple coating formulas available for the same vehicle make and model because of vehicle coating color and appearance variability due to slight variations in formulations, ingredients used, or coating application conditions, such as the coating application techniques or locations used by vehicle original equipment manufacturers. These color and appearance variations make it difficult to identify the best formula to attain excellent matches in vehicle shops. A number of methods have been developed to identify formulas of correct pigments to achieve color match. Some attempts were made to match both color and appearance of a target coating.

[04] There is still a need for a method for the selection, from multiple existing coating formulas, of one or more matching formulas that closely match both the color and appearance of the target coating.

STATEMENT OF DISCLOSURE

[05] This invention is directed to a method for matching color and appearance of a target coating of an article, said method comprising the steps of:

A1) obtaining specimen sparkle values of the target coating measured at one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof;

A2) obtaining specimen color data of the target coating measured at two or more color viewing angles, one or more illumination angles, or a combination thereof;

A3) generating specimen flop values based on said specimen color data;

A4) retrieving from a color database one or more preliminary matching formulas based on said specimen color data, an identifier of said article, or a combination thereof, said color database comprises formulas for coating compositions and interrelated sparkle characteristics, color characteristics, and one or more identifiers of articles;

A5) generating one or more sparkle differences (ΔSg) between sparkle characteristics of each of said preliminary matching formulas at each of said one or more sparkle viewing angles and said specimen sparkle values;

A6) generating one or more flop differences (ΔF) between flop characteristics derived from color characteristics of each of said preliminary matching formulas and said specimen flop values;

A7) generating one or more color difference indexes (CDI) between said specimen color data and color characteristics of each of said preliminary matching formulas; and

A8) selecting from said preliminary matching formulas one or more matching formulas based on said sparkle differences (ΔSg), said flop differences (ΔF), and said color difference indexes (CDI).

[06] This invention is also directed to a system for matching color and appearance of a target coating of an article, said system comprising:

a) a color measuring device;

b) a sparkle measuring device;

c) a color database comprising formulas for coating compositions and interrelated sparkle characteristics, color characteristics, and one or more identifiers of articles;

d) a computing device comprising an input device and a display device, said computing device is functionally coupled to said color measuring device, said sparkle measuring device, and said color database; and

e) a computer program product residing in a storage media functionally coupled to said computing device, said computer program product causes said computing device to perform a computing process comprising the steps of:

C1) receiving specimen sparkle values of the target coating from said sparkle measuring device, said specimen sparkle values are measured at one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof;

C2) receiving specimen color data of the target coating from said color measuring device, said specimen color data are measured at two or more color viewing angles, one or more illumination angles, or a combination thereof;

C3) receiving an identifier of said article from said input device;

C4) generating specimen flop values based on said specimen color data;

C5) retrieving from said color database one or more preliminary matching formulas based on said specimen color data, said identifier of said article, or a combination thereof;

C6) generating one or more sparkle differences (ΔSg) between sparkle characteristics of each of said preliminary matching formulas and said specimen sparkle values at each of said one or more sparkle viewing angles;

C7) generating one or more flop differences (ΔF) between flop characteristics derived from color characteristics of each of said preliminary matching formulas and said specimen flop values;

C8) generating one or more color difference indexes (CDI) between said specimen color data and color characteristics of each of said preliminary matching formulas; and

C9) producing a ranking list of said preliminary matching formulas based on said sparkle differences (ΔSg), said flop differences (ΔF), and said color difference indexes (CDI).

BRIEF DESCRIPTION OF DRAWING

[07] Figure 1 shows examples of various illumination angles and viewing angles.

[08] Figure 2 shows an example of a fixed viewing angle and 3 illumination angles for measuring sparkle values.

[09] Figure 3 shows an example of a fixed illumination angle and various viewing angles for measuring sparkle values.

[10] Figure 4 shows an example of a representative image display on a digital display.

[11] Figure 5 shows an example of a representative video display of the images.

DETAILED DESCRIPTION

[12] The features and advantages of the present invention will be more readily understood, by those of ordinary skill in the art, from reading the following detailed description. It is to be appreciated that certain features of the invention, which are, for clarity, described above and below in the context of separate embodiments, may also be provided in combination in a single embodiment.

Conversely, various features of the invention that are, for brevity, described in the context of a single embodiment, may also be provided separately or in any subcombination. In addition, references in the singular may also include the plural (for example, "a" and "an" may refer to one, or one or more) unless the context specifically states otherwise.

[13] The numerical values in the various ranges specified in this application, unless expressly indicated otherwise, are stated as approximations, as though the minimum and maximum values within the stated ranges were both preceded by the word "about." In this manner, slight variations above and below the stated ranges can be used to achieve substantially the same results as values within the ranges. Also, the disclosure of these ranges is intended as a continuous range including every value between the minimum and maximum values.

[14] As used herein:

[15] The term "dye" means a colorant or colorants that produce color or colors and is usually soluble in a coating composition.

[16] The term "pigment" or "pigments" used herein refers to a colorant or colorants that produce color or colors and is usually not soluble in a coating composition. A pigment can be from natural and synthetic sources and made of organic or inorganic constituents. A pigment can also include metallic particles or flakes with specific or mixed shapes and dimensions. [17] The term "effect pigment" or "effect pigments" refers to pigments that produce special effects in a coating. Examples of effect pigments can include, but not limited to, light absorbing pigment, light scattering pigments, light interference pigments, and light reflecting pigments. Metallic flakes, for example aluminum flakes, can be examples of such effect pigments.

[18] The term "gonioapparent flakes", "gonioapparent pigment" or

"gonioapparent pigments" refers to pigment or pigments pertaining to change in color, appearance, or a combination thereof with change in illumination angle or viewing angle. Metallic flakes, such as aluminum flakes are examples of gonioapparent pigments. Interference pigments or pearlescent pigments can be further examples of gonioapparent pigments.

[19] "Appearance" used herein refers to (1 ) the aspect of visual experience by which a coating is viewed or recognized; and (2) perception in which the spectral and geometric aspects of a coating is integrated with its illuminating and viewing environment. In general, appearance can include shape, texture, sparkle, glitter, gloss, transparency, opacity, other visual effects of a coating, or a combination thereof. Appearance can vary with varying viewing angles or varying illumination angles.

[20] The term "texture", "textures", or "texture of coating" refers to coating appearances that are resulted from the presence of flakes or other effect pigment or pigments in the coating composition. The flakes can include, such as, metallic flakes like aluminum flakes, coated aluminum flakes, interference pigments, like mica flakes coated with metal oxide pigments, such as, titanium dioxide coated mica flake or iron oxide coated mica flake, diffractive flakes, such as, vapor deposited coating of a dielectric over finely grooved aluminum flakes. The texture of a coating can be represented with a texture function generated statistically by measuring the pixel intensity distribution of an image of the coating captured by a digital imaging device. The texture function can be used to generate an image of the coating by duplicating those pixel intensity statistics in the image. For example, if a specimen texture function comprises the pixel intensity distribution of a captured image of a specimen coating in a Gaussian distribution function having mean intensity of μ and a standard deviation of σ, then the specimen image of the coating can be generated based on the Gaussian distribution function having the mean intensity of μ and the standard deviation of σ. The statistical fit can be dependant on specific coatings. The following devices can be used to generate useful data for the determination of the statistical texture function of a coating: flatbed scanning device, wand type scanner or an electronic camera. The texture function of a coating can also be generated based on color data and sparkle values of the coating.

[21] The term "sparkle", "sparkles", "sparkling" or "sparkle effect" refers to the visual contrast between the appearance of highlights on particles of

gonioapparent pigments and their immediate surroundings. Sparkle can be defined by, for example, ASTM E284-90 and other standards or methods.

[22] The term "flop" refers to a difference in appearance of a material viewed over two widely different aspecular angles. As used herein, the term "flop value", "flop values" or "flop index" refers to a numerical scale of flop obtained by instrumental or visual experiments, or derived from calculations based on color data. In one example, flop index can be defined by ASTM E284 or other standards or methods.

[23] The term "database" refers to a collection of related information that can be searched and retrieved. The database can be a searchable electronic numerical or textual document, a searchable PDF document, an Microsoft Excel® spreadsheet, an Microsoft Access® database (both supplied by Microsoft Corporation of Redmond, Washington), an Oracle® database (supplied by Oracle Corporation of Redwood Shores, California), or a Lynux database, each registered under their respective trademarks. The database can be a set of electronic documents, photographs, images, diagrams, or drawings, residing in one or more computer readable storage media that can be searched and retrieved. A database can be a single database or a set of related databases or a group of unrelated databases. "Related database" means that there is at least one common information element in the related databases that can be used to relate such databases. One example of the related databases can be Oracle® relational databases. In one example, color characteristics comprising color data values such as L,a,b color values, L * ,a * ,b * color values, XYZ color values, L,C,h color values, spectral reflectance values, light absorption (K) and scattering (S) values (also known as "K,S values"), or a combination thereof, can be stored in and retrieved from one or more databases. Other color values such as Hunter Lab color values, ANLAB color values, CIE LAB color values, CIE LUV color values, L * ,C * ,H * color values, any other color values known to or developed by those skilled in the art, or a combination thereof, can also be used. In another example, appearance characteristics, sparkle values and related measurements, coating formulations, vehicle data, or a combination thereof, can be stored and retrieved from one or more databases.

[24] The term "vehicle", "automotive", "automobile", "automotive vehicle", or "automobile vehicle" refers to an automobile such as car, van, mini van, bus, SUV (sports utility vehicle); truck; semi truck; tractor; motorcycle; trailer; ATV (all terrain vehicle); pickup truck; heavy duty mover, such as, bulldozer, mobile crane and earth mover; airplanes; boats; ships; and other modes of transport that are coated with coating compositions.

[25] A computing device used herein can refer to a data processing chip, a desktop computer, a laptop computer, a pocket PC, a personal digital assistant (PDA), a handheld electronic processing device, a smart phone that combines the functionality of a PDA and a mobile phone, or any other electronic devices that can process information automatically. A computing device can be built into other electronic devices, such as a built-in data processing chip integrated into an imaging device, color measuring device, or an appearance measuring device. A computing device can have one or more wired or wireless connections to a database, to another computing device, or a combination thereof. A computing device can be a client computer that communicates with a host computer in a multi-computer client-host system connected via a wired or wireless network including intranet and internet. A computing device can also be configured to be coupled with a data input or output device via wired or wireless connections. For example, a laptop computer can be operatively configured to receive color data and images through a wireless connection. A "portable computing device" includes a laptop computer, a pocket PC, a personal digital assistant (PDA), a handheld electronic processing device, a mobile phone, a smart phone that combines the functionality of a PDA and a mobile phone, a tablet computer, or any other electronic devices that can process information and data and can be carried by a person.

[26] Wired connections can include hardware couplings, splitters, connectors, cables or wires. Wireless connections and devices can include, but are not limited to, Wi-Fi devices, Bluetooth devices, wide area network (WAN) wireless devices, local area network (LAN) devices, infrared communication devices, optical data transfer devices, radio transmitters and optionally receivers, wireless phones, wireless phone adaptor cards, or any other devices that can transmit signals over a wide range of radio frequencies including visible or invisible optical wavelengths and electromagnetic wavelengths.

[27] An imaging device can refer to a device that can capture images under a wide range of radio frequencies including visible or invisible optical wavelengths and electromagnetic wavelengths. Examples of the imaging device can include, but are not limited to, a still film optical camera, an X-Ray camera, an infrared camera, and a video camera, collectively known as low dynamic range (LDR) or standard dynamic range (SDR) imaging devices, and a high dynamic range (HDR) or wide dynamic range (WDR) imaging device such as those using two or more sensors having varying sensitivities. The HDR and the WDR imaging devices can capture images at a greater dynamic range of luminance between the lightest and darkest areas of an image than typical SDR imaging devices. A digital imager or digital imaging device refers to an imaging device that captures images as digital signals. Examples of the digital imager can include, but are not limited to, a digital still camera, a digital video camera, a digital scanner, and a charge-coupled device (CCD) camera. An imaging device can capture images in black and white, gray scale, or various color levels. A digital imager is preferred in this invention. Images captured using a non-digital imaging device, such as a still photograph, can be converted into digital images using a digital scanner and can also be suitable for this invention.

[28] Color and sparkle of a coating can vary in relation to illumination angles or viewing angles. Examples for color measurements can include those described in ASTM E-2194. Briefly, when a coating (10) is illuminated by an illumination device (11), such as a light emitting or light directing device or sun light, at an illumination angle measured from the normal Z-Z' (13) as shown in FIG. 1, a number of viewing angles can be used, such as: 1) near aspecular angles that are the viewing angles in a range of from 15° to 25° from the specular reflection (12) of the illumination device (11); 2) mid aspecular angles that are the viewing angles around 45° from the specular reflection (12); and 3) far aspecular angles (also known as flop angles) that are the viewing angles in a range of from 75° to 110° from the specular reflection (12). In general, color appears to be brighter at near aspecular angles and darker at far aspecular angles. As used herein, the viewing angles are the angles measured from the specular reflection (12) and the illumination angles are the angles measured from the normal direction shown as Z-Z' (13) (FIG. 1 - FIG. 3) that is perpendicular to the surface of the coating or the tangent of the surface of the coating. The color and sparkle can be viewed by a viewer or one or more detectors (14) at the various viewing angles.

[29] Although specific viewing angles are specified above and can be preferred, viewing angles can include any viewing angles that are suitable for viewing the coating or detecting reflections of the coating. A viewing angle can be any angle, continuously or discretely, in a range of from 0° from the specular reflection (12) to the surface of the coating (10) on either side of the specular reflection (12), or in a range of from 0° from the specular reflection (12) to the tangent of the surface of the coating. In one example, when the specular reflection (12) is at 45° from the normal (Z-Z') (13), viewing angles can be any angles in the range of from 0° to -45° from the specular reflection, or from 0° to 135° from the specular reflection (FIG. 1). In another example, when the specular reflection (12) is at 75° from the normal (Z-Z'), viewing angles can be any angles in a range of from 0° to -15° from the specular reflection, or from 0° to 165° from the specular reflection. Depending on the specular reflection (12), the range of viewing angles can be changed and determined by those skilled in the art. In yet another example, a detector (16), such as a camera or a spectral sensor, can be fixed at the normal (Z-Z') facing towards the coating surface (10) (FIG. 2). One or more illumination sources (21) can be positioned to provide illumination at one or more illumination angles, such as at 15°, 45°, 75°, or a combination thereof, from the normal (Z-Z') (13).

[30] This disclosure is directed to a method for matching color and appearance of a target coating of an article. The method can comprise the steps of:

[31] A1) obtaining specimen sparkle values of the target coating measured at one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof;

[32] A2) obtaining specimen color data of the target coating measured at two or more color viewing angles, one or more illumination angles, or a combination thereof;

[33] A3) generating specimen flop values based on said specimen color data;

[34] A4) retrieving from a color database one or more preliminary matching formulas based on said specimen color data, an identifier of said article, or a combination thereof, said color database comprises formulas for coating compositions and interrelated sparkle characteristics, color characteristics, and one or more identifiers of articles;

[35] A5) generating one or more sparkle differences (ΔSg) between sparkle characteristics of each of said preliminary matching formulas at each of said one or more sparkle viewing angles and said specimen sparkle values;

[36] A6) generating one or more flop differences (ΔF) between flop characteristics derived from color characteristics of each of said preliminary matching formulas and said specimen flop values;

[37] A7) generating one or more color difference indexes (CDI) between said specimen color data and color characteristics of each of said preliminary matching formulas; and

[38] A8) selecting from said preliminary matching formulas one or more matching formulas based on said sparkle differences (ΔSg), said flop differences (ΔF), and said color difference indexes (CDI).

[39] The target coating can comprise one or more effect pigments. Any of the aforementioned effect pigments can be suitable.

[40] The specimen sparkle values can be obtained from a separate data source, such as provided by a manufacturer of the article, provided by a measurement center, measured using a sparkle measuring device, or a combination thereof.

[41] Sparkle values can be a function of sparkle intensity and sparkle area, such as a sparkle function defined below:

Sg = f(Si, Sa)

wherein Sg, Si and Sa are sparkle value, sparkle intensity, and sparkle area, respectively. To measure the sparkle value at a predetermined illumination angle, a predetermined viewing angle, or a combination thereof, the sparkle intensity and sparkle area of the coating are measured at the chosen angle or combination of angles and then calculated based on a chosen algorithm. In one example, the sparkle intensity and sparkle area can be measured from one or more images of the coating captured with an imaging device, such as a digital camera, at a chosen angle or a combination of angles. One or more algorithms can be employed to define the function to calculate the Sg from Si and Sa. In one example, sparkle values can be obtained from commercial instruments, such as the BYK-mac available from BYK-Gardner USA, Columbia, Maryland, USA. In yet another example, images captured by the imaging device can be entered into a computing device to generate sparkle values.
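As a purely illustrative sketch of the relationship Sg = f(Si, Sa) described above: the disclosure leaves the function f unspecified, so the weighted-product form and the weight values below are assumptions, not the disclosed algorithm.

```python
def sparkle_value(sparkle_intensity, sparkle_area, w_i=1.0, w_a=1.0):
    """Illustrative sparkle function Sg = f(Si, Sa).

    The weighted product used here is only a placeholder; the exponents w_i
    and w_a are hypothetical and would be determined empirically or by
    modeling, as the text notes."""
    return (sparkle_intensity ** w_i) * (sparkle_area ** w_a)

# Hypothetical specimen measurements at 15 deg and 45 deg illumination angles.
sg_15 = sparkle_value(sparkle_intensity=7.2, sparkle_area=4.1)
sg_45 = sparkle_value(sparkle_intensity=3.5, sparkle_area=2.8)
```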

[42] The specimen sparkle values can be measured at one or more illumination angles, one or more viewing angles, or a combination thereof. In one example, the specimen sparkle values can be measured with one detector (16) at a fixed viewing angle with two or more illumination angles, such as 15°, 45°, 75°, or a combination thereof, as shown in FIG. 2. In another example, the specimen sparkle values can be measured at two illumination angles, such as 15° and 45°. In yet another example, the specimen sparkle values can be measured at one or more viewing angles with a fixed illumination angle, such as illustrated in FIG. 3. One or more detectors (16), such as digital cameras, can be placed at one or more of the viewing angles, such as at -15°, 15°, 25°, 45°, 75°, 110°, or a combination thereof. In yet another example, a plurality of detectors can be placed at the viewing angles to measure sparkle values simultaneously. In a further example, one detector can measure sparkle values at the one or more viewing angles sequentially.

[43] The sparkle differences (ΔSg) can be defined as:

ΔSg = f(Sg-Match, Sg-Spec)

wherein Sg-Match and Sg-Spec are sparkle characteristics of matching formulas and specimen sparkle values, respectively.

[44] Since Sg is a function of Si and Sa, the sparkle differences (ΔSg) can also be defined as:

ΔSg = f(ΔSi, ΔSa)

or

ΔSg = f(Si-Match, Si-Spec, Sa-Match, Sa-Spec)

wherein ΔSi and ΔSa are differences in sparkle intensities and sparkle areas between the matching formula and the specimen, respectively; and Si-Match, Si-Spec, Sa-Match and Sa-Spec are sparkle intensities and sparkle areas of the matching formula and the specimen, respectively. Any functions suitable for calculating differences can be suitable. A number of constants, factors, or other mathematical relations can be determined empirically or through modeling.
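The following sketch computes per-angle sparkle differences ΔSg as a simple absolute difference between a formula's sparkle characteristic and the specimen value. The absolute-difference form and the dictionary layout are assumptions for illustration; any suitable difference function can be substituted.

```python
def sparkle_difference(sg_match, sg_spec):
    """dSg between a formula's sparkle characteristic and the specimen value.
    A plain absolute difference is used here purely for illustration."""
    return abs(sg_match - sg_spec)

def sparkle_differences_per_angle(match_by_angle, spec_by_angle):
    """Compute dSg at each sparkle illumination (or viewing) angle.
    Both arguments map an angle in degrees to a sparkle value."""
    return {angle: sparkle_difference(match_by_angle[angle], spec_by_angle[angle])
            for angle in spec_by_angle}

# Hypothetical sparkle values at 15 deg and 45 deg illumination angles.
delta_sg = sparkle_differences_per_angle({15: 6.9, 45: 3.1}, {15: 7.2, 45: 3.5})
```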

[45] The color data, either the specimen color data or the color characteristics of formulas for coating compositions in the color database, can comprise color data values such as L,a,b color values, L*,a*,b* color values, XYZ color values, L,C,h color values, spectral reflectance values, light absorption (K) and scattering (S) values (also known as "K,S values"), or a combination thereof, which can be stored in and retrieved from one or more databases. Other color values such as Hunter Lab color values, ANLAB color values, CIE LAB color values (also known as L*,a*,b* color values), CIE LUV color values, L*,C*,H* color values, any other color values known to or developed by those skilled in the art, or a combination thereof, can also be used. The specimen color data can be measured at two or more of the aforementioned viewing angles, such as at -15°, 15°, 25°, 45°, 75°, 110°, or a combination thereof. The specimen color data can be measured at 5 of the aforementioned viewing angles in one example, at 4 of the aforementioned viewing angles in another example, or at 3 of the aforementioned viewing angles in yet another example. In a further example, the specimen color data can be measured at 15°, 45°, and 110° viewing angles, at 15°, 45°, and 75° viewing angles, or at -15°, 25°, and 75° viewing angles. The specimen color data can also be measured at two or more of the aforementioned viewing angles in combination with one or more of the aforementioned illumination angles.

[46] Flop values of a coating can represent lightness changing at different viewing angles. The specimen flop values can be generated based on the specimen color data measured at the aforementioned viewing angles. The specimen color data can comprise L,a,b or L*,a*,b* color data as specified in the CIELAB color space system, in which L or L* is for lightness. In this disclosure, L values or L* values at certain viewing angles can be used for generating the flop values, either the specimen flop values or the flop characteristics of matching (or preliminary matching) formulas. The specimen flop values can be generated based on the L values or L* values of the specimen color data. Color data at at least two viewing angles can be needed for generating the flop values. In one example, the flop values can be generated based on the lightness values, such as the specimen L* values at 2, 3, 4, or 5 of the above mentioned viewing angles or a combination thereof. In another example, the flop values can be generated based on the viewing angles selected from any 2 of the above mentioned viewing angles. In yet another example, the specimen flop values can be generated based on the specimen color data measured at three of any of the aforementioned color viewing angles. In a further example, the specimen flop values can be generated based on the specimen color data measured at three color viewing angles selected from 15°, 45°, and 110° viewing angles.

[47] The flop values can be defined with the following equation:

Flop Value = f1(ΔL*) / f2(L*m)

wherein ΔL* is the lightness difference between two widely different viewing angles. The f1 and f2 are functions of the quantity that can include one or more weighting factors, exponent functions, or a combination thereof, and can be determined empirically, via mathematical fitting, modeling, or a combination thereof. The L*m is the lightness at an intermediate angle m that is a viewing angle between the two widely different viewing angles. The L*m can be used as a normalizing value. Typically, the lightness at the 45° viewing angle can be used if the 45° viewing angle is between the two widely different viewing angles.

[48] In one example, the flop values can be generated based on viewing angles selected from 15°, 45°, and 110° according to the following equation:

Flop Value = 2.69 × (L*15° − L*110°)^1.11 / (L*45°)^0.86

wherein the 15° and 110° are the two widely different viewing angles and the 45° viewing angle is the intermediate angle. Color data at other viewing angles can also be suitable for generating flop values. In yet another example, the flop characteristics derived from color characteristics of each of the preliminary matching formulas can be generated according to the equation above based on the lightness values at the viewing angles. Lightness values or lightness characteristics at other viewing angles, or a combination thereof, can also be suitable for generating the flop values or flop characteristics. As understood by those skilled in the art, the specimen flop values and the flop characteristics should have compatible data, such as from compatible or same angles.

[49] Flop values of a coating can also comprise lightness change, chroma change, hue change, or a combination thereof, at different viewing angles. The specimen flop values can be generated based on the specimen color data comprising lightness, hue or chroma measured at the aforementioned viewing angles, or a combination thereof. The flop characteristics of coating formulas can be generated based on the color characteristics comprising lightness, hue or chroma measured at the different viewing angles, or a combination thereof. In one example, the flop values can comprise hue flop values based on hue changes, such as ΔH*ab. In another example, the flop values can comprise chroma changes, such as ΔC*ab. In yet another example, the flop values can comprise lightness change, such as ΔL*, chroma change, such as ΔC*ab, hue change, such as ΔH*ab, or a combination thereof. The ΔL*, ΔC*ab, and ΔH*ab are described in detail hereafter.

[50] In considering lightness, chroma and hue, the flop values can be defined with the following equation:

Flop Value = f3(ΔL*, ΔC*, ΔH*) / f4((L*, a*, b*)m)

wherein ΔL*, ΔC*, ΔH* are the lightness difference, chroma difference and hue difference at two widely different viewing angles, respectively. The f3 and f4 are functions of the quantity that can include one or more weighting factors, exponent functions, or a combination thereof, and can be determined empirically, via mathematical fitting, modeling, or a combination thereof. The (L*, a*, b*)m are L*, a*, b* color data at an intermediate angle m that is a viewing angle between the two widely different viewing angles. Typically, color data at the 45° viewing angle can be used if the 45° viewing angle is between the two widely different viewing angles. In one example, the flop values can be generated based on ΔL*, ΔC*, ΔH* at viewing angles selected from 15° and 110°, and color data at the 45° viewing angle, (L*, a*, b*)45°.

[51] The flop difference (ΔF) can be generated based on a function that calculates the difference between the specimen flop value (FSpec) and the flop characteristic derived from color characteristics of one of said preliminary matching formulas (or matching formulas) (FMatch). The flop difference can be defined by the following function:

ΔF = f(FSpec, FMatch)

[52] In one example, the flop differences (ΔF) can be calculated according to the equation:

ΔF = (FMatch − FSpec) / FSpec

[53] Other equations or mathematical formulas, such as those comprising simple difference, normalized difference, square, square roots, weighted difference, or a combination thereof, can also be used to calculate the flop differences.
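A minimal sketch of paragraphs [48] and [52] follows: it computes the flop value from L* at the 15°, 45°, and 110° viewing angles and the normalized flop difference ΔF. The lightness numbers in the example are hypothetical.

```python
def flop_value(l15, l45, l110):
    """Flop value from lightness (L*) at the 15, 45 and 110 degree viewing
    angles: Flop Value = 2.69 * (L*15 - L*110)**1.11 / (L*45)**0.86."""
    return 2.69 * (l15 - l110) ** 1.11 / l45 ** 0.86

def flop_difference(f_match, f_spec):
    """Normalized flop difference: dF = (F_Match - F_Spec) / F_Spec."""
    return (f_match - f_spec) / f_spec

# Hypothetical specimen and matching-formula lightness values.
f_spec = flop_value(l15=110.0, l45=60.0, l110=25.0)
f_match = flop_value(l15=105.0, l45=58.0, l110=27.0)
delta_f = flop_difference(f_match, f_spec)
```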

[54] The color database can contain formulas interrelated with appearance characteristics and color characteristics that are compatible with the specimen color data and specimen appearance data. The specimen appearance data can comprise the specimen sparkle values. For example, when the specimen color data are measured at two or more viewing angles, the color characteristics associated with formulas in the color database should contain values at at least the corresponding two or more viewing angles. Each formula in the color database can be associated with appearance characteristics at one or more viewing angles, one or more illumination angles, or a combination thereof, and with color characteristics at one or more viewing angles, one or more illumination angles, or a combination thereof. The appearance characteristics can comprise sparkle characteristics, gloss, texture, or a combination thereof. The appearance characteristics, such as the sparkle characteristics, can be obtained from measurements of test panels coated with the formulas, predicted from prediction models based on the formulas, or a combination thereof. Suitable prediction models can include the neural network described hereafter for predicting sparkle characteristics. The formulas can further be associated with one or more identifiers of the article. The term "interrelated" means that the formulas, the sparkle characteristics, the color characteristics, the identifiers of articles, and other contents of the database are associated with each other, or have mutual or reciprocal relations to each other. In one example, each formula in the database can be associated with color characteristics, flop characteristics, sparkle characteristics, texture characteristics, identifiers of articles, VINs, parts of the VINs, color codes, formula codes, other data that can be used to identify or retrieve the color formulas, or a combination thereof.

[55] The preliminary matching formulas can be retrieved from the color database based on the specimen color data in one example, based on an identifier of the article in another example, and based on a combination of the color data and the identifier in yet another example. The preliminary matching formulas can also be retrieved from the color database based on sparkle values, texture, or a combination thereof. The preliminary matching formulas can also be retrieved from the color database based on color data, flop values, sparkle values, texture data, identifiers of articles, VINs, parts of the VINs, color codes, formula codes if known, or a combination thereof.

[56] The article can be a vehicle or any other product or item that has a layer of coating thereon. The identifier of the article can comprise an article identification number or code, a vehicle identification number (VIN) of the vehicle, part of the VIN, a color code of the vehicle, a production year of the vehicle, or a combination thereof. Depending on geopolitical regions, the VIN can typically contain data on a vehicle's type, model year, production year, production site and other related vehicle information. The formulas in the color database can also be associated with the VINs, parts of the VINs, color codes of vehicles, production years of vehicles, or a combination thereof.
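As a sketch of the retrieval step in paragraph [55], the snippet below filters an in-memory list of formula records by color code and by a coarse color pre-screen. The record fields, the sum-of-absolute-differences screen, and the threshold are all illustrative assumptions; the disclosure does not prescribe a specific retrieval criterion or storage format.

```python
from dataclasses import dataclass, field

@dataclass
class FormulaRecord:
    """One color-database entry (field names are illustrative only)."""
    formula_id: str
    color_code: str                        # identifier interrelated with the formula
    lab_by_angle: dict                     # {angle: (L*, a*, b*)} color characteristics
    sparkle_by_angle: dict = field(default_factory=dict)   # {angle: Sg}

def retrieve_preliminary_formulas(database, color_code=None,
                                  spec_lab_by_angle=None, max_abs_diff=30.0):
    """Return formulas matching the identifier and/or a coarse color screen."""
    hits = []
    for rec in database:
        if color_code is not None and rec.color_code != color_code:
            continue
        if spec_lab_by_angle is not None:
            # Sum of absolute L*, a*, b* differences over the measured angles
            # (a placeholder pre-screen, not the disclosed CDI).
            total = sum(abs(m - s)
                        for angle, spec in spec_lab_by_angle.items()
                        for m, s in zip(rec.lab_by_angle.get(angle, spec), spec))
            if total > max_abs_diff:
                continue
        hits.append(rec)
    return hits
```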

[57] The color difference indexes (CDI) can be generated based on total color differences, such as the ones selected from ΔE, ΔE*ab, ΔE*94, or one or more other variations described herein, between the specimen color data and the color characteristics of each of the preliminary matching formulas in consideration of one or more illumination angles, one or more viewing angles, or a combination thereof.

[58] Color difference can be produced at a selected viewing angle, a selected illumination angle, or a pair of a selected illumination angle and a viewing angle, and can be defined by the differences in lightness (ΔL*), redness-greenness (Δa*), and yellowness-blueness (Δb*):

ΔL* = L*Match − L*Spec

Δa* = a*Match − a*Spec

Δb* = b*Match − b*Spec

wherein L*Spec and L*Match are lightness of the specimen color data and that of one of the matching formulas, respectively; a*Spec and a*Match are redness-greenness of the specimen color data and that of the matching formula, respectively; and b*Spec and b*Match are yellowness-blueness of the specimen color data and that of the matching formula, respectively, at the selected angle or the pair of angles.

[59] The total color difference between the specimen and one of the matching formulas (or preliminary matching formulas) can be defined as ΔE*ab in CIELAB:

ΔE*ab = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2)

[60] The color differences can also be defined by differences in lightness (ΔL*), chroma (ΔC*ab), and hue (ΔH*ab):

ΔL* = L*Match − L*Spec

ΔC*ab = C*ab-Match − C*ab-Spec = (a*Match² + b*Match²)^(1/2) − (a*Spec² + b*Spec²)^(1/2)

ΔH*ab = [(ΔE*ab)² − (ΔL*)² − (ΔC*ab)²]^(1/2)

[61] Based on the lightness, chroma and hue, the total color difference ΔE*ab can also be calculated as:

ΔE*ab = [(ΔL*)² + (ΔC*ab)² + (ΔH*ab)²]^(1/2)
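A short sketch of the CIELAB differences in paragraphs [58] through [61] follows, computing ΔE*ab and the (ΔL*, ΔC*ab, ΔH*ab) decomposition for one pair of (L*, a*, b*) triples; the numerical values in the example are hypothetical.

```python
import math

def delta_e_ab(lab_match, lab_spec):
    """Total CIELAB color difference between two (L*, a*, b*) triples."""
    dl, da, db = (m - s for m, s in zip(lab_match, lab_spec))
    return math.sqrt(dl ** 2 + da ** 2 + db ** 2)

def delta_l_c_h(lab_match, lab_spec):
    """Lightness, chroma and hue differences (dL*, dC*ab, dH*ab)."""
    dl = lab_match[0] - lab_spec[0]
    dc = math.hypot(lab_match[1], lab_match[2]) - math.hypot(lab_spec[1], lab_spec[2])
    de2 = delta_e_ab(lab_match, lab_spec) ** 2
    dh2 = max(de2 - dl ** 2 - dc ** 2, 0.0)   # guard against rounding noise
    return dl, dc, math.sqrt(dh2)

# Specimen versus matching-formula color data at one viewing angle (hypothetical).
dl, dc, dh = delta_l_c_h((62.1, 1.8, -3.0), (60.0, 2.0, -2.5))
```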

[62] One or more constants or other factors can be introduced to further calculate the total color difference. One example can be the CIE 1994 (ΔL* ΔC*ab ΔH*ab) color-difference equation, with the abbreviation CIE94 and the symbol ΔE*94:

ΔE*94 = [(ΔL*/(kL·SL))² + (ΔC*ab/(kC·SC))² + (ΔH*ab/(kH·SH))²]^(1/2)

wherein SL, SC, SH, kL, kC, and kH are constants or factors determined according to CIE94.

[63] The color difference indexes (CDI) can be generated based on a function of the ΔE*ab or the ΔE*94 at one or more selected angles (angle 1, angle 2, ... through angle n):

CDI = f(ΔE*ab-angle 1, ΔE*ab-angle 2, ..., ΔE*ab-angle n)

or

CDI = f(ΔE*94-angle 1, ΔE*94-angle 2, ..., ΔE*94-angle n)

wherein the angles can be selected from any of the above mentioned illumination angles, viewing angles, or a combination thereof as determined necessary. The function can comprise a simple summation, weighted summation, means, weighted means, medians, squares, square roots, logarithms, deviations, standard deviations, other mathematical functions, or a combination thereof.
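The following sketch computes ΔE*94 and then a CDI as a weighted summation over viewing angles, one of the options named in paragraphs [62] through [65]. The graphic-arts CIE94 weighting (SL = 1, SC = 1 + 0.045·C*, SH = 1 + 0.015·C*, kL = kC = kH = 1) with the specimen as reference, the unit per-angle weights, and the example data are assumptions for illustration.

```python
import math

def delta_e_94(lab_match, lab_spec, k_l=1.0, k_c=1.0, k_h=1.0):
    """CIE94 color difference (graphic-arts weighting, specimen as reference)."""
    dl = lab_match[0] - lab_spec[0]
    da = lab_match[1] - lab_spec[1]
    db = lab_match[2] - lab_spec[2]
    c_spec = math.hypot(lab_spec[1], lab_spec[2])
    dc = math.hypot(lab_match[1], lab_match[2]) - c_spec
    dh2 = max(da ** 2 + db ** 2 - dc ** 2, 0.0)
    s_l, s_c, s_h = 1.0, 1.0 + 0.045 * c_spec, 1.0 + 0.015 * c_spec
    return math.sqrt((dl / (k_l * s_l)) ** 2 +
                     (dc / (k_c * s_c)) ** 2 +
                     dh2 / (k_h * s_h) ** 2)

def color_difference_index(spec_lab_by_angle, match_lab_by_angle, weights=None):
    """CDI as a weighted summation of dE*94 over the selected viewing angles."""
    weights = weights or {angle: 1.0 for angle in spec_lab_by_angle}
    return sum(weights[angle] *
               delta_e_94(match_lab_by_angle[angle], spec_lab_by_angle[angle])
               for angle in spec_lab_by_angle)

# Hypothetical color data at the 15, 45 and 110 degree viewing angles.
spec = {15: (110.0, 2.0, -3.0), 45: (60.0, 1.5, -2.0), 110: (25.0, 1.0, -1.0)}
match = {15: (108.0, 2.2, -2.7), 45: (58.5, 1.4, -2.2), 110: (26.0, 0.9, -1.2)}
cdi = color_difference_index(spec, match)
```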

[64] The color difference indexes (CDI) can also be generated based on other color difference definitions or equations, such as the color differences (ΔE) based on BFD, CMC, CIE 1976, CIE 2000 (also referred to as CIEDE 2000), or any other color difference definitions or equations known to or developed by those skilled in the art.

[65] In one example, the CDI can be a weighted summation of ΔE*94 for the color differences between the specimen color data and the color characteristics of one matching formula (or a preliminary matching formula) at a plurality of viewing angles, such as any 3 to 6 viewing angles selected from -15°, 15°, 25°, 45°, 75° or 110°, or a combination thereof. In another example, the CDI can be a weighted summation of ΔE*ab for the color differences between the specimen color data and the color characteristics of one matching formula (or a preliminary matching formula) at a plurality of viewing angles, such as any 3 to 6 viewing angles selected from -15°, 15°, 25°, 45°, 75° or 110°, or a combination thereof. In yet another example, the CDI can be a weighted summation of ΔE*94 for the color differences between the specimen color data and the color characteristics of one matching formula (or a preliminary matching formula) at 3 viewing angles, such as any 3 viewing angles selected from -15°, 15°, 25°, 45°, 75° or 110°. In yet another example, the CDI can be a weighted summation of ΔE*94 for the color differences between the specimen color data and the color characteristics of one matching formula (or a preliminary matching formula) at 3 viewing angles selected from 15°, 45°, and 110°.

[66] The preliminary matching formulas can be ranked based on one or more of the ΔSg, the ΔF, and the CDI. The one or more preliminary matching formulas having the smallest values, or predetermined values, of the ΔSg, the ΔF, or the CDI can be selected as the matching formula (or formulas, if more than one formula fits the predetermined values). A preference or weight can also be given to one or more of the differences. In one example, the flop difference can be used first or given more weight in ranking or selecting the formulas. In another example, the sparkle difference can be used first or given more weight in ranking or selecting the formulas. In yet another example, the CDI can be used first or given more weight in ranking or selecting formulas. In yet another example, a combination of any two of the differences can be used first or given more weight in ranking or selecting formulas.

[67] The one or more matching formulas can be selected by a selection process comprising the steps of:

[68] B1) grouping said one or more preliminary matching formulas into one or more category groups based on said sparkle differences (ΔSg) and said flop differences (ΔF) according to predetermined ranges of ΔSg values and ΔF values;

[69] B2) ranking the preliminary matching formulas in each of the category groups based on said color difference indexes (CDI);

[70] B3) selecting said one or more matching formulas having the minimum values in CDI.

[71] In one example, the preliminary matching formulas can be grouped into category groups based on the ΔF, the ΔSg at 15° sparkle illumination angles (ΔSg15), and the ΔSg at 45° sparkle illumination angles (ΔSg45). Within each of the groups, the formulas can be ranked based on the color difference indexes (CDI). In another example, the preliminary matching formulas can be grouped into category groups based on the ΔF and CDI. Within each of the groups, the formulas can be ranked again based on the ΔSg at 15° sparkle illumination angles (ΔSg15) and the ΔSg at 45° sparkle illumination angles (ΔSg45). In yet another example, the preliminary matching formulas can be grouped into category groups based on the CDI, the ΔSg at 15° sparkle illumination angles (ΔSg15), and the ΔSg at 45° sparkle illumination angles (ΔSg45). Within each of the groups, the formulas can be ranked again based on the flop difference values (ΔF).

[72] The preliminary formulas having the minimum difference values with the specimen values can be selected as the matching formulas, and can be selected automatically by a computer or manually by an operator.
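A minimal sketch of the grouping-then-ranking process of steps B1) and B2) above: candidates are binned into category groups by predetermined ΔSg and ΔF ranges and then ordered by CDI within each group. The range cut-offs and the candidate dictionary layout are hypothetical.

```python
from collections import defaultdict

def categorize(value, bounds):
    """Index of the first predetermined range whose upper bound contains |value|."""
    v = abs(value)
    for i, upper in enumerate(bounds):
        if v <= upper:
            return i
    return len(bounds)

def rank_preliminary_formulas(candidates, sg_bounds=(0.5, 1.5), f_bounds=(0.05, 0.15)):
    """B1) group candidates by (dSg, dF) category, then B2) rank each group by CDI."""
    groups = defaultdict(list)
    for cand in candidates:
        key = (categorize(cand["delta_sg"], sg_bounds),
               categorize(cand["delta_f"], f_bounds))
        groups[key].append(cand)
    ranking = []
    for key in sorted(groups):                 # closer categories come first
        ranking.extend(sorted(groups[key], key=lambda c: c["cdi"]))
    return ranking

# Three hypothetical preliminary matching formulas.
ranked = rank_preliminary_formulas([
    {"formula_id": "F1", "delta_sg": 0.4, "delta_f": 0.02, "cdi": 2.1},
    {"formula_id": "F2", "delta_sg": 0.3, "delta_f": 0.03, "cdi": 1.7},
    {"formula_id": "F3", "delta_sg": 1.8, "delta_f": 0.20, "cdi": 0.9},
])
```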

[73] The selection process can further comprise the steps of:

[74] B4) modifying one or more of said preliminary matching formulas to produce one or more subsequent preliminary matching formulas each having a subsequent color difference index (sub-CDI) if said color difference indexes (CDI) are greater than a predetermined CDI value; and

[75] B5) repeating the steps B1) to B5) until said sub-CDI is equal to or less than said predetermined CDI value to produce said matching formula.

[76] The formulas can be modified according to a linear vector or function, or a non-linear vector or function, or a combination thereof. Examples of those vectors or functions can include the ones disclosed in US Patent No. 3,690,771 and WO2008/150378A1.

[77] The selection process can further comprise the steps of:

[78] B6) producing predicted sparkle characteristics of one or more of the subsequent preliminary matching formulas based on said subsequent preliminary matching formulas and color characteristics associated with said subsequent preliminary matching formulas;

[79] B7) modifying said subsequent preliminary matching formulas; and

[80] B8) repeating the steps of B1) to B8) until said predicted sparkle characteristics are equal to or less than a predetermined sparkle value and said sub-CDI is equal to or less than said predetermined CDI value.

[81] The predicted sparkle characteristics can be produced by using an artificial neural network that is capable of producing a predicted sparkle value based on a coating formula and color characteristics associated with that coating formula. Briefly, the artificial neural network can be a data modeling system that can be trained to predict sparkle values of a coating. The artificial neural network can be trained based on measured color characteristics, measured sparkle values and individual training coating formula associated with each of a plurality of training coatings. In one example, the predicted sparkle characteristics can be produced by using the artificial neural network disclosed in US Patent Application No. 61498748 and No. 61498756, herein incorporated by reference.
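As an illustration of paragraph [81], the sketch below trains a small feed-forward regressor to predict a sparkle value from a feature vector that concatenates a coating formula with its color characteristics. The use of scikit-learn's MLPRegressor, the feature layout, and the synthetic training data are assumptions; the referenced applications describe the actual artificial neural network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor   # any small regressor could stand in

# Synthetic training set: each row concatenates a formula (5 hypothetical
# pigment weight fractions) with color characteristics (3 values); the target
# is a measured sparkle value.  All numbers are made up for illustration.
rng = np.random.default_rng(0)
X_train = rng.random((60, 8))
y_train = 5.0 * X_train[:, 0] + 2.0 * X_train[:, 5] + rng.normal(0.0, 0.1, 60)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

# Predicted sparkle characteristic for a (hypothetical) subsequent
# preliminary matching formula described by the same feature layout.
predicted_sparkle = model.predict(rng.random((1, 8)))[0]
```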

[82] Some of the steps or a combination of the steps of the method can be programmed to be performed by a computer. In one example, the specimen sparkle values and the specimen color data can be obtained from the respective measuring devices and manually entered into a computer or automatically transferred from the measuring devices to the computer. In another example, the preliminary matching formulas can be retrieved automatically by a computer once the required data have been received by the computer. In yet another example, the sparkle differences, the flop differences, the color difference indexes, or a combination thereof, can be generated by a computer.

[83] The method can further comprise the steps of:

[84] A9) generating matching images having matching display values based on appearance characteristics and the color characteristics of each of said preliminary matching formulas at each of said one or more color viewing angles, one or more illumination angles, or a combination thereof, and optionally generating specimen images having specimen display values based on specimen appearance data and said specimen color data;

[85] A10) displaying said matching images and optionally said specimen images on a display device; and

[86] A11) selecting a best matching formula from said one or more matching formulas by visually comparing said matching images to said article, and optionally visually comparing said matching images to said specimen images.

[87] In one example, only the matching images are generated and displayed. In another example, both the matching images and the specimen images are generated and displayed. In yet another example, one specimen image (41) and one matching image (42) can be displayed side-by-side as curved realistic images having a background color (43) on a digital display device (44) (FIG. 4), such as a laptop screen. The matching images can be visually compared to the article, and optionally to the specimen images, by an operator.

[88] The method can further comprise the steps of generating animated matching images and displaying the animated matching images on the display device. The animated matching images can comprise animated matching display values based on the appearance characteristics and the color characteristics, and animated appearance characteristics and animated color characteristics interpolated based on the appearance characteristics and the color characteristics. The animated matching display values can comprise R,G,B values based on the appearance characteristics and the color characteristics of the matching formula, and animated appearance characteristics and animated color characteristics interpolated based on the appearance characteristics and the color characteristics. The animated matching images can be displayed at a plurality of matching display angles that can include the one or more color and sparkle viewing angles, one or more color and sparkle illumination angles, or a combination thereof, associated with the matching formulas. The matching display angles can also include viewing angles, illumination angles, or a combination thereof, interpolated based on the one or more color or sparkle viewing angles, one or more color or sparkle illumination angles, or a combination thereof, associated with the matching formulas. The animated matching images can be displayed as a video, a movie, or other forms of animated display.

[89] The method can further comprise the steps of generating animated specimen images and displaying the animated specimen images on the display device. The animated specimen images can comprise animated specimen display values based on the specimen appearance data and the color data, and on animated appearance data and animated color data interpolated from the specimen appearance data and the color data. The animated specimen display values can comprise R,G,B values based on the specimen appearance data and the color data, and on animated appearance data and animated color data interpolated from the specimen appearance data and the color data. The animated specimen images can be displayed at a plurality of specimen display angles that can include the one or more viewing angles, one or more illumination angles, or a combination thereof, associated with the specimen color data and appearance data. The specimen display angles can also include viewing angles, illumination angles, or a combination thereof, interpolated from the one or more viewing angles, one or more illumination angles, or a combination thereof, associated with the specimen color data and appearance data. The animated specimen images can be displayed as a video, a movie, or other forms of animated display.

[90] The animated images, either the animated matching images or the animated specimen images, can be combined with a coated article or a part of the coated article (51), and can be displayed on a display device (51) (FIG. 5), such as a laptop screen, over a background or environment (56). The animated images can represent movements of the article, such as rotating or moving in space along any of the dimensions s-s' (53), v-v' (54) and h-h' (55), to display color and appearance at different viewing angles, illumination angles, or a combination thereof. The animated images can comprise a series of images (also referred to as frames) and can be displayed continuously or frame-by-frame. The animated images can also be modified or controlled by an operator, such as by dragging or clicking on the images to change the direction or speed of rotation. The animated images can also comprise data on the shape and size of the article, such as a vehicle, and the environment of the article.
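
As a non-limiting illustration of how display values can be interpolated between measured angles to produce the animated frames described above, the following minimal Python sketch interpolates R,G,B values linearly in viewing angle; the angles, R,G,B values, and linear interpolation scheme are assumptions of the sketch.

    # Minimal sketch: interpolate display values between measured viewing angles
    # to build animation frames (angles and R,G,B values are hypothetical).
    import numpy as np

    measured_angles = np.array([15.0, 45.0, 110.0])               # degrees (assumed)
    measured_rgb = np.array([[200, 40, 40],                        # near-specular
                             [150, 30, 30],                        # mid flop
                             [ 90, 20, 20]], dtype=float)          # far flop

    def frame_rgb(angle: float) -> np.ndarray:
        """Interpolate an R,G,B triple for an arbitrary display angle."""
        return np.array([np.interp(angle, measured_angles, measured_rgb[:, c])
                         for c in range(3)])

    # 60 animation frames sweeping the display angle from 15° to 110°.
    frames = [frame_rgb(a) for a in np.linspace(15.0, 110.0, 60)]
    print(frames[0], frames[-1])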

[91] The appearance characteristics can comprise the sparkle characteristics associated with each of said preliminary matching formulas, matching texture functions associated with each of said preliminary matching formulas, or a combination thereof, wherein the matching texture functions can be selected from measured matching texture function, predicted matching texture function, or a combination thereof. The appearance characteristics can further comprise shape or contour characteristics, environmental characteristics, one or more images such as images of a vehicle, or a combination thereof, associated with the matching formulas. In one example, the appearance characteristics can comprise the sparkle characteristics associated with each of said preliminary matching formulas. In another example, the appearance characteristics can comprise matching texture functions associated with each of said preliminary matching formulas. In yet another example, the appearance characteristics can comprise a combination of both the sparkle characteristics and the matching texture functions. The measured matching texture function associated with a formula can be generated statistically, as described above, by measuring the pixel intensity distribution of an image of the coating of one or more test panels each coated with a coating composition determined by the formula. The predicted matching texture function can be generated using a prediction model based on the formula, color data and sparkle data associated with the formula, or a combination thereof. The prediction model can be trained with a plurality of coating formulas, measured data of textures, measured data of sparkles, measured data of color, or a combination thereof. In one example, the prediction model can be a neural network trained with the aforementioned measured data. The appearance characteristics can be stored in the color database.
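
As a non-limiting illustration of generating a measured texture function statistically from the pixel intensity distribution of a coating image, the following minimal Python sketch summarizes a synthetic grayscale image with a normalized histogram and simple statistics; the chosen statistics are assumptions of the sketch, not the specific texture function of the system.

    # Minimal sketch: derive a texture function from the pixel intensity
    # distribution of a (synthetic) grayscale coating image.
    import numpy as np

    rng = np.random.default_rng(1)
    # Stand-in for a grayscale image of a coated test panel (values 0-255).
    image = rng.normal(loc=120, scale=25, size=(512, 512)).clip(0, 255)

    def texture_function(img: np.ndarray, bins: int = 32) -> dict:
        """Summarize the pixel intensity distribution of a coating image."""
        hist, edges = np.histogram(img, bins=bins, range=(0, 255), density=True)
        return {"histogram": hist, "bin_edges": edges,
                "mean": float(img.mean()), "std": float(img.std())}

    print(texture_function(image)["std"])   # coarse indicator of texture coarseness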

[92] The specimen appearance data can comprise the specimen sparkle data, a specimen texture function, or a combination thereof. The specimen texture function can be selected from a measured specimen texture function, a derived specimen texture function, or a combination thereof. The specimen appearance data can further comprise shape or contour data, environmental data, one or more images, or a combination thereof, associated with the target coating or the article. The measured specimen texture function can be generated statistically, as described above, by measuring the pixel intensity distribution of an image of the target coating. The derived specimen texture function can be generated based on the specimen sparkle data and specimen color data, the identifier of the article, or a combination thereof. The derived specimen texture function can be generated based on the specimen sparkle data and specimen color data using a model, such as a neural network. In one example, a neural network can be trained using measured sparkle data, color data and texture data of a plurality of known coatings to predict the texture function of a new coating based on measured color data and sparkle data of the new coating. In another example, one or more measured or derived texture functions are available and associated with the identifier of the article. In yet another example, the identifier is a vehicle identification number (VIN) and one or more measured or derived texture functions are available and associated with the VIN or part of the VIN. The measured or derived texture functions can be retrieved based on the identifier and used for generating the specimen image.
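
As a non-limiting illustration of retrieving a stored texture function based on the identifier of the article, the following minimal Python sketch keys texture functions to hypothetical VIN prefixes; the database layout and the VIN values shown are assumptions of the sketch.

    # Minimal sketch: look up a stored texture function by VIN or VIN prefix
    # (hypothetical database layout and VIN values).
    from typing import Optional

    texture_db = {
        "1J4GW48S32C": {"mean": 118.2, "std": 21.4},   # texture keyed to a VIN prefix
        "1FMZU73E13Z": {"mean": 131.7, "std": 18.9},
    }

    def lookup_texture(vin: str) -> Optional[dict]:
        """Return the texture function whose stored VIN prefix matches the given VIN."""
        for prefix, texture in texture_db.items():
            if vin.startswith(prefix):
                return texture
        return None

    print(lookup_texture("1J4GW48S32C123456"))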

[93] Methods and systems described in US Patent Nos. 7,743,055, 7,747,615 and 7,639,255 can be suitable for generating and displaying the matching images and the specimen images. The process described in US Patent No. 7,991,596 for generating and displaying digital images via a bidirectional reflectance distribution function (BRDF) can also be suitable.

[94] The matching formula can be selected by an operator via visual comparison or by a computer based on predetermined selection criteria programmed into the computer.

[95] The matching display values can comprise R,G,B values based on the appearance characteristics and the color characteristics. The specimen display values can comprise R,G,B values based on the specimen appearance data and said specimen color data. The R,G,B values are commonly used in the industry to display color on digital display devices, such as a cathode ray tube (CRT), liquid crystal display (LCD), plasma display, or LED display, typically used as a television, a computer monitor, or a large-scale screen.
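
As a non-limiting illustration of producing R,G,B display values from color characteristics, the following minimal Python sketch assumes the color data are CIE L*a*b* values (D65 reference white) and converts them to 8-bit sRGB; the actual display pipeline of the system may differ from this sketch.

    # Minimal sketch: convert an assumed CIE L*a*b* color characteristic into
    # 8-bit sRGB display values (standard published conversion constants).
    def lab_to_srgb(L, a, b):
        # CIE L*a*b* -> XYZ (D65 reference white)
        fy = (L + 16.0) / 116.0
        fx, fz = fy + a / 500.0, fy - b / 200.0
        finv = lambda t: t ** 3 if t > 6.0 / 29.0 else 3 * (6.0 / 29.0) ** 2 * (t - 4.0 / 29.0)
        X, Y, Z = 0.95047 * finv(fx), 1.0 * finv(fy), 1.08883 * finv(fz)

        # XYZ -> linear sRGB
        r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
        g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
        bl = 0.0557 * X - 0.2040 * Y + 1.0570 * Z

        # gamma encoding and clipping to the 0-255 display range
        def encode(c):
            c = max(0.0, min(1.0, c))
            c = 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
            return round(255 * max(0.0, min(1.0, c)))
        return encode(r), encode(g), encode(bl)

    # e.g. a hypothetical face-angle measurement of a red effect coating
    print(lab_to_srgb(45.0, 55.0, 30.0))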

[96] The matching images can be displayed based on one or more illumination angles, one or more viewing angles, or a combination thereof. The specimen images can also be displayed based on one or more illumination angles, one or more viewing angles, or a combination thereof. In one example, a simulated curved object can be displayed on a single display to represent a matching image or a specimen image at one or more viewing angles. The images can be displayed as realistic images of coating color and appearance, such as being displayed based on the shape of a vehicle, or a portion thereof. Any of the aforementioned vehicles can be suitable. The environment that a vehicle is situated within can also be reflected in the specimen images or the matching images. Examples of the environment data or the environmental characteristics can include environmental lighting, shades, objects around the vehicle, ground, water or landscape, or a combination thereof.

[97] To better represent color and sparkle associated with the matching image, at least one of said matching images or the specimen images can be generated as a high dynamic range (HDR) matching image or HDR specimen images, respectively. The HDR matching image can be generated using the aforementioned bidirectional reflectance distribution function (BRDF) described in US Patent No. 7,991,596. The BRDF can be particularly useful for generating HDR images having sparkles of very high intensity together with color characteristics. The matching images and the specimen images can also be generated directly based on the sparkle characteristics and the color characteristics, or the specimen sparkle data and specimen color data, respectively. When sparkles are to be displayed in the HDR matching image or the HDR specimen images, an HDR display device can be preferred.

[98] The display device can be a computer monitor, a projector, a TV screen, a tablet, a personal digital assistant (PDA) device, a cell phone, a smart phone that combines PDA and cell phone, an iPod or MP3 player, a flexible thin film display, a high dynamic range (HDR) image display device, a low dynamic range (LDR) display device, a standard dynamic range (SDR) display device, or any other display device that can display information or images based on digital signals. The display device can also be a printing device that prints, based on digital signals, information or images onto papers, plastics, textiles, or any other surfaces suitable for printing the information or images onto. The display device can also be a multi-functional display/input/output device, such as a touch screen. The HDR images, either the HDR matching images or the HDR specimen images, can be displayed on an HDR image display device, a non-HDR image display device mentioned herein, or a combination thereof. The non-HDR image display device can be any of the aforementioned display devices, such as standard, low dynamic range (LDR), or standard dynamic range (SDR) display devices. The HDR image needs to be modified to be displayed on a non-HDR image display device. Since the sparkles can have very high intensity, they can be difficult to display together with color characteristics in the same image. The HDR image can be used to improve the display of sparkles and colors.
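
As a non-limiting illustration of modifying an HDR image for display on a non-HDR (LDR or SDR) device, the following minimal Python sketch applies a global Reinhard-style tone-mapping operator to synthetic HDR luminance data; the operator and the data are assumptions of the sketch and are not the specific modification used by the system.

    # Minimal sketch: compress HDR luminance (with very bright sparkle pixels)
    # into the 0-255 range of a standard display.
    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic HDR luminance: a dim base coat with a few very bright sparkle pixels.
    hdr = rng.random((256, 256)) * 0.2
    hdr[rng.integers(0, 256, 300), rng.integers(0, 256, 300)] = rng.random(300) * 50.0

    def tonemap(l_hdr: np.ndarray) -> np.ndarray:
        """Map HDR luminance onto an 8-bit range using a Reinhard-style operator."""
        l_sdr = l_hdr / (1.0 + l_hdr)          # global tone-mapping curve
        return (255.0 * l_sdr).astype(np.uint8)

    ldr = tonemap(hdr)
    print(hdr.max(), ldr.max())                 # sparkles remain visible but compressed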

[99] The method can further comprise the steps of:

[100] A12) producing at least one matching coating composition based on one of the matching formulas; and

[101] A13) applying said matching coating composition over a damaged coating area of said target coating to form a repair coating.

[102] The matching coating composition can be produced by mixing the ingredients or components based on the matching formula. In one example, the matching coating composition can be produced by mixing polymers, solvents, pigments, dyes, effect pigments such as aluminum flakes, and other coating additives or components based on a matching formula. In another example, the matching coating composition can be produced by mixing a number of premade components, such as crosslinking components having one or more crosslinking functional groups, crosslinkable components having one or more crosslinkable functional groups, tints having dispersed pigments or effect pigments, solvents, and other coating additives or ingredients. In yet another example, the matching coating composition can be produced by mixing one or more radiation curable coating components, tints or pigments or effect pigments, and other components. In yet another example, the matching coating composition can be produced by mixing one or more components comprising latex and effect pigments. Any typical components suitable for a coating composition can be suitable. The solvents can be one or more organic solvents, water, or a combination thereof.
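
As a non-limiting illustration of producing a coating composition from a matching formula, the following minimal Python sketch proportionally scales hypothetical parts-by-weight components to a requested batch size; the component names and amounts are assumptions of the sketch.

    # Minimal sketch: scale a matching formula's components to a batch size
    # (hypothetical components and parts-by-weight values).
    matching_formula = {            # parts by weight
        "binder resin": 55.0,
        "solvent blend": 30.0,
        "red pigment dispersion": 9.0,
        "aluminum flake tint": 5.0,
        "additives": 1.0,
    }

    def batch_amounts(formula: dict, batch_grams: float) -> dict:
        """Scale formula parts to gram amounts for the requested batch size."""
        total = sum(formula.values())
        return {name: batch_grams * parts / total for name, parts in formula.items()}

    print(batch_amounts(matching_formula, 350.0))   # amounts for a 350 g repair batch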

[103] The coating composition can be applied over the article or the damaged coating area by spraying, brushing, dipping, rolling, drawdown, or any other coating application techniques known to or developed by those skilled in the art. In one example, a coating damage on a car can be repaired by spraying the matching coating composition over the damaged area to form a wet coating layer. The wet coating layer can be cured at ambient or elevated temperatures in a range of from 15°C to 150°C.

[104] This disclosure is further directed to a system for matching color and appearance of a target coating of an article. The system can comprise:

[105] a) a color measuring device;

[106] b) a sparkle measuring device;

[107] c) a color database comprising formulas for coating compositions and interrelated sparkle characteristics, color characteristics, and one or more identifiers of articles;

[108] d) a computing device comprising an input device and a display device, said computing device is functionally coupled to said color measuring device, said sparkle measuring device, and said color database; and

[109] e) a computer program product residing in a storage media functionally coupled to said computing device, said computer program product causes said computing device to perform a computing process comprising the steps of:

[110] C1) receiving specimen sparkle values of the target coating from said sparkle measuring device, said specimen sparkle values are measured at one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof;

[111] C2) receiving specimen color data of the target coating from said color measuring device, said specimen color data are measured at two or more color viewing angles, one or more illumination angles, or a combination thereof;

[112] C3) receiving an identifier of said article from said input device;

[113] C4) generating specimen flop values based on said specimen color data;

[114] C5) retrieving from said color database one or more preliminary matching formulas based on said specimen color data, said identifier of said article, or a combination thereof;

[115] C6) generating one or more sparkle differences (ΔSg) between sparkle characteristics of each of said preliminary matching formulas and said specimen sparkle values at each of said one or more sparkle viewing angles;

[116] C7) generating one or more flop differences (ΔF) between flop characteristics derived from color characteristics of each of said preliminary matching formulas and said specimen flop values;

[117] C8) generating one or more color difference indexes (CDI) between said specimen color data and color characteristics of each of said preliminary matching formulas; and

[118] C9) producing a ranking list of said preliminary matching formulas based on said sparkle differences (ΔSg), said flop differences (ΔF), and said color difference indexes (CDI).

[119] Any color measuring devices capable of measuring color data at the two or more color viewing angles can be suitable. Any sparkle measuring devices capable of measuring sparkle data at the one or more sparkle viewing angles, one or more sparkle illumination angles, or a combination thereof, can be suitable. The color measuring device and the sparkle measuring device can also be combined into a single device. Commercially available devices, such as the aforementioned BYK-mac, can be suitable.

[120] Any computing devices can be suitable. A portable computing device, such as a laptop, a smart phone, a tablet, or a combination, can be suitable. A computing device can also be a built-in processing device of a color measuring device or a sparkle measuring device. The computing device can have shared input and/or display device with another device, such as a color measuring device or a sparkle measuring device.

[121] In the system disclosed above, the computing process can further comprise a ranking process for producing the ranking list. The ranking process can comprise the steps of:

[122] B1) grouping said one or more preliminary matching formulas into one or more category groups based on said sparkle differences (ΔSg) and said flop differences (ΔF) according to predetermined ranges of ΔSg values and ΔF values; and

[123] B2) ranking the preliminary matching formulas in each of the category groups based on said color difference indexes (CDI).

[124] In the system disclosed above, the computing process can further comprise the steps of:

[125] C10) displaying on the display device the ranking list, one or more preliminary matching formulas based on predetermined values of sparkle differences, flop differences, or color difference indexes, said sparkle differences (ΔSg), said flop differences (ΔF), said color difference indexes (CDI), or a combination thereof;

[126] C11) receiving a selection input from said input device to select one or more matching formulas from said ranking list; and

[127] C12) displaying said one or more matching formulas on said display device.

[128] In one example, the ranking list is displayed. In another example, the ranking list and the top-ranked matching formula can be displayed. In yet another example, the ranking list and the top three matching formulas can be displayed.
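
As a non-limiting illustration of the grouping and ranking process of steps B1) and B2) above and of the ranking list of step C9), the following minimal Python sketch groups candidate formulas into category groups using assumed ΔSg and ΔF thresholds and then ranks the formulas within each group by CDI; the threshold values and formula data are assumptions of the sketch.

    # Minimal sketch: group candidate formulas by ΔSg/ΔF ranges, then rank each
    # group by CDI (thresholds and data are illustrative assumptions).
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        formula_id: str
        delta_sg: float   # sparkle difference ΔSg
        delta_f: float    # flop difference ΔF
        cdi: float        # color difference index

    def category(c: Candidate) -> int:
        """Assign a category group from predetermined ΔSg and ΔF ranges (assumed)."""
        sg_ok = abs(c.delta_sg) <= 1.0
        f_ok = abs(c.delta_f) <= 0.05
        if sg_ok and f_ok:
            return 1          # best agreement in both sparkle and flop
        if sg_ok or f_ok:
            return 2
        return 3

    def ranking_list(candidates):
        """Rank candidates first by category group, then by CDI within each group."""
        return sorted(candidates, key=lambda c: (category(c), c.cdi))

    candidates = [Candidate("F1", 0.4, 0.02, 2.3),
                  Candidate("F2", 2.1, 0.01, 0.9),
                  Candidate("F3", 0.2, 0.03, 1.1)]
    for c in ranking_list(candidates):
        print(category(c), c.formula_id, c.cdi)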

[129] In the system disclosed above, the computing process can further comprise the steps of:

[130] C13) generating matching images having matching display values based on appearance characteristics and the color characteristics of at least one of said preliminary matching formulas at at least one of said one or more color viewing angles, and generating at least one specimen image having specimen display values based on specimen appearance data and said specimen color data;

[131] C14) displaying said matching images and said at least one specimen image on said display device; and

[132] C15) receiving a selection input from said input device to select one or more matching formulas; and

[133] C16) displaying said one or more matching formulas on said display device.

[134] The matching images, the specimen images, the animated matching images, the animated specimen images, or a combination thereof, can also be displayed. A combination of the ranking list, the matching formulas, the matching images, and the specimen images can also be displayed on the display devices. The system can also have one or more additional display devices. The ranking list, the formulas, the images, or a combination thereof, can also be displayed on one or all of the one or more display devices.

[135] The display device of the system can be a video display device for displaying the animated matching images or the animated specimen images.

[136] The matching formulas can be selected by a computer, an operator, or a combination thereof. In one example, the computer program product can comprise computer executable codes to select the top ranked preliminary matching formula as the matching formula. In another example, the computer program product can comprise computer executable codes to select the top ranked preliminary matching formula and display the formula on the display device, then prompt for input by an operator to select the matching formula. In yet another example, the computer program product can comprise computer executable codes to select the top ranked preliminary matching formula as the matching formula and display the formula and an image of the formula on the display device, then prompt for input by an operator to select the matching formula. In yet another example, the computer program product can comprise computer executable codes to select the top ranked preliminary matching formula as the matching formula and display the formula, an image of the formula, and the specimen image on the display device, then prompt for input by an operator to select the matching formula. In yet another example, one or more matching formulas are displayed on the display device and the operator is prompted to select the matching formula. In yet another example, one or more matching images and at least one specimen image can be displayed on the display device and the operator can be prompted to select or further adjust the formula to produce the matching formulas. The operator can use the input device or other devices, such as a touch screen, mouse, touch pen, keyboard, or a combination thereof, to enter his or her selection. The operator can also select the matching formula by noting an identifier of the formula, such as a formula code, without entering any input into the system.

[137] The system disclosed herein can further comprise a mixing system. The mixing system can be functionally coupled to the computing device. The computing process can further comprise the step of outputting one of the one or more matching formulas to the mixing system to produce a matching coating composition based on said matching formula. The mixing system can also be stand-alone. The matching formulas produced herein can be entered into the mixing system manually or via one or more electronic data files. A typical mixing system having the capability to store, deliver, and mix a plurality of components can be suitable.

[138] The system disclosed herein can further comprise a coating application device to apply said matching coating composition over a damaged coating area of said target coating to form a repair coating. Typical coating application devices, such as spray guns, brushes, rollers, coating tanks, electrocoating devices, or a combination thereof, can be suitable.

EXAMPLES

The present invention is further defined in the following Examples. It should be understood that these Examples, while indicating preferred embodiments of the invention, are given by way of illustration only. From the above discussion and these Examples, one skilled in the art can ascertain the essential characteristics of this invention, and without departing from the spirit and scope thereof, can make various changes and modifications of the invention to adapt it to various uses and conditions.

EXAMPLE 1

The coating of a 2002 Jeep Cherokee was measured (target coating 1). Based on the vehicle's make, model year 2002 and its color code PDR, a number of preliminary matching formulas (F1-F7) were retrieved from ColorNet®, an automotive refinish color system available from E. I. du Pont de Nemours and Company, Wilmington, DE, USA, under respective trademarks or registered trademarks (Table 1).

The color data and sparkle values were measured using a BYK-mac, available from BYK-Gardner USA, Maryland, USA. The flop value of the coating of the vehicle was generated based on color data measured at 3 viewing angles selected from 15°, 45°, and 110°. The sparkle data were based on images captured at the normal direction as shown in FIG. 2 with illumination angles selected from 15° and 45°.

The flop characteristics of the matching formulas are stored in a color database of the ColorNet® system and have data compatible with the viewing angles at which the vehicle was measured. The sparkle characteristics of the matching formulas are stored in the color database and have data compatible with the illumination angles at which the vehicle was measured.

The flop differences (ΔF) were calculated according to the flop value of the target coating (FSpec) and the flop value of each of the preliminary matching formulas (FMatch) based on the equation:

ΔF = (FMatch − FSpec) / FSpec.
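
As a non-limiting illustration, the following minimal Python sketch computes a flop value from L* measured at the three viewing angles of this Example and then the flop difference ΔF of the equation above; the flop index formulation shown, 2.69·(L*15 − L*110)^1.11 / (L*45)^0.86, is a commonly used industry formulation assumed for this sketch, and the L* values are hypothetical.

    # Minimal sketch: a commonly used flop index from L* at 15°, 45°, 110°,
    # and the flop difference ΔF of the equation above (hypothetical L* values).
    def flop_index(l15: float, l45: float, l110: float) -> float:
        return 2.69 * (l15 - l110) ** 1.11 / l45 ** 0.86

    f_spec = flop_index(l15=110.0, l45=60.0, l110=25.0)     # target coating
    f_match = flop_index(l15=112.0, l45=61.0, l110=24.0)    # a preliminary formula

    delta_f = (f_match - f_spec) / f_spec                   # ΔF = (FMatch − FSpec)/FSpec
    print(round(f_spec, 2), round(f_match, 2), round(delta_f, 3))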

The sparkle differences (ΔSg) at the specified angles are provided in Table 1.

The preliminary matching formulas F1-F7 were grouped into category groups (Cat. 1-4) based on ΔF and ΔSg (Table 1), with category 1 having the least difference.

In this Example, preliminary matching formulas in categories 2 and 3 were not considered further.

The preliminary matching formulas in category 1 were ranked based on the color difference index originally obtained from the color database (Ori. CDI). When the Ori. CDI was greater than a predetermined value, such as a value of "2" in this example, the formula was adjusted using the ColorNet® System to produce a subsequent preliminary matching formula having a subsequent color difference index (sub-CDI). The subsequent preliminary matching formulas were ranked again based on the sub-CDI (Table 2).

Table 1. Coating and formula data.

Table 2. Ranking List of Matching Formulas.

The top ranked formula F7 was selected as the matching formula.

EXAMPLE 2

The coating of a 2003 Ford Explorer was measured (target coating 2). Based on the vehicle's make, model year 2003 and its color code JP, a number of preliminary matching formulas (F8-F13) were retrieved from the ColorNet® automotive refinish color system (Table 3). The preliminary matching formulas were analyzed as described above and ranked as shown in Table 4. The formulas in Category group 2 were adjusted to produce subsequent matching formulas having subsequent CDIs (sub-CDI).

Table 3. Flop and sparkle data.

Table 4. Ranking List of Matching Formulas.

The top ranked formula F13 was selected as the matching formula.