**METHOD OF COMPUTING GLOBAL-TO-LOCAL METRICS FOR RECOGNITION**

PIOVANO JEROME (FR)

ROUSSON MIKAEL (FR)

SOLEM JAN ERIK (SE)


**G06N7/00**

US20090175509A1 | 2009-07-09

US20080301133A1 | 2008-12-04

US6636849B1 | 2003-10-21

DAVIS, JASON V.; KULIS, BRIAN; JAIN, PRATEEK; SRA, SUVRIT; DHILLON, INDERJIT S.: "Information-theoretic metric learning", 2007, XP002617812, ISBN: 978-1-59593-793-3

D. RAMANAN; S. BAKER.: "International Conference on Computer Vision", September 2009, ICCV, article "Local Distance Functions: A Taxonomy, New Algorithms, and an Evaluation"

K. WEINBERGER; J. BLITZER; L. SAUL.: "Advances in Neural Information Processing Systems 18", 2006, MIT PRESS, article "Distance Metric Learning for Large Margin Nearest Neighbor Classification", pages: 1473 - 1480

C. DOMENICONI; J. PENG; D. GUNOPULOS: "Locally Adaptive Metric Nearest Neighbor Classification", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 24, 2002, pages 1281 - 1285, XP011094826, DOI: doi:10.1109/TPAMI.2002.1033219

T. HASTIE; R. TIBSHIRANI: "Discriminant Adaptive Nearest Neighbor Classification", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 18, no. 6, June 1996 (1996-06-01), pages 607 - 616, XP000620142, DOI: doi:10.1109/34.506411

J. V. DAVIS; B. KULIS; P. JAIN; S. SRA; I. S. DHILLON.: "Information-theoretic metric learning", PROCEEDINGS OF THE 24TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING, 20 June 2007 (2007-06-20)

Claims

1. A method for global-to-local metric learning for classification and recognition, comprising the steps of:

-using a tree structure constructed with a clustering algorithm at each level, and

-associating a metric to each one of the tree nodes.

2. The method according to claim 1, wherein said clustering algorithm is K-means clustering.

3. The method according to claim 1, wherein said metric is a symmetric matrix obtained with the ITML algorithm.

4. The method according to claim 1, wherein said clustering algorithm uses the local metric at each node.

5. A computer program stored in a computer-readable storage medium and executed in a computational unit for global-to-local metric learning according to claim 1.

6. The computer program according to claim 5, further comprising the steps of:

-finding a relevant local metric for a given feature comparison, and

-using this local metric for classification or recognition.

7. A system for recognition comprising a computer program according to claim 5, further using feature representations which are compared with a method according to claim 6.

8. The system according to claim 7, wherein the feature representations represent objects in images.

9. The system according to claim 8, wherein the objects are faces.

Background of the invention

The classification problem can be formulated as a verification scheme where the objective is to determine whether a pair of points is positive or negative, i.e. positive if the points belong to the same class and negative otherwise. Given a set of labeled data, one can try to learn the metric that gives the most discriminative distance in a given feature space for this task, that is, a low distance between the points of a positive pair and a high distance between the points of a negative pair. A detailed overview of this type of method can be found in [1].
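The verification view above can be made concrete with a small sketch. The notation below, a linear transform `L` inducing the metric, is an assumption for illustration and is not taken from the source: a learned metric should yield a low value of d(x, y) = ||L(x - y)|| for positive pairs and a high value for negative pairs.

```python
import numpy as np

def pair_distance(L, x, y):
    # Mahalanobis-type distance induced by a (hypothetical) learned linear
    # transform L: d(x, y) = ||L (x - y)||, i.e. the Euclidean distance of
    # the difference vector after transforming it.
    return float(np.linalg.norm(L @ (x - y)))

# With the identity transform this reduces to the plain Euclidean distance.
L = np.eye(2)
print(pair_distance(L, np.array([0.0, 0.0]), np.array([3.0, 4.0])))  # 5.0
```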

Global metric learning has become popular as a way to improve classification algorithms such as the K-nearest neighbors (KNN) classification algorithm [2]. It often consists of estimating the optimal covariance matrix of the Mahalanobis distance that will be used for classification. While these global metrics have shown impressive improvements for classification, they do not capture local properties of the feature space that may be relevant for complex data distributions. To overcome this difficulty, a two-step approach is generally employed [3, 4]. First, a global metric is learned and the training points in the feature space are transformed accordingly; second, a local metric is estimated in the neighborhood of each transformed training point. These local metrics allow for better adaptiveness to the data but often require a heuristic choice of locality.

Summary of the invention

The proposed invention is instead a hierarchical global-to-local approach, where the metric is iteratively refined and learned using the data distribution itself. The approach starts by estimating a global metric and applying the corresponding metric transformation to the data. The transformed points are then clustered to obtain a set of K clusters. These two steps are applied recursively to each cluster until a termination criterion is satisfied on the cluster; such a criterion can be, e.g., a maximal height in the tree, a minimal variance of the data points in the cluster, or a minimum number of data points in the cluster. This forms a tree with a metric associated to each node.
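The recursive construction can be sketched as follows. This is a minimal illustration under stated assumptions: whitening by the inverse square root of the covariance stands in for the metric-learning step (e.g. ITML), a bare-bones K-means stands in for the clustering algorithm, and the node layout (`metric`, `children`) is hypothetical.

```python
import numpy as np

def learn_metric(points):
    # Stand-in for a metric-learning step such as ITML: whiten the points,
    # i.e. use the inverse square root of the (regularized) covariance
    # matrix as the linear transformation of a Mahalanobis metric.
    if len(points) < 2:
        return np.eye(points.shape[1])
    cov = np.cov(points.T) + 1e-6 * np.eye(points.shape[1])
    evals, evecs = np.linalg.eigh(cov)
    return evecs @ np.diag(evals ** -0.5) @ evecs.T

def kmeans(points, k, iters=20, seed=0):
    # Bare-bones K-means returning a cluster label per point.
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def build_tree(points, k=2, max_height=3, min_points=10):
    # Learn a metric, transform the points, cluster the transformed points,
    # and recurse on each cluster until a termination criterion is met
    # (maximal height or minimum cluster size in this sketch).
    metric = learn_metric(points)
    node = {"metric": metric, "children": []}
    if max_height == 0 or len(points) < k * min_points:
        return node
    transformed = points @ metric.T
    labels = kmeans(transformed, k)
    for j in range(k):
        node["children"].append(
            build_tree(transformed[labels == j], k, max_height - 1, min_points))
    return node
```

Each recursive call receives points already transformed by all ancestor metrics, so every node's metric is learned in the coordinate frame produced by its parents.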

In other words, a method of computing global-to-local metrics for recognition is provided. Based on training examples with feature representations, the method automatically computes a local metric that varies over the space of feature representations to optimize discrimination and the performance of recognition systems.

Given a set of points in an arbitrary feature space, local metrics are learned in a hierarchical manner that give low distances between points of the same class and high distances between points of different classes. Rather than considering a global metric, a class-based metric or a point-based metric, the proposed invention applies successive clustering to the data and associates a metric to each one of the clusters.

Brief description of the drawings

Figure 1 illustrates a schematic layout of a global-to-local metric tree.

Figure 2 illustrates systems comprising a camera and a computational unit using the global-to-local metric.

Detailed Description

Below follows a detailed description of the invention.

Given a set of labeled points {x_i, l_i}, we successively apply a global metric learning algorithm on sets of hierarchically clustered points. This has the effect of forming a metric tree. For simplicity of presentation, we assume the number of clusters for each instance of the clustering algorithm to be constant and equal to K. Let C_j^l be the j-th cluster at level l and M_j^l be the associated metric transformation matrix. The metric is learned on the set of transformed points {y_i : x_i in C_j^l}, where y_i denotes x_i after the transformations associated with the ancestor nodes have been applied, e.g. using, but not restricted to, the Information Theoretic Metric Learning algorithm (ITML) proposed in [5].

Before applying the clustering algorithm to a node C_j^l, we apply the transformation associated with the metric of that node to the points given by its parent nodes.

We can now use our metric tree to evaluate the distance between any two data points, e.g. image descriptors, face feature vectors or other feature representations. First, each point is injected into the tree and its path is recovered. In particular, we identify the last node of the tree that contains both points. The distance between the points is then the one obtained using the metric associated with this node, possibly compounded with the metrics of the parent nodes.
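The point injection and deepest-common-node lookup can be sketched as follows, under an assumed node layout: each node holds the linear transform of its metric (`metric`), the cluster centres used to route points (`centers`), and its `children`. None of these names are prescribed by the source.

```python
import numpy as np

def path(node, x):
    # Inject a point into the tree: transform it with the node's metric,
    # then descend into the child whose cluster centre is nearest,
    # recording every node visited along the way.
    nodes = [node]
    x = node["metric"] @ x
    while node["children"]:
        j = int(np.argmin([np.linalg.norm(x - c) for c in node["centers"]]))
        node = node["children"][j]
        nodes.append(node)
        x = node["metric"] @ x
    return nodes

def tree_distance(root, a, b):
    # Find the deepest node shared by the paths of both points, then
    # measure the distance with the metrics compounded along that shared
    # path (worst case: only the root, i.e. the global metric).
    pa, pb = path(root, a), path(root, b)
    depth = 0
    while depth + 1 < min(len(pa), len(pb)) and pa[depth + 1] is pb[depth + 1]:
        depth += 1
    M = np.eye(len(a))
    for node in pa[:depth + 1]:
        M = node["metric"] @ M
    return float(np.linalg.norm(M @ (a - b)))
```

Two nearby points that fall into the same leaf are compared with the most local (compounded) metric; points split at the root are compared with the global metric alone.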

In the worst case, the last common node is the root of the tree, and therefore, the distance will use the global metric. The deeper in the tree the common node is, the more local the metric is. Compared to pure local or global methods, this approach has the advantage of refining the metric in dense or complex areas, according to the termination criterion (e.g. maximum leaf size, maximum leaf variance, maximum height limit).

A possible issue with this formulation is the high dependence on the clustering boundary: points can be close to each other and yet be separated quite early in the clustering tree. To reduce the influence of this decision, it is possible to construct multiple metric trees and average the distances given by each one of them.
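The averaging over multiple trees can be sketched as follows; the function name and the callable-based interface are illustrative assumptions, not part of the source.

```python
def ensemble_distance(trees, tree_distance, a, b):
    # Average the distance between a and b over several metric trees built
    # from the same data with different clustering initialisations; the
    # average is less sensitive to any single clustering boundary.
    return sum(tree_distance(t, a, b) for t in trees) / len(trees)
```

Each tree in `trees` would be built with a different random seed for the clustering step, and `tree_distance` is any function computing the global-to-local distance within one tree.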

In a preferred embodiment of the invention, a method for global-to-local metric learning is presented, the method comprising the steps of:

-learning a metric tree (as may be illustrated in Figure 1),

-classifying or comparing test data using this metric tree.

In another embodiment of the present invention, a computer program is provided, stored in a computer-readable storage medium and executed in a computational unit for global-to-local metric learning, comprising the steps of: learning a metric tree, and classifying or comparing test points using this metric tree.

In yet another embodiment of the present invention, a system for global-to-local metric learning and classification is provided, containing a computer program for global-to-local metric learning comprising the steps of:

-learning a metric tree,

-classifying or comparing test data using this metric tree.

In another embodiment of the present invention a system or device is used for obtaining images, analyzing, and responding to results from classification using a global-to-local metric, as may be seen in Figure 2. Such a system may include at least one image acquisition device 101 and a computational device 100.

We have described the underlying method used for the present invention together with a list of embodiments. Possible application areas for the invention described above include, but are not restricted to, object recognition, face recognition and classification of image content.

References

[1] D. Ramanan, S. Baker, "Local Distance Functions: A Taxonomy, New Algorithms, and an Evaluation," International Conference on Computer Vision (ICCV), Kyoto, Japan, Sept. 2009.

[2] K. Weinberger, J. Blitzer, L. Saul, "Distance Metric Learning for Large Margin Nearest Neighbor Classification," Advances in Neural Information Processing Systems 18, MIT Press, Cambridge, MA, pp. 1473-1480, 2006.

[3] C. Domeniconi, J. Peng and D. Gunopulos, "Locally Adaptive Metric Nearest Neighbor Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 1281-1285, 2002.

[4] T. Hastie, R. Tibshirani, "Discriminant Adaptive Nearest Neighbor Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 607-616, June 1996.

[5] J. V. Davis, B. Kulis, P. Jain, S. Sra, and I. S. Dhillon, "Information-theoretic Metric Learning," in Proceedings of the 24th International Conference on Machine Learning, Corvallis, Oregon, June 20-24, 2007.
