
Supervised Distance Metric Learning
Presented at CMU's Computer Vision Misc-Read Reading Group, May 9, 2007, by Tomasz Malisiewicz

Overview
- Short metric learning overview
- Eric Xing et al (2002)
- Kilian Weinberger et al (2005)
- Local distance functions: Andrea Frome et al (2006)

What is a metric?
- Formally, a metric on a space X is a function satisfying: non-negativity, identity of indiscernibles, symmetry, and the triangle inequality
- Just ignore the details, and think of it as a distance function that measures similarity
- Depending on context and mathematical properties, it goes by other useful names: similarity function, distance function, distance metric, similarity measure, kernel function, matching measure
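The four metric axioms are easy to sanity-check on a finite sample of points. A minimal sketch (function name and the sample points are hypothetical) that also shows why squared Euclidean distance is a useful similarity measure but not a true metric:

```python
import numpy as np

def is_metric_on_sample(dist, points, tol=1e-9):
    """Check the metric axioms (non-negativity, identity of indiscernibles,
    symmetry, triangle inequality) on a finite sample of points."""
    n = len(points)
    for i in range(n):
        for j in range(n):
            d_ij = dist(points[i], points[j])
            if d_ij < -tol:                                   # non-negativity
                return False
            if (d_ij <= tol) != bool(np.allclose(points[i], points[j])):  # identity
                return False
            if abs(d_ij - dist(points[j], points[i])) > tol:  # symmetry
                return False
            for k in range(n):                                # triangle inequality
                if d_ij > dist(points[i], points[k]) + dist(points[k], points[j]) + tol:
                    return False
    return True

# Euclidean distance is a metric; squared Euclidean distance breaks
# the triangle inequality (e.g. d(0,2)=4 > d(0,1)+d(1,2)=2).
euclidean = lambda a, b: float(np.linalg.norm(a - b))
squared = lambda a, b: float(np.sum((a - b) ** 2))
pts = [np.array([0.0]), np.array([1.0]), np.array([2.0])]
```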

When are distance functions used? Everywhere!
- Density estimation (e.g. Parzen windows)
- Clustering (e.g. k-means)
- Instance-based classifiers (e.g. NN)

How to choose a distance function?
- A priori: Euclidean distance, L1 distance
- Cross-validation within a small class of functions, e.g. choosing the order of a polynomial for a kernel

Learning a distance function from data
- Unsupervised metric learning (a.k.a. manifold learning): linear, e.g. PCA; non-linear, e.g. LLE, Isomap
- Supervised metric learning (using labels associated with the points): global learning; local learning

A Mahalanobis Distance Metric
- The most commonly used distance metric in the machine learning community
- d_M(x_i, x_j) = sqrt((x_i - x_j)^T M (x_i - x_j)), with M positive semi-definite
- Equivalent to first applying a linear transformation y = Ax (where M = A^T A), then using Euclidean distance in the new space of y's
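The equivalence between the Mahalanobis distance under M = A^T A and the Euclidean distance after the linear map y = Ax can be checked numerically; a minimal NumPy sketch (the matrix A and the two points are made up for illustration):

```python
import numpy as np

def mahalanobis(x1, x2, M):
    """Mahalanobis distance d_M(x1, x2) = sqrt((x1-x2)^T M (x1-x2))."""
    d = x1 - x2
    return float(np.sqrt(d @ M @ d))

# With M = A^T A, d_M(x1, x2) equals the Euclidean distance between
# the linearly transformed points y1 = A x1 and y2 = A x2.
A = np.array([[2.0, 0.0], [1.0, 1.0]])
M = A.T @ A
x1 = np.array([1.0, 2.0])
x2 = np.array([3.0, 1.0])

d_metric = mahalanobis(x1, x2, M)
d_euclid = float(np.linalg.norm(A @ x1 - A @ x2))
```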

Supervised Distance Metric Learning Global versus Local

Supervised Global Distance Metric Learning (Xing et al 2002)
- Input data has labels (C classes)
- Consider all pairs of points from the same class (the equivalence set)
- Consider all pairs of points from different classes (the inequivalence set)
- Learn a Mahalanobis distance metric that brings equivalent points closer together while keeping inequivalent points apart
E. Xing, A. Ng, and M. Jordan, "Distance metric learning with application to clustering with side-information," in NIPS, 2003.

Supervised Global Distance Metric Learning (Xing et al 2002)

Supervised Global Distance Metric Learning (Xing et al 2002)
- A convex optimization problem
- Minimize the pairwise distances between all similarly labeled examples, subject to differently labeled pairs staying apart
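The optimization problem can be written as follows (reconstructed from the cited paper; S denotes the set of same-class pairs, D the set of different-class pairs, and ||x||_M^2 = x^T M x):

```latex
\begin{aligned}
\min_{M}\ & \sum_{(x_i, x_j) \in \mathcal{S}} \lVert x_i - x_j \rVert_M^2 \\
\text{s.t.}\ & \sum_{(x_i, x_j) \in \mathcal{D}} \lVert x_i - x_j \rVert_M \ge 1, \qquad M \succeq 0 .
\end{aligned}
```

Note the constraint uses the unsquared distance; squaring it would make the problem trivially satisfiable by rescaling M.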

Supervised Global Distance Metric Learning (Xing et al 2002) Anybody see an issue with this approach?

Problems with Global Distance Metric Learning
- Trouble with multimodal classes: when a class forms multiple clusters, pulling all of its points together distorts the space

Supervised Local Distance Metric Learning
- Many different supervised distance metric learning algorithms exist that do not try to bring all points from the same class together
- Many approaches still try to learn a Mahalanobis distance metric
- They account for multimodal data by incorporating only local constraints

Supervised Local Distance Metric Learning (Weinberger et al 2005)
- Consider a KNN classifier: for each query point x, we want the K nearest neighbors of the same class to become closer to x under the new metric
K.Q. Weinberger, J. Blitzer, and L.K. Saul, "Distance metric learning for large margin nearest neighbor classification," in NIPS, 2005.

Supervised Local Distance Metric Learning (Weinberger et al 2005)
- Convex objective function (an SDP)
- Penalize large distances between each input and its target neighbors
- Penalize small distances between each input and all points of a different class
- Points from different classes are separated by a large margin
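A rough sketch of the pull/push structure of this objective (not the authors' code; `target_neighbors` maps each point index to its fixed same-class target neighbors, and `mu` trades off the two terms):

```python
import numpy as np

def lmnn_loss(X, y, L, target_neighbors, mu=0.5):
    """Sketch of an LMNN-style objective for a linear map L (metric M = L^T L).

    Pull term: squared distances from each point to its target neighbors.
    Push term: hinge penalty when a differently labeled point comes within
    a unit margin of a target-neighbor distance.
    """
    Z = X @ L.T                          # map points into the learned space
    pull, push = 0.0, 0.0
    for i, neighbors in target_neighbors.items():
        for j in neighbors:
            d_ij = np.sum((Z[i] - Z[j]) ** 2)
            pull += d_ij
            for l in range(len(X)):
                if y[l] != y[i]:         # differently labeled point ("impostor")
                    d_il = np.sum((Z[i] - Z[l]) ** 2)
                    push += max(0.0, 1.0 + d_ij - d_il)  # large-margin hinge
    return (1.0 - mu) * pull + mu * push
```

With well-separated classes and the identity map, the push term vanishes and only the pull term remains.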

Distance Metric Learning by Andrea Frome
A. Frome, Y. Singer, and J. Malik, "Image Retrieval and Classification Using Local Distance Functions," in NIPS, 2006.
At last we get something more interesting than a Mahalanobis distance metric!

Per-exemplar distance functions
- Goal: rather than learning a deformation of the space of exemplars, we want to learn a distance function for every exemplar (think KNN classifier)

Distance functions and the learning procedure
- If there are N training images, we will solve N separate learning problems (the training image for a particular learning problem is referred to as the focal image)
- Each learning problem is solved with a subset of the remaining training images (called the learning set)

Elementary Distances
- Distance functions are built on top of elementary distance measures between patch-based features
- Each input is not treated as a fixed-length vector

Elementary Patch-based Distances
- The distance function is a combination of elementary patch-based distances (distance from a patch to an image)
- With M patches in the focal image F, the focal-image-to-image distance is a linear combination of the patch-to-image distances
- d_j(F, I): the distance between the j-th patch in F and the entire image I
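The linear combination itself is a one-liner; a tiny sketch with made-up weights and elementary distances (names hypothetical):

```python
import numpy as np

def focal_to_image_distance(w, elementary_dists):
    """D(F, I) = sum_j w_j * d_j(F, I): a weighted combination of the M
    patch-to-image elementary distances for focal image F."""
    return float(np.dot(w, elementary_dists))

# Hypothetical example: 3 patches in the focal image, with learned
# non-negative weights emphasizing the second patch.
w = np.array([0.1, 0.8, 0.1])
d = np.array([2.0, 0.5, 4.0])   # d_j(F, I) for each patch j
D = focal_to_image_distance(w, d)
```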

Learning from triplets
- The goal is to learn w for each focal image from triplets of (focal image F, similar image, different image)
- Each triplet gives us one constraint

Max Margin Formulation
- Learn w for each focal image independently
- Weights must be positive
- T triplets for each learning problem
- Slack variables (as in the non-separable SVM)
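A minimal sketch of this training problem, solved here with projected subgradient descent rather than the authors' solver; the triplet encoding, step size, and regularizer are illustrative assumptions:

```python
import numpy as np

def learn_weights(triplets, n_patches, lam=0.01, lr=0.1, epochs=100):
    """Sketch of per-focal-image max-margin learning: each triplet is a pair
    (d_sim, d_diff) of elementary-distance vectors from the focal image to a
    similar and to a different image.  We want w . d_diff >= w . d_sim + 1
    (dissimilar images farther, with margin), and w must stay non-negative.
    """
    w = np.ones(n_patches)
    for _ in range(epochs):
        grad = lam * w                          # L2 regularization
        for d_sim, d_diff in triplets:
            if w @ d_diff < w @ d_sim + 1.0:    # margin constraint violated
                grad += d_sim - d_diff          # hinge subgradient
        w = np.maximum(w - lr * grad, 0.0)      # project onto w >= 0
    return w
```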

Visual Features and the Patch-to-Image Distance
- Geometric blur descriptors (for shape) at two different scales
- A naive color histogram descriptor
- Sampled at 400 or fewer edge points
- No geometric relations between features
- The distance between a feature f and an image I is the smallest L2 distance between f and a feature f' in I of the same feature type
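The "smallest L2 distance" rule is straightforward to sketch (function name is hypothetical; the features of image I are passed in as rows of a matrix, restricted to the same feature type as f):

```python
import numpy as np

def patch_to_image_distance(f, image_features):
    """Distance from focal-image feature f to image I: the smallest L2
    distance between f and any feature f' in I of the same feature type.
    `image_features` holds I's descriptors of that type as rows."""
    return float(np.min(np.linalg.norm(image_features - f, axis=1)))
```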

Image Browsing Example
- Given a focal image, sort all training images by their distance from the focal image

Image Retrieval
- Given a novel image Q (not in the training set), we want to sort all training images with respect to their distance to Q
- Problem: the local distances are not directly comparable (the weights are learned independently and are not on the same scale)

Second Round of Training: Logistic Regression to the Rescue
- For each focal image I, fit a logistic regression model to the binary (in-class versus out-of-class) training labels and the learned distances
- To classify a query image Q, we can compute the distance from Q to each training image i, then use the logistic function to get the probability that Q is in the same class as i
- Assign Q to the class with the highest mean (or sum) probability
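A sketch of the second-round classification rule. The logistic coefficients would really be fit per focal image on its in-/out-of-class labels; here they are fixed, illustrative values, and the function names are hypothetical:

```python
import numpy as np

def p_same_class(distance, alpha, beta):
    """Logistic map from a learned distance to P(same class).
    alpha, beta are the per-focal-image logistic regression coefficients."""
    return 1.0 / (1.0 + np.exp(alpha * distance + beta))

def classify(query_dists, train_labels, params):
    """Assign the query to the class with the highest mean probability,
    averaging the calibrated probabilities over each class's exemplars."""
    classes = sorted(set(train_labels))
    probs = {c: [] for c in classes}
    for d, y, (a, b) in zip(query_dists, train_labels, params):
        probs[y].append(p_same_class(d, a, b))
    return max(classes, key=lambda c: np.mean(probs[c]))
```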

Choosing Triplets for Learning
- For a focal image I, images in the same category are "similar" images and images from a different category are "different" images
- We want to choose images that are similar to the focal image according to at least one elementary distance function

Choosing Triplets for Learning
- For each of the M elementary distances in focal image F, we find the top K closest images
- If the top K contains both in-class and out-of-class images, make all triplets from the (F, in-class, out-of-class) combinations
- If the top K are all in-class images, get the closest out-of-class image beyond them, then make K triplets (and the reverse if all K are out-of-class)
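The mixed-list case of this selection rule can be sketched as follows (names hypothetical; the all-in-class and all-out-of-class fallbacks are deliberately omitted for brevity):

```python
def make_triplets(elementary_nn, focal_label, labels):
    """Sketch of triplet selection: for each elementary distance, look at
    its top-K nearest image indices; if the list mixes in-class and
    out-of-class images, pair every (in-class, out-of-class) combination.
    Returns (similar, different) index pairs; the focal image is implicit."""
    triplets = []
    for topk in elementary_nn:   # one top-K index list per elementary distance
        in_cls = [i for i in topk if labels[i] == focal_label]
        out_cls = [i for i in topk if labels[i] != focal_label]
        for i in in_cls:
            for j in out_cls:
                triplets.append((i, j))
    return triplets
```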

Caltech 101

Caltech 101: 60.3% Mean Recognition Rate

Conclusions
- Metric learning is like feature selection
- It seems that different visual object classes require different notions of similarity: color is informative for some classes, local shape for others, geometry for others
- Metric learning is well suited to instance-based classifiers like KNN
- Local learning is more meaningful than global learning

References
- NIPS 06 Workshop on Learning to Compare Examples
- Liu Yang's Distance Metric Learning Comprehensive Survey
- Xing et al, Weinberger et al, Frome et al

Thanks for listening. Questions?
