
Clustering Techniques and Applications to Image Segmentation Liang Shan shan@cs.unc.edu

Roadmap Unsupervised learning Clustering categories Clustering algorithms K-means Fuzzy c-means Kernel-based Graph-based Q&A

Unsupervised learning Definition 1 Supervised: human effort involved Unsupervised: no human effort Definition 2 Supervised: learning conditional distribution P(Y|X), X: features, Y: classes Unsupervised: learning distribution P(X), X: features Back Slide credit: Min Zhang

Clustering What is clustering?

Clustering Definition Assignment of a set of observations into subsets so that observations in the same subset are similar in some sense

Clustering Hard vs. Soft Hard: same object can only belong to a single cluster Soft: same object can belong to multiple clusters Slide credit: Min Zhang

Clustering Hard vs. Soft Hard: same object can only belong to a single cluster Soft: same object can belong to multiple clusters E.g. Gaussian mixture model Slide credit: Min Zhang

Clustering Flat vs. Hierarchical Flat: clusters are flat Hierarchical: clusters form a tree Agglomerative Divisive

Hierarchical clustering Agglomerative (Bottom-up) Compute all pairwise pattern-pattern similarity coefficients Place each of the n patterns into a class of its own Merge the two most similar clusters into one Replace the two clusters by the new cluster Re-compute inter-cluster similarity scores w.r.t. the new cluster Repeat the above steps until there are k clusters left (k can be 1) Slide credit: Min Zhang
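
The merge loop above can be sketched in a few lines of Python; this is a minimal illustration with average linkage on a toy 1-D data set (the function name and data are mine, not from the slides):

```python
def agglomerative(points, k):
    """Bottom-up clustering: start with n singleton clusters and
    repeatedly merge the two most similar (here: closest on average)
    clusters until only k remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # average pairwise distance = inter-cluster similarity score
                d = sum(abs(points[i] - points[j])
                        for i in clusters[a] for j in clusters[b])
                d /= len(clusters[a]) * len(clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters[b]   # replace the two clusters by their union
        del clusters[b]
    return clusters

data = [0.0, 0.1, 0.2, 5.0, 5.1]
merged = agglomerative(data, 2)   # indices 0-2 and 3-4 end up together
```

Recomputing every pairwise score each pass is wasteful; real implementations cache the similarity matrix, which is where the "at least quadratic" cost quoted later comes from.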

Hierarchical clustering Agglomerative (Bottom-up)

Hierarchical clustering Agglomerative (Bottom-up) 1st iteration 1

Hierarchical clustering Agglomerative (Bottom-up) 2nd iteration 1 2

Hierarchical clustering Agglomerative (Bottom-up) 3rd iteration 3 1 2

Hierarchical clustering Agglomerative (Bottom-up) 4th iteration 3 1 2 4

Hierarchical clustering Agglomerative (Bottom-up) 5th iteration 3 1 2 5 4

Hierarchical clustering Agglomerative (Bottom-up) Finally k clusters left 3 9 6 1 2 5 8 4 7

Hierarchical clustering Divisive (Top-down) Start at the top with all patterns in one cluster The cluster is split using a flat clustering algorithm This procedure is applied recursively until each pattern is in its own singleton cluster

Hierarchical clustering Divisive (Top-down) Slide credit: Min Zhang

Bottom-up vs. Top-down Which one is more complex? Which one is more efficient? Which one is more accurate?

Bottom-up vs. Top-down Which one is more complex? Top-down Because a flat clustering is needed as a "subroutine" Which one is more efficient? Which one is more accurate?

Bottom-up vs. Top-down Which one is more complex? Which one is more efficient? Which one is more accurate?

Bottom-up vs. Top-down Which one is more complex? Which one is more efficient? Top-down For a fixed number of top levels, using an efficient flat algorithm like K-means, divisive algorithms are linear in the number of patterns and clusters Agglomerative algorithms are at least quadratic Which one is more accurate?

Bottom-up vs. Top-down Which one is more complex? Which one is more efficient? Which one is more accurate?

Bottom-up vs. Top-down Which one is more complex? Which one is more efficient? Which one is more accurate? Top-down Bottom-up methods make clustering decisions based on local patterns without initially taking into account the global distribution. These early decisions cannot be undone. Top-down clustering benefits from complete information about the global distribution when making top-level partitioning decisions. Back

K-means Data set: Clusters: Codebook: Partition matrix: Minimizes functional: Iterative algorithm: Initialize the codebook V with vectors randomly picked from X Assign each pattern to the nearest cluster Recalculate the partition matrix Repeat the above two steps until convergence
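
A compact NumPy sketch of the iterative algorithm (the names V for the codebook and U for the partition follow the slide; the toy data are mine):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Initialize codebook V with random samples from X, then alternate:
    assign each pattern to its nearest cluster, recompute the centroids."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), k, replace=False)]       # random codebook
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2)
        U = d.argmin(axis=1)                          # hard partition
        newV = np.array([X[U == j].mean(axis=0) for j in range(k)])
        if np.allclose(newV, V):                      # convergence check
            break
        V = newV
    return U, V

X = np.array([[0., 0.], [0., 1.], [9., 9.], [9., 10.]])
U, V = kmeans(X, 2)   # the two tight pairs land in separate clusters
```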

K-means Disadvantages Dependent on initialization

K-means Disadvantages Dependent on initialization

K-means Disadvantages Dependent on initialization

K-means Disadvantages Dependent on initialization Select random seeds with at least distance D min Or, run the algorithm many times
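
One way to implement the first fix is rejection sampling: draw candidate seeds until every pair is at least D min apart (the function name and the threshold value below are illustrative, not from the slides):

```python
import numpy as np

def seeds_with_min_separation(X, k, d_min, seed=0, tries=1000):
    """Pick k random seeds from X whose pairwise distances are all
    at least d_min, retrying until a well-separated draw is found."""
    rng = np.random.default_rng(seed)
    for _ in range(tries):
        S = X[rng.choice(len(X), k, replace=False)]
        dists = np.linalg.norm(S[:, None] - S[None, :], axis=2)
        if dists[np.triu_indices(k, 1)].min() >= d_min:
            return S
    raise ValueError("no well-separated seeds found; lower d_min")

X = np.array([[0., 0.], [0., 1.], [9., 9.], [9., 10.]])
seeds = seeds_with_min_separation(X, 2, d_min=5.0)
```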

K-means Disadvantages Dependent on initialization Sensitive to outliers

K-means Disadvantages Dependent on initialization Sensitive to outliers Use K-medoids

K-means Disadvantages Dependent on initialization Sensitive to outliers (K-medoids) Can deal only with clusters with spherical symmetrical point distribution Kernel trick

K-means Disadvantages Dependent on initialization Sensitive to outliers (K-medoids) Can deal only with clusters with spherical symmetrical point distribution Deciding K

Deciding K Try a couple of K Image: Henry Lin

Deciding K When k = 1, the objective function is 873.0 Image: Henry Lin

Deciding K When k = 2, the objective function is 173.1 Image: Henry Lin

Deciding K When k = 3, the objective function is 133.6 Image: Henry Lin

Deciding K We can plot objective function values for k = 1 to 6 The abrupt change at k = 2 is highly suggestive of two clusters "knee finding" or "elbow finding" Note that the results are not always as clear-cut as in this toy example Back Image: Henry Lin
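
Using the objective values quoted on the previous slides (873.0, 173.1, 133.6), knee finding can be automated by looking for the largest drop in the objective as k increases:

```python
# Objective function values quoted on the slides for k = 1, 2, 3
obj = {1: 873.0, 2: 173.1, 3: 133.6}

# "Elbow finding": the knee is where adding one more cluster
# stops paying off, i.e. where the drop in the objective collapses.
drops = {k: obj[k - 1] - obj[k] for k in (2, 3)}
knee = max(drops, key=drops.get)   # k = 2: drop of ~700 vs. ~40 at k = 3
```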

Fuzzy C-means Data set: Clusters: Codebook: Partition matrix: K-means: Soft clustering Minimize functional fuzzy partition matrix fuzzification parameter, usually set to 2

Fuzzy C-means Minimize subject to

Fuzzy C-means Minimize subject to How to solve this constrained optimization problem?

Fuzzy C-means Minimize subject to How to solve this constrained optimization problem? Introduce Lagrange multipliers

Fuzzy c-means Introduce Lagrange multipliers Iterative optimization Fix V, optimize w.r.t. U Fix U, optimize w.r.t. V
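
The two alternating closed-form updates (obtained by setting the Lagrangian's derivatives to zero) look like this in NumPy; the symbols U, V, m match the slides, while the toy data and defaults are mine:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: alternate (1) fix V, optimize w.r.t. U and
    (2) fix U, optimize w.r.t. V, each step in closed form."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)  # c x n
        d = np.maximum(d, 1e-12)              # guard against zero distance
        # u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1)); each column sums to 1
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)),
                         axis=1)
        # v_i = weighted mean of the data with weights u_ij^m
        W = U ** m
        V = (W @ X) / W.sum(axis=1, keepdims=True)
    return U, V

X = np.array([[0., 0.], [0., 1.], [9., 9.], [9., 10.]])
U, V = fcm(X, 2)   # soft memberships; hardening U recovers the two pairs
```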

Application to image segmentation Original images Segmentations Homogeneous intensity corrupted by 5% Gaussian noise Accuracy = 96.02% Sinusoidal inhomogeneous intensity corrupted by 5% Gaussian noise Accuracy = 94.41% Back Image: Dao-Qiang Zhang, Song-Can Chen

Kernel substitution trick Kernel K-means Kernel fuzzy c-means
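
The point of the substitution trick is that feature-space distances need only kernel evaluations. For the Gaussian RBF kernel used next, K(x, x) = 1, so ||phi(x) - phi(v)||^2 collapses to 2(1 - K(x, v)); a minimal sketch (function names are mine):

```python
import numpy as np

def rbf(x, y, sigma=1.0):
    """Gaussian RBF kernel K(x, y) = exp(-||x - y||^2 / sigma^2)."""
    return np.exp(-np.sum((x - y) ** 2) / sigma ** 2)

def feature_dist2(x, v, sigma=1.0):
    """||phi(x) - phi(v)||^2 = K(x,x) + K(v,v) - 2 K(x,v)
    = 2 (1 - K(x,v)) for the RBF kernel, without ever forming phi."""
    return 2.0 * (1.0 - rbf(x, v, sigma))

x = np.array([0.0, 0.0])
v = np.array([3.0, 4.0])
d2 = feature_dist2(x, v)   # close to 2: the points are far apart
```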

Kernel substitution trick Kernel fuzzy c-means Confine ourselves to the Gaussian RBF kernel Introduce a penalty term containing neighborhood information Equation: Dao-Qiang Zhang, Song-Can Chen

Spatially constrained KFCM : the set of neighbors that exist in a window around : the cardinality of controls the effect of the penalty term The penalty term is minimized when Membership value for x j is large and also large at neighboring pixels Vice versa Equation: Dao-Qiang Zhang, Song-Can Chen

FCM applied to segmentation FCM Accuracy = 96.02% KFCM Accuracy = 96.51% Original images Homogeneous intensity corrupted by 5% Gaussian noise SFCM Accuracy = 99.34% SKFCM Accuracy = 100.00% Image: Dao-Qiang Zhang, Song-Can Chen

FCM applied to segmentation FCM Accuracy = 94.41% KFCM Accuracy = 91.11% Original images Sinusoidal inhomogeneous intensity corrupted by 5% Gaussian noise SFCM Accuracy = 98.41% SKFCM Accuracy = 99.88% Image: Dao-Qiang Zhang, Song-Can Chen

FCM applied to segmentation FCM result KFCM result Original MR image corrupted by 5% Gaussian noise SFCM result SKFCM result Back Image: Dao-Qiang Zhang, Song-Can Chen

Graph Theory-Based Use graph theory to solve the clustering problem Graph terminology Adjacency matrix Degree Volume Cuts Slide credit: Jianbo Shi


Problem with min. cuts Minimum cut criteria favors cutting small sets of isolated nodes in the graph Not surprising since the cut increases with the number of edges going across the two partitioned parts Image: Jianbo Shi and Jitendra Malik


Algorithm Given an image, set up a weighted graph and set the weight on the edge connecting two nodes to be a measure of the similarity between the two nodes Solve for the eigenvector with the second smallest eigenvalue Use the second smallest eigenvector to bipartition the graph Decide if the current partition should be subdivided and recursively repartition the segmented parts if necessary
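
A minimal NumPy sketch of one bipartition step (the first three steps above) on a hypothetical 6-node graph; a full implementation would also handle the recursive final step:

```python
import numpy as np

def ncut_bipartition(W):
    """Solve (D - W) y = lambda * D * y via the symmetric normalized
    Laplacian, take the eigenvector of the second smallest eigenvalue,
    and bipartition the graph by its sign."""
    deg = W.sum(axis=1)
    D = np.diag(deg)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = Dinv_sqrt @ (D - W) @ Dinv_sqrt       # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)            # eigenvalues ascending
    y = Dinv_sqrt @ vecs[:, 1]                # second smallest eigenvector
    return y > 0

# Two tight 3-node cliques joined by one weak edge (weights are made up)
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1                       # the edge the cut removes
part = ncut_bipartition(W)                    # nodes 0-2 vs. nodes 3-5
```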

Example (a) A noisy "step" image (b) eigenvector of the second smallest eigenvalue (c) resulting partition Image: Jianbo Shi and Jitendra Malik

Example (a) Point set generated by two Poisson processes (b) Partition of the point set

Example (a) Three image patches form a junction (b)-(d) Top three components of the partition Image: Jianbo Shi and Jitendra Malik

Image: Jianbo Shi and Jitendra Malik

Example Components of the partition with Ncut value less than 0.04 Image: Jianbo Shi and Jitendra Malik

Example Back Image: Jianbo Shi and Jitendra Malik
