Clustering Techniques and Applications to Image Segmentation

Presentation Transcript

Slide 1

Clustering Techniques and Applications to Image Segmentation. Liang Shan, shan@cs.unc.edu

Slide 2

Roadmap: Unsupervised learning; Clustering categories; Clustering algorithms: K-means, Fuzzy c-means, Kernel-based, Graph-based; Q&A

Slide 3

Unsupervised learning. Definition 1: Supervised: human effort involved; Unsupervised: no human effort. Definition 2: Supervised: learning the conditional distribution P(Y|X), X: features, Y: classes; Unsupervised: learning the distribution P(X), X: features. Slide credit: Min Zhang

Slide 4

Clustering. What is clustering?

Slide 5

Clustering. Definition: the assignment of a set of observations into subsets so that observations in the same subset are similar in some sense.

Slide 6

Clustering: Hard vs. Soft. Hard: the same object can only belong to a single cluster. Soft: the same object can belong to multiple clusters. Slide credit: Min Zhang

Slide 7

Clustering: Hard vs. Soft. Hard: the same object can only belong to a single cluster. Soft: the same object can belong to multiple clusters, e.g., a Gaussian mixture model. Slide credit: Min Zhang

Slide 8

Clustering: Flat vs. Hierarchical. Flat: clusters are flat. Hierarchical: clusters form a tree (agglomerative or divisive).

Slide 9

Hierarchical clustering: Agglomerative (Bottom-up). Compute all pairwise pattern-pattern similarity coefficients. Place each of the n patterns into a class of its own. Merge the two most similar clusters into one. Replace the two clusters with the new cluster. Re-compute inter-cluster similarity scores w.r.t. the new cluster. Repeat the above steps until there are k clusters left (k can be 1). Slide credit: Min Zhang
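
For concreteness, here is a minimal Python sketch of these steps (not from the slides; single-link distance is assumed as the similarity criterion):

```python
import numpy as np

def agglomerative(X, k):
    """Single-link agglomerative clustering: merge the two closest
    clusters until only k remain. X is an (n, d) array of patterns."""
    clusters = [[i] for i in range(len(X))]          # each pattern starts in its own class
    # pairwise pattern-pattern distances (smaller distance = more similar)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    while len(clusters) > k:
        # find the pair of clusters with the smallest single-link distance
        best, pair = np.inf, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = D[np.ix_(clusters[a], clusters[b])].min()
                if d < best:
                    best, pair = d, (a, b)
        a, b = pair
        clusters[a] += clusters[b]                   # merge the two most similar clusters
        del clusters[b]                              # replace them with the new cluster
    return clusters

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (5, 2)), rng.normal(3, 0.3, (5, 2))])
    print(agglomerative(X, k=2))
```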

Slide 10

Hierarchical clustering: Agglomerative (Bottom-up)

Slide 11

Hierarchical clustering: Agglomerative (Bottom-up), 1st iteration

Slide 12

Hierarchical clustering: Agglomerative (Bottom-up), 2nd iteration

Slide 13

Hierarchical clustering: Agglomerative (Bottom-up), 3rd iteration

Slide 14

Hierarchical clustering: Agglomerative (Bottom-up), 4th iteration

Slide 15

Hierarchical clustering: Agglomerative (Bottom-up), 5th iteration

Slide 16

Hierarchical clustering: Agglomerative (Bottom-up). Finally, k clusters are left.

Slide 17

Hierarchical clustering: Divisive (Top-down). Start at the top with all patterns in one cluster. The cluster is split using a flat clustering algorithm. This procedure is applied recursively until each pattern is in its own singleton cluster.
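
A minimal sketch of this top-down strategy, using 2-means as the flat-clustering subroutine (the bisecting rule of always splitting the largest cluster is an assumption, as is the use of scikit-learn's KMeans):

```python
import numpy as np
from sklearn.cluster import KMeans

def divisive(X, k):
    """Top-down clustering: start with all patterns in one cluster and
    recursively split with a flat algorithm (here, 2-means) until k clusters."""
    clusters = [np.arange(len(X))]                   # all patterns in one cluster
    while len(clusters) < k:
        # split the largest current cluster with the flat subroutine
        # (assumes that cluster still has at least 2 patterns)
        idx = max(range(len(clusters)), key=lambda i: len(clusters[i]))
        members = clusters.pop(idx)
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(X[members])
        clusters.append(members[labels == 0])
        clusters.append(members[labels == 1])
    return clusters
```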

Slide 18

Hierarchical clustering: Divisive (Top-down). Slide credit: Min Zhang

Slide 19

Bottom-up vs. Top-down. Which one is more complex? Which one is more efficient? Which one is more accurate?

Slide 20

Bottom-up vs. Top-down. Which one is more complex? Top-down, because a flat clustering is required as a "subroutine". Which one is more efficient? Which one is more accurate?

Slide 21

Bottom-up vs. Top-down. Which one is more complex? Which one is more efficient? Which one is more accurate?

Slide 22

Bottom-up vs. Top-down. Which one is more complex? Which one is more efficient? Top-down: for a fixed number of top levels, using an efficient flat algorithm like K-means, divisive algorithms are linear in the number of patterns and clusters, whereas agglomerative algorithms are at least quadratic. Which one is more accurate?

Slide 23

Bottom-up vs. Top-down. Which one is more complex? Which one is more efficient? Which one is more accurate?

Slide 24

Bottom-up vs. Top-down. Which one is more complex? Which one is more efficient? Which one is more accurate? Top-down: bottom-up methods make clustering decisions based on local patterns without initially considering the global distribution, and these early decisions cannot be undone. Top-down clustering benefits from complete information about the global distribution when making top-level partitioning decisions.

Slide 25

K-means. Data set: $X = \{x_1, \dots, x_n\}$. Clusters: $C_1, \dots, C_c$. Codebook: $V = \{v_1, \dots, v_c\}$. Partition matrix: $U = [u_{ik}]$ with $u_{ik} \in \{0, 1\}$. Minimizes the functional $J(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{n} u_{ik} \|x_k - v_i\|^2$. Iterative algorithm: initialize the codebook $V$ with vectors randomly picked from $X$; assign each pattern to the nearest cluster; recalculate the partition matrix; repeat the above two steps until convergence.
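
A minimal NumPy sketch of this loop (an illustration, not the presenter's code; the random-pick initialization and the change-in-assignment convergence test follow the slide's description):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain K-means: initialize the codebook from X, then alternate
    nearest-cluster assignment and codebook recomputation."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        # assign each pattern to the nearest codebook vector
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        if np.array_equal(new_labels, labels):       # converged
            break
        labels = new_labels
        # recalculate the codebook from the current partition
        for i in range(k):
            if np.any(labels == i):
                V[i] = X[labels == i].mean(axis=0)
    return labels, V
```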

Slide 26

K-means. Disadvantages: dependent on initialization.

Slide 27

K-means. Disadvantages: dependent on initialization.

Slide 28

K-means. Disadvantages: dependent on initialization.

Slide 29

K-means. Disadvantages: dependent on initialization. Fixes: select random seeds with at least a minimum pairwise distance D_min, or run the algorithm many times.
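
A sketch of the run-it-many-times fix, keeping the run with the smallest objective (assumes the `kmeans` function from the earlier sketch; the restart count is arbitrary):

```python
import numpy as np

def kmeans_restarts(X, k, restarts=10):
    """Run K-means several times from different random seeds and keep
    the solution with the smallest within-cluster sum of squares."""
    best_J, best = np.inf, None
    for seed in range(restarts):
        labels, V = kmeans(X, k, seed=seed)
        J = sum(np.sum((X[labels == i] - V[i]) ** 2) for i in range(k))
        if J < best_J:
            best_J, best = J, (labels, V)
    return best
```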

Slide 30

K-means. Disadvantages: dependent on initialization; sensitive to outliers.

Slide 31

K-means. Disadvantages: dependent on initialization; sensitive to outliers. Fix: use K-medoids.
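
A minimal sketch of the K-medoids idea (alternating assignment and medoid update; this is a simplification of PAM, not code from the slides):

```python
import numpy as np

def kmedoids(X, k, n_iter=100, seed=0):
    """Simple alternating K-medoids: like K-means, but each cluster
    center is constrained to be an actual data point (a medoid),
    which makes the result less sensitive to outliers."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        labels = D[:, medoids].argmin(axis=1)        # assign to the nearest medoid
        new_medoids = medoids.copy()
        for i in range(k):
            members = np.where(labels == i)[0]
            if len(members):
                # pick the member minimizing total distance to its own cluster
                new_medoids[i] = members[D[np.ix_(members, members)].sum(axis=1).argmin()]
        if np.array_equal(new_medoids, medoids):     # converged
            break
        medoids = new_medoids
    return labels, medoids
```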

Slide 32

K-means. Disadvantages: dependent on initialization; sensitive to outliers (K-medoids); can deal only with clusters with a spherical, symmetrical point distribution. Fix: the kernel trick.

Slide 33

K-means. Disadvantages: dependent on initialization; sensitive to outliers (K-medoids); can deal only with clusters with a spherical, symmetrical point distribution; deciding K.

Slide 34

Deciding K: try a couple of values of K. Image: Henry Lin

Slide 35

Deciding K: when k = 1, the objective function value is 873.0. Image: Henry Lin

Slide 36

Deciding K: when k = 2, the objective function value is 173.1. Image: Henry Lin

Slide 37

Deciding K: when k = 3, the objective function value is 133.6. Image: Henry Lin

Slide 38

Deciding K: we can plot the objective function values for k = 1 to 6. The abrupt change at k = 2 is highly suggestive of two clusters ("knee finding" or "elbow finding"). Note that the results are not always as clear-cut as in this toy example. Image: Henry Lin
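
A sketch of this elbow/knee-finding procedure (plotting choices are my own; reuses the `kmeans` sketch from above):

```python
import numpy as np
import matplotlib.pyplot as plt

def elbow_plot(X, k_max=6):
    """Plot the K-means objective for k = 1..k_max; a sharp knee in
    the curve suggests the natural number of clusters."""
    Js = []
    for k in range(1, k_max + 1):
        labels, V = kmeans(X, k, seed=0)
        Js.append(sum(np.sum((X[labels == i] - V[i]) ** 2) for i in range(k)))
    plt.plot(range(1, k_max + 1), Js, marker="o")
    plt.xlabel("k")
    plt.ylabel("objective J")
    plt.show()
```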

Slide 39

Fuzzy C-means. Data set: $X = \{x_1, \dots, x_n\}$. Clusters: $C_1, \dots, C_c$. Codebook: $V = \{v_1, \dots, v_c\}$. Partition matrix: $U = [u_{ik}]$, now with $u_{ik} \in [0, 1]$ (soft clustering, in contrast to hard K-means). Minimize the functional $J_m(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{n} u_{ik}^m \|x_k - v_i\|^2$, where $U$ is a fuzzy partition matrix and $m$ is the fuzzification parameter, typically set to 2.

Slide 40

Fuzzy C-means. Minimize $J_m(U, V)$ subject to $\sum_{i=1}^{c} u_{ik} = 1$ for every pattern $x_k$.

Slide 41

Fuzzy C-means. Minimize $J_m(U, V)$ subject to $\sum_{i=1}^{c} u_{ik} = 1$. How to solve this constrained optimization problem?

Slide 42

Fuzzy C-means. Minimize $J_m(U, V)$ subject to $\sum_{i=1}^{c} u_{ik} = 1$. How to solve this constrained optimization problem? Introduce Lagrange multipliers.

Slide 43

Fuzzy c-means. Introduce Lagrange multipliers. Iterative optimization: fix $V$, optimize w.r.t. $U$; fix $U$, optimize w.r.t. $V$.
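
Setting the Lagrangian's derivatives to zero gives the standard closed-form updates $u_{ik} = 1 \big/ \sum_{j=1}^{c} (\|x_k - v_i\| / \|x_k - v_j\|)^{2/(m-1)}$ and $v_i = \sum_k u_{ik}^m x_k \big/ \sum_k u_{ik}^m$. A minimal NumPy sketch of this alternation (not the presenter's code; the tolerance-based stopping rule is an assumption):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Fuzzy c-means: alternate the closed-form U and V updates
    obtained from the Lagrangian of the constrained problem."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                               # memberships sum to 1 per pattern
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)   # fix U, optimize V
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1))                # fix V, optimize U
        U_new /= U_new.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, V
```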

Slide 44

Application to image segmentation. Original images and segmentations: homogeneous intensity corrupted by 5% Gaussian noise, accuracy = 96.02%; sinusoidal inhomogeneous intensity corrupted by 5% Gaussian noise, accuracy = 94.41%. Image: Dao-Qiang Zhang, Song-Can Chen

Slide 45

Kernel substitution trick. Kernel K-means; kernel fuzzy c-means.

Slide 46

Kernel substitution trick. Kernel fuzzy c-means: confine ourselves to the Gaussian RBF kernel, and introduce a penalty term containing neighborhood information. Equation: Dao-Qiang Zhang, Song-Can Chen
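
The transcript omits the equation image. As a hedged reconstruction from the KFCM/SKFCM literature (this is how the objective is commonly written, not a verbatim copy of the slide; the symbols $N_k$, $N_R$, and $\alpha$ are defined on the next slide):

```latex
% Gaussian RBF kernel: K(x, v) = \exp(-\|x - v\|^2 / \sigma^2), so K(x, x) = 1 and
% the feature-space distance \|\phi(x) - \phi(v)\|^2 reduces to 2(1 - K(x, v)).
J_m(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{n} u_{ik}^m \bigl(1 - K(x_k, v_i)\bigr)
          + \frac{\alpha}{N_R} \sum_{i=1}^{c} \sum_{k=1}^{n} u_{ik}^m
            \sum_{r \in N_k} (1 - u_{ir})^m
```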

Slide 47

Spatially constrained KFCM. $N_k$: the set of neighbors that exist in a window around $x_k$; $N_R$: the cardinality of $N_k$. $\alpha$ controls the effect of the penalty term. The penalty term is minimized when the membership value for $x_j$ is large and is also large at the neighboring pixels, and vice versa. Equation: Dao-Qiang Zhang, Song-Can Chen

Slide 48

FCM applied to segmentation. Original images: homogeneous intensity corrupted by 5% Gaussian noise. FCM accuracy = 96.02%; KFCM accuracy = 96.51%; SFCM accuracy = 99.34%; SKFCM accuracy = 100.00%. Image: Dao-Qiang Zhang, Song-Can Chen

Slide 49

FCM applied to segmentation. Original images: sinusoidal inhomogeneous intensity corrupted by 5% Gaussian noise. FCM accuracy = 94.41%; KFCM accuracy = 91.11%; SFCM accuracy = 98.41%; SKFCM accuracy = 99.88%. Image: Dao-Qiang Zhang, Song-Can Chen

Slide 50

FCM applied to segmentation. Original MR image corrupted by 5% Gaussian noise: FCM result, KFCM result, SFCM result, SKFCM result. Image: Dao-Qiang Zhang, Song-Can Chen

Slide 51

Graph-theory-based clustering. Use graph theory to solve the clustering problem. Graph terminology: adjacency matrix, degree, volume, cuts. Slide credit: Jianbo Shi
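
The defining figures live on the slide images; for reference, the standard definitions of these terms (added here, not transcribed from the slides) are:

```latex
% For a weighted graph G = (V, E) with edge weights w(i, j):
% Adjacency matrix:  W_{ij} = w(i, j)
% Degree:            d(i) = \sum_j W_{ij}
% Volume:            \mathrm{vol}(A) = \sum_{i \in A} d(i), \quad A \subseteq V
% Cut:               \mathrm{cut}(A, B) = \sum_{i \in A,\, j \in B} W_{ij}
```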

Slide 52

Slide credit: Jianbo Shi

Slide 53

Slide credit: Jianbo Shi

Slide 54

Slide credit: Jianbo Shi

Slide 55

Slide credit: Jianbo Shi

Slide 56

Problem with minimum cuts. The minimum-cut criterion favors cutting small sets of isolated nodes in the graph. This is not surprising, since the cut increases with the number of edges going across the two partitioned parts. Image: Jianbo Shi and Jitendra Malik
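
The criterion Shi and Malik propose to fix this, the normalized cut, divides each cut by the total connection from that part to all nodes; its standard definition (added for reference, not transcribed from the slide images):

```latex
\mathrm{Ncut}(A, B) = \frac{\mathrm{cut}(A, B)}{\mathrm{assoc}(A, V)}
                    + \frac{\mathrm{cut}(A, B)}{\mathrm{assoc}(B, V)},
\qquad \mathrm{assoc}(A, V) = \sum_{i \in A,\, j \in V} W_{ij}
```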

Slide 57

Slide credit: Jianbo Shi

Slide 58

Slide credit: Jianbo Shi

Slide 59

Algorithm. Given an image, set up a weighted graph and set the weight on the edge connecting two nodes to a measure of the similarity between the two nodes. Solve for the eigenvector with the second smallest eigenvalue. Use the second smallest eigenvector to bipartition the graph. Decide whether the current partition should be subdivided, and recursively repartition the segmented parts if necessary.
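
A minimal sketch of the bipartition step, using the generalized eigensystem $(D - W)y = \lambda D y$ from Shi and Malik (thresholding the eigenvector at 0 is a simplification; the paper also considers the median and the best-Ncut splitting point):

```python
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    """Bipartition a graph from its weight matrix W using the
    eigenvector of the second smallest eigenvalue of the
    generalized system (D - W) y = lambda * D y."""
    d = W.sum(axis=1)
    D = np.diag(d)
    # generalized eigenproblem; eigh returns eigenvalues in ascending order
    vals, vecs = eigh(D - W, D)
    y = vecs[:, 1]                                   # second smallest eigenvector
    return y > 0                                     # threshold at 0 to split the nodes

if __name__ == "__main__":
    # toy graph: two tight triangles joined by one weak edge
    W = np.array([[0.0, 1.0, 1.0, 0.1, 0.0, 0.0],
                  [1.0, 0.0, 1.0, 0.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0, 0.0, 0.0, 0.0],
                  [0.1, 0.0, 0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0, 1.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0, 1.0, 1.0, 0.0]])
    print(ncut_bipartition(W))
```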

Slide 60

Example. (a) A noisy "step" image. (b) Eigenvector of the second smallest eigenvalue. (c) Resulting partition. Image: Jianbo Shi and Jitendra Malik

Slide 61

Example. (a) Point set generated by two Poisson processes. (b) Partition of the point set.

Slide 62

Example. (a) Three image patches form a junction. (b)-(d) The top three components of the partition. Image: Jianbo Shi and Jitendra Malik

Slide 63

Image: Jianbo Shi and Jitendra Malik

Slide 64

Example. Components of the partition with Ncut value less than 0.04. Image: Jianbo Shi and Jitendra Malik

Slide 65

Example. Image: Jianbo Shi and Jitendra Malik
