Commit a958b30c00d6b9f47262aad1b77a9646b0255b22

  • Kimmo Riihiaho <kimmo.riihiaho @gm…l.com> (Committer)
  • Tue May 15 20:35:42 EEST 2018
  • Kimmo Riihiaho <kimmo.riihiaho @gm…l.com> (Author)
  • Tue May 15 20:35:42 EEST 2018
Axes for the plots ("Kuvaajiin akselit")
DataMiningProject/Seminar_presentation.pptx
(369 / 385)
Binary files differ
DataMiningProject/t1.m
(10 additions / 10 deletions)
  
@@ -13,9 +13,16 @@
 
 % ### Tune these and play around ###
 
+% First feature selection will drop pixels which
+% have less than startDropVar variance.
+startDropVar = 0;
+% Last feature selection will drop pixels which
+% have less than endDropVar variance.
+% USE MAX 40! Otherwise will crash.
+endDropVar = 40;
 % How many times to split the interignoreVariance i.e. how many times
 % we'll run the loop. Use small number on slow machine (like 4)!
-droppingSteps = 4;
+droppingSteps = 20;
 % Crossvalidation repetition i.e. how many times we'll
 % crossvalidate our classifiers with the same variance treshold
 crossValRep = 3;
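Note on this hunk: besides hoisting the tuning block to the top of the script, it changes the sweep itself. With startDropVar = 0, endDropVar = 40 and droppingSteps = 20, the stepLength computed in the next hunk works out to (40 - 0) / 20 = 2, so (assuming the loop steps from startDropVar in increments of stepLength) the variance threshold now runs 0, 2, 4, ..., 40 instead of the previous 20, 25, ..., 40.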
@@ -41,13 +41,6 @@
 Y = num2str(gnd - 1);
 % Rename feature matrix to X
 X = fea;
-% First feature selection will drop pixels which
-% have less than startDropVar variance.
-startDropVar = 20;
-% Last feature selection will drop pixels which
-% have less than endDropVar variance.
-% USE MAX 40! Otherwise will crash.
-endDropVar = 40;
 % Length of a single step.
 stepLength = (endDropVar - startDropVar) / droppingSteps;
 % X will contain the number of features used in training.
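For orientation, here is a minimal sketch of the sweep these parameters appear to drive; the loop itself lies outside the hunks shown, so the loop header and the 0-based counter are assumptions inferred from the i+1 indexing visible further down:

    % Sketch only: the real loop body is not part of this diff.
    stepLength = (endDropVar - startDropVar) / droppingSteps;
    for i = 0:droppingSteps
        % Variance threshold used for this step of the sweep.
        ignoreVariance = startDropVar + i * stepLength;
        % ... feature selection, training and timing happen here ...
    end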
@@ -62,7 +62,7 @@
 % Feature-selected test data
  fsX = fsVar(fea,ignoreVariance);
  usedFeatures = size(fsX, 2);
-
+ disp(['Used ' ,num2str(usedFeatures), ' features' ])
 % ### Desicion tree begin ###
  if useDesicionTree
 % Collect all losses to a vector.
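fsVar is a project-local helper that this commit does not touch; judging from the call above and the surrounding comments, it presumably keeps only the pixel columns whose variance exceeds the threshold. A minimal sketch under that assumption (not the project's actual implementation):

    function fsX = fsVar(X, ignoreVariance)
    % Assumed behavior: drop feature columns (pixels) whose
    % variance is at or below the given threshold.
        keep = var(X) > ignoreVariance;   % per-column variance
        fsX = X(:, keep);
    end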
@@ -92,8 +92,8 @@
 % disp(['Training KNN took ', num2str(toc), ' usedFeatures' ])
  knnTrainTimes(h) = toc;
  knnLosses(h) = kfoldLoss(cvKnn);
- tKnn(1,i+1) = mean(knnTrainTimes);
  end
+ tKnn(1,i+1) = mean(knnTrainTimes);
 % xKnn(1,i+1) = usedFeatures;
 yKnn(1,i+1) = mean(knnLosses);
 end
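The last hunk fixes the bookkeeping for KNN training times: tKnn(1,i+1) = mean(knnTrainTimes) previously ran inside the cross-validation repetition loop, redundantly recomputing the mean over a partially filled knnTrainTimes on every pass; moving it after the end records it once, after all crossValRep repetitions have finished, mirroring how yKnn is filled from knnLosses. The resulting structure, with the loop header assumed from crossValRep:

    for h = 1:crossValRep
        % ... train and cross-validate the KNN classifier, then record ...
        knnTrainTimes(h) = toc;
        knnLosses(h) = kfoldLoss(cvKnn);
    end
    tKnn(1,i+1) = mean(knnTrainTimes);   % now computed once per variance step
    yKnn(1,i+1) = mean(knnLosses);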