5/15/17 | 1
Since 1614
Scale selection in Convolutional Neural Networks using Multi-Scale Filters and Dimensional MinMax Pooling
Gereon Vienken Sander Bohte Max Losch Steven Scholte
Overview
› Project description
› Results
› Lessons learned
CNN
https://devblogs.nvidia.com/parallelforall/deep-learning-nutshell-core-concepts/
Pooling
›Filters
http://vaaaaaanquish.hatenablog.com/entry/2015/01/26/060622
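Pooling summarises a filter's response map by local extrema. As a minimal sketch (plain NumPy, not the project's code), non-overlapping max pooling over a 2-D feature map looks like:

```python
import numpy as np

def max_pool2d(x, size=2, stride=2):
    """Max pooling over windows of a 2-D feature map."""
    h, w = x.shape
    out_h = (h - size) // stride + 1
    out_w = (w - size) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = x[i * stride:i * stride + size,
                       j * stride:j * stride + size]
            out[i, j] = window.max()  # keep the strongest response per window
    return out

fmap = np.array([[1, 3, 2, 0],
                 [4, 2, 1, 1],
                 [0, 1, 5, 2],
                 [2, 0, 1, 3]], dtype=float)
print(max_pool2d(fmap))  # -> [[4. 2.] [2. 5.]]
```

Swapping `window.max()` for `window.min()` gives min pooling, the other half of the MinMax operator discussed on the following slides.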
MinMax
› Select the optimal filter scale
› Smallest filter with relevant information
› Threshold on the value distribution
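The selection rule above can be sketched as: walk the filter scales from smallest to largest and keep the first one whose responses carry relevant information. The function name, the threshold value, and the "max absolute response" relevance test are illustrative assumptions, not the original implementation:

```python
import numpy as np

def select_scale(responses, threshold=0.5):
    """Pick the smallest scale with relevant information.

    `responses` is a list of per-scale response maps, ordered
    small -> large; a scale counts as relevant if its strongest
    absolute response clears `threshold` (assumed criterion).
    """
    for scale, r in enumerate(responses):
        if np.abs(r).max() >= threshold:
            return scale  # smallest scale that passes the threshold
    return len(responses) - 1  # fall back to the largest scale

# toy example: three scales, the second is the first strong one
resp = [np.full((4, 4), 0.1),
        np.full((4, 4), 0.9),
        np.full((4, 4), 0.7)]
print(select_scale(resp))  # -> 1
```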
MinMax
› Original implementation by Bohte and Gehreab
› Shows that Min approximates the optimum
› Normalized input
› Pre-implemented absolute Gabor filters
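For illustration, a multi-scale bank of absolute Gabor filters could be built as below; the kernel parameterisation (`sigma`, `theta`, `wavelength`, kernel size) is an assumption for the sketch, since the slide only says the filters were pre-implemented:

```python
import numpy as np

def gabor_kernel(sigma, theta, wavelength, size):
    """One absolute-valued Gabor kernel: Gaussian envelope times a
    cosine carrier, rotated by `theta`, with the sign discarded."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return np.abs(envelope * carrier)

# a small bank: one orientation, three scales (kernel grows with sigma)
bank = [gabor_kernel(sigma=s, theta=0.0, wavelength=4 * s, size=4 * int(s) + 1)
        for s in (1.0, 2.0, 3.0)]
print([k.shape for k in bank])  # -> [(5, 5), (9, 9), (13, 13)]
```

Convolving an input with each kernel in the bank yields the per-scale response maps that the MinMax pooling step selects between.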
Implementation
Results
Results
Conclusion
› Slight overfitting of Min in the end
› Min performs slightly better than Max
› Threshold not as important
Lessons learned
› Start on a dataset with smaller images
  › Runtime: 3 days -> 5 hours
› Make sure you understand the restrictions of the underlying research
  › Normalised images
› If you use different machines, make sure you understand their influence
  › Overfitting occurred on the GPU cluster and the weak GPU, but not on the TitanX
› Be aware: Drucker's Law!
  › "If one thing goes wrong, everything else will, and at the same time."
Questions
http://www.consciousentities.com/2014/02/the-hard-problem-problem/