Soft weight networks for few-shot learning
Neural networks have achieved remarkable advances on complex tasks once thought to require higher cognitive ability. They can now accurately classify images, play sophisticated games, translate and speak naturally, and even mimic artistic styles. Unfortunately, neural networks still perform sub-par on tasks where humans excel, such as few-shot learning, where the objective is to classify a set of query samples given only a few support examples. Recent work on few-shot learning has shown that networks can learn pairwise similarity metrics via stochastic gradient descent when subsets of support and query samples are repeatedly and randomly drawn from the training set (known as episodic training). In this work, we propose a novel Soft Weight Network (SWN) that generates embedded features and soft weights to cross-compare every query-support pairing. Benchmarks against state-of-the-art few-shot learning algorithms show that SWNs perform favorably on Omniglot and miniImageNet.
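The episodic training procedure mentioned above can be sketched as follows; this is a minimal illustration of sampling one N-way, K-shot episode (function and parameter names are assumptions for illustration, not the authors' code):

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=5, rng=None):
    """Sample one few-shot episode from a labeled dataset.

    dataset: dict mapping class label -> list of samples.
    Returns (support, query), each a list of (sample, label) pairs.
    Structure here is illustrative; any real implementation would
    repeat this sampling many times during training.
    """
    rng = rng or random.Random()
    # Randomly pick N classes for this episode.
    classes = rng.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        # Draw K support samples and Q query samples without overlap.
        items = rng.sample(dataset[label], k_shot + q_queries)
        support += [(x, label) for x in items[:k_shot]]
        query += [(x, label) for x in items[k_shot:]]
    return support, query
```

During training, the network would then compare each of the `n_way * q_queries` query samples against the `n_way * k_shot` support samples, updating its similarity metric by gradient descent on the episode's classification loss.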
Bio: Gregory Ditzler (IEEE S’04–M’15–SM’18) received a BSc from the Pennsylvania College of Technology (2008), an MSc from Rowan University (2011), and a PhD from Drexel University (2015). He is currently an Assistant Professor in the Electrical & Computer Engineering Department at the University of Arizona. He received an Outstanding Article Award from the IEEE Computational Intelligence Society Magazine (2018), the Best Paper Award at the IEEE International Conference on Cloud and Autonomic Computing (2017), the Best Student Paper Award at the IEEE/INNS International Joint Conference on Neural Networks (2014), a Nihat Bilgutay Fellowship (2013), the Koerner Family Engineering Fellowship (2014), Drexel University’s Office of Graduate Studies Research Excellence Award (2015), and Rowan University’s Research Achievement Award (2009). In 2016 and 2018, he was selected as a summer faculty fellow at the Air Force Research Lab to work on adversarial learning algorithms. He is a senior member of the IEEE. His research has been supported by the AFRL, ARO, DOE, and ONR.