FDS Seminar: Priya Panda

Weekly Seminar
Event time: 
Wednesday, November 9, 2022 - 4:00pm
Location: 
DL220
10 Hillhouse Avenue
New Haven, CT 06519
Event description: 


“Exploring Robustness and Energy-Efficiency in Neural Systems with Spike-based Machine Intelligence”  

Speaker: Priya Panda
Assistant Professor, Electrical Engineering, Yale University

Abstract: Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning due to their large energy-efficiency benefits on neuromorphic hardware. In this presentation, I will talk about important techniques for training SNNs that bring large benefits in terms of latency, accuracy, interpretability, and robustness. We will first delve into how training is performed in SNNs. Training SNNs with surrogate gradients offers computational benefits due to short latency. However, because spiking neurons are non-differentiable, training becomes problematic, and surrogate methods have thus been limited to shallow networks. To address this training issue with surrogate gradients, we will go over a recently proposed method, Batch Normalization Through Time (BNTT), that allows us to train SNNs from scratch with very low latency and enables us to target interesting applications like video segmentation, as well as scenarios beyond traditional learning, like federated training. Another critical limitation of SNNs is the lack of interpretability. While considerable attention has been given to optimizing SNNs, the development of explainability is still in its infancy. I will talk about our recent work on a bio-plausible visualization tool for SNNs, called Spike Activation Map (SAM), which is compatible with BNTT training. The proposed SAM highlights spikes with short inter-spike intervals, which carry discriminative information for classification. With the proposed BNTT and SAM, I will then highlight the robustness of SNNs with respect to adversarial attacks. I will also talk about interesting prospects of SNNs for non-conventional learning scenarios such as privacy-preserving distributed learning, as well as unraveling the temporal correlations in SNNs with feedback connections. Finally, time permitting, I will talk about the prospects of SNNs for novel and emerging compute-in-memory hardware that can potentially yield order-of-magnitude lower power consumption than conventional CPUs/GPUs.
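
For readers unfamiliar with the mechanics the abstract alludes to, below is a minimal, hypothetical PyTorch sketch of surrogate-gradient training combined with BNTT-style per-time-step batch normalization. The class names, the fast-sigmoid surrogate, and the threshold/leak values are illustrative assumptions, not code from the speaker's work.

    import torch
    import torch.nn as nn

    class SpikeFn(torch.autograd.Function):
        # Forward: non-differentiable Heaviside step (spike if membrane > 0).
        # Backward: a smooth fast-sigmoid surrogate gradient stands in for
        # the undefined derivative of the step function (an assumption; other
        # surrogate shapes are common).
        @staticmethod
        def forward(ctx, membrane):
            ctx.save_for_backward(membrane)
            return (membrane > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (membrane,) = ctx.saved_tensors
            surrogate = 1.0 / (1.0 + 10.0 * membrane.abs()) ** 2
            return grad_output * surrogate

    class BNTTLinear(nn.Module):
        # Leaky integrate-and-fire layer with a separate BatchNorm1d per
        # time step -- the core idea of Batch Normalization Through Time.
        def __init__(self, in_features, out_features, timesteps, leak=0.9):
            super().__init__()
            self.fc = nn.Linear(in_features, out_features, bias=False)
            self.bntt = nn.ModuleList(
                nn.BatchNorm1d(out_features) for _ in range(timesteps))
            self.leak = leak

        def forward(self, x_seq):  # x_seq: [timesteps, batch, in_features]
            mem = x_seq.new_zeros(x_seq.size(1), self.fc.out_features)
            spikes = []
            for t, x_t in enumerate(x_seq):
                mem = self.leak * mem + self.bntt[t](self.fc(x_t))
                spk = SpikeFn.apply(mem - 1.0)   # fire when membrane crosses 1.0
                mem = mem * (1.0 - spk)          # hard reset of fired neurons
                spikes.append(spk)
            return torch.stack(spikes)           # [timesteps, batch, out_features]

Keeping separate normalization statistics for each time step lets the network learn a different spike distribution at every step, which is what allows training from scratch at low latency; a usage example would be layer = BNTTLinear(784, 256, timesteps=20) applied to an input of shape [20, batch, 784].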

Bio: Priya Panda is an assistant professor in the Electrical Engineering department at Yale University, USA. She received her B.E. and Master's degrees from BITS Pilani, India, in 2013 and her PhD from Purdue University, USA, in 2019. During her PhD, she interned at Intel Labs, where she developed large-scale spiking neural network algorithms for benchmarking the Loihi chip. She is the recipient of the 2019 Amazon Research Award, the 2022 Google Research Scholar Award, and the 2022 DARPA Riser Award. Her research interests include neuromorphic computing, energy-efficient accelerators, and in-memory processing, among others.

In-person talk, but remote access available here: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=f20086eb-012d-4ead-a949-af1c01268d6d

