Welcome to the BRAIN Systems Lab at NTU!

 Bio-inspired Reconfigurable Analog INtegrated (BRAIN) Systems Lab  
Dr. Arindam Basu
Associate Professor
Division of Circuits and Systems
School of Electrical and Electronic Engineering
Nanyang Technological University
Office: S2-B2C-84, 50 Nanyang Avenue, Singapore 639798.

Email: arindam.basu@ntu.edu.sg

IEEE Distinguished Lecturer for Circuits and Systems Society, 2016-17.
Associate Editor: IEEE Transactions on Biomedical Circuits and Systems, IEEE Sensors Journal, Frontiers in Neuroscience
Guest Associate Editor: Special Issue in IEEE Transactions on Biomedical Circuits and Systems on selected papers from (a) ISCAS 2015 and (b) BioCAS 2015
IEEE CASS Technical Committee Member: Biomedical Circuits and Systems, Neural Systems and Applications, Sensory Systems

NEWS: 1 paper accepted in Nature Communications on using Optogenetics-inspired weight transfer for in-memory computing based deep recurrent networks. Congratulations Rohit & Jyoti.

NEWS: 1 paper accepted in Nature Communications on memristive robotic nervous systems with three levels of robustness: pain-receptor-guided reflex, self-healing materials, and associative-learning-based distributed synapses within the peripheral nervous system. Congratulations Rohit.

NEWS: Our survey of ML accelerators accompanying our ASILOMAR 2019 paper is available online. See here.

Research Topics

Our Lab conducts highly interdisciplinary research spanning Analog/Mixed-signal IC Design, Computational Neuroscience, Reconfigurable Systems and Non-linear Dynamics. Our current work focuses on two areas: (a) Low-power Neuro-inspired or Neuromorphic circuits and algorithms for Machine Learning and (b) Low-power circuits and systems for Neural Interfacing. We are also interested in commercializing these innovations. Some specific topics of interest are:
1. Low-power Neuro-inspired or Neuromorphic circuits and algorithms for Machine Learning/Pattern Recognition: With increasing amounts of data being collected from myriad sensors in the age of the Internet of Things, it is increasingly important to process that data as early as possible. Machine learning is therefore of prime interest, since it allows us to extract information or patterns from the data that lead to insights and potential actions. We take inspiration from one of the best-known low-power pattern recognizers in the world: our brain! We can easily recognize a known face amidst tens of faces in a video, a task that is still difficult for current computer vision algorithms. More importantly, we do it at power levels orders of magnitude lower than current GPUs/CPUs. Specific projects that we are currently working on include:
Computation using Mismatch: Process-variation-induced mismatch between transistors is a major threat to low-voltage, low-power processing in deep sub-micron CMOS. This is a problem faced by neurons in our brain as well. Inspired by this, we are developing machine learning systems that can utilize this mismatch to perform effective computation at much lower power than their digital counterparts. For example, we have developed microwatt machine learners based on the Extreme Learning Machine (ELM) algorithm and used them for detecting patterns in biomedical signals (e.g. spike sorting, seizure detection) as well as classifying images (e.g. handwritten digits) or speech commands (e.g. spoken digits passed through a neuromorphic cochlea). This work has potential applications in smart sensors, wearable devices and the Internet of Things. These chips have also been used at the NSF-sponsored Annual Neuromorphic Cognition Workshop at Telluride, Colorado.
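The appeal of the ELM algorithm for mismatch-based computing is that its first layer is *random and never trained*, so transistor mismatch can serve as the random weights for free. A minimal software sketch of the idea, using synthetic data and illustrative sizes (none of these numbers are from our chips):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 16 features, 2 classes (hypothetical stand-in
# for biomedical feature vectors, not an actual lab dataset).
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# ELM: fixed random hidden layer (on-chip, transistor mismatch plays
# the role of these random weights), then a trained linear readout.
n_hidden = 64
W_in = rng.normal(size=(16, n_hidden))   # random, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + b)                # hidden-layer activations

# Output weights via regularized least squares: one-shot training,
# only the second (digital) stage needs to be precise.
lam = 1e-3
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

pred = (H @ W_out > 0.5).astype(float)
accuracy = (pred == y).mean()
```

Since only the readout is trained, the analog first stage never needs calibration; its particular random realization is simply absorbed into W_out.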
Dendritic processing and Structural Plasticity: Most large-scale cortical simulations as well as ANN models have ignored the role of dendrites, reducing them to linear summers. However, neuroscientific experiments in the last decade provide ample evidence of nonlinear processing performed by dendrites. Also, most models of learning in the AI/neuroscience communities focus on changing weights. However, there is an alternate medium of learning, structural plasticity, by which our brains learn by forming and eliminating connections. We are developing machine learning systems that utilize nonlinear dendrites (NLD) and structural plasticity with 1-bit synapses. Using low-resolution synapses helps in designing robust analog learning chips and reduces memory storage. Since the learning operates by changing connections, it can be exploited using address event representation techniques in neuromorphic VLSI implementations without additional overhead.
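To make the idea concrete, here is a deliberately simplified sketch of a neuron with nonlinear dendrites and 1-bit synapses, where learning proceeds by swapping connections rather than tuning weights. The square branch nonlinearity, the greedy swap rule, the toy target and all sizes are illustrative assumptions, not our published learning rule:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each dendritic branch is wired to a small subset of input lines
# (connection present = weight 1), applies a square nonlinearity,
# and the soma sums the branch outputs.
n_inputs, n_branches, syn_per_branch = 32, 8, 4
conn = [rng.choice(n_inputs, syn_per_branch, replace=False)
        for _ in range(n_branches)]

def neuron_output(x, conn):
    # Branch output: square of the branch's binary-weighted input sum.
    return sum(x[c].sum() ** 2 for c in conn)

# Structural plasticity: learning changes *connections*, not weights.
# Greedy sketch: rewire one synapse to a random new line and keep the
# swap only if it reduces the error on a toy target.
x = rng.integers(0, 2, n_inputs).astype(float)
target = 30.0
init_err = abs(neuron_output(x, conn) - target)
err = init_err
for _ in range(200):
    b = rng.integers(n_branches)
    s = rng.integers(syn_per_branch)
    old = conn[b][s]
    conn[b][s] = rng.integers(n_inputs)     # candidate rewiring
    new_err = abs(neuron_output(x, conn) - target)
    if new_err < err:
        err = new_err                       # keep the swap
    else:
        conn[b][s] = old                    # revert
```

Because the state of each synapse is a single bit (connected or not), the whole connection table maps naturally onto address event representation routing in a neuromorphic chip.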
Learning Synapses with novel devices: The synapses in a neural network, both artificial and biological, outnumber neurons by a factor of 100-1000. Hence, it is of utmost importance to make the corresponding circuits compact and low-power. The problem is compounded by the fact that these devices have to exhibit learning through modification of their strengths and have to store this strength in a non-volatile fashion. To achieve this, we use flash or floating-gate memories as compact learning synapses and have demonstrated doublet and triplet STDP in these devices. We are also starting to explore the use of spintronic domain-wall memories as low-voltage learning synapses, overcoming the high write voltage required by tunneling floating-gates.
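For readers unfamiliar with STDP, a sketch of the basic pair-based (doublet) rule may help; the triplet extension adds a second, slower post-synaptic trace on top of this. The amplitudes and time constants below are illustrative textbook-style values, not measurements from our floating-gate devices:

```python
import math

# Pair-based STDP: the weight change depends on the timing difference
# between pre- and post-synaptic spikes. Constants are illustrative.
A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0   # time constants (ms)

def stdp_dw(dt):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt >= 0:    # pre before post: potentiate (causal pairing)
        return A_plus * math.exp(-dt / tau_plus)
    else:          # post before pre: depress (anti-causal pairing)
        return -A_minus * math.exp(dt / tau_minus)

# Causal pairing strengthens the synapse, anti-causal pairing weakens it,
# and the effect decays exponentially with the timing difference.
dw_causal = stdp_dw(10.0)
dw_anticausal = stdp_dw(-10.0)
```

In a floating-gate implementation, these exponential windows are realized physically by charge injection and tunneling dynamics rather than computed digitally.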
Dynamical systems guided Neuromorphic Design: To reduce the footprint of neuronal circuits exhibiting bio-realistic dynamics, we use tools from dynamical systems theory (bifurcation analysis, phase response curves, etc.) to simplify the differential equations to be implemented in silicon. We have used this method to design the world's lowest-power neuron exhibiting type I dynamics and the smallest central pattern generator for locomotion control.
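A simple illustration of what "type I dynamics" means: the quadratic integrate-and-fire (QIF) neuron is the normal form of a saddle-node bifurcation, and its firing rate grows continuously from zero as the input current crosses threshold. This toy simulation (not our silicon neuron's equations; parameters are arbitrary) shows that behavior:

```python
# Quadratic integrate-and-fire: dv/dt = v^2 + I, with reset on spike.
# The saddle-node bifurcation at I = 0 gives type I excitability:
# arbitrarily low firing rates just above threshold.
def qif_spike_times(I, T=500.0, dt=0.01, v_reset=-1.0, v_peak=10.0):
    v, t, spikes = v_reset, 0.0, []
    while t < T:
        v += dt * (v * v + I)   # forward-Euler step of dv/dt = v^2 + I
        if v >= v_peak:         # spike: record time and reset
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

n_low = len(qif_spike_times(0.01))   # just above the bifurcation: slow firing
n_high = len(qif_spike_times(1.0))   # well above: fast firing
```

Reducing a conductance-based model to a one-dimensional normal form like this is exactly the kind of simplification that shrinks the silicon implementation.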
2. Low-power circuits and systems for Neural Interfacing: Acquiring signals from the brain is extremely important for understanding brain function and providing potential cures for brain diseases and abnormal function (e.g. epilepsy, tremor). In general, interfacing with neurons can be broadly categorized into two classes: extracellular and intracellular electrophysiology. In extracellular methods, we focus on neural signal recording from implants in the brain. These systems need to record uV-level signals in the range of 1-100 Hz for LFP and 0.2-5 kHz for spikes, from hundreds of electrodes in parallel, while dissipating minimal power (to avoid tissue damage). This signal eventually needs to be digitized and transmitted off-chip wirelessly. We focus on novel, micropower signal acquisition and conditioning circuits/techniques that reduce the burden on the ADC and transmitter, increasing the scalability of such systems to thousands of channels in the future. Some topics are:
Neural Amplifiers: We have designed digitally assisted neural amplifiers that exploit the statistics of neural signals to provide dynamic range beyond the power supply of the chip.
Spike Detector: We have designed current-mode, sub-uW (lowest reported so far) neural spike detectors by approximating the Nonlinear Energy Operator (NEO). Current-mode design allows lowering the power supply and reusing derivatives of NEO for the feature extraction required by spike sorting.
Spike Sorting: We have designed uW-range machine learners based on the Extreme Learning Machine (ELM) for supervised spike sorting, which is similar to template matching.
Intention Decoding: We have used ELM-based circuits to decode motor intention from spike trains recorded from the motor cortex. Benchmarked against a software simulation of decoding individuated finger movements of monkeys, its performance was comparable. This sub-uW design is the world's first intention decoder with power low enough to be implanted. The chip also features an algorithm to improve performance when some neural channels lose information over time.
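The NEO at the heart of the spike detector is simple enough to show directly: psi[n] = x[n]^2 - x[n-1]*x[n+1], which is large only when the signal is simultaneously high in amplitude and frequency, as spikes are. A sketch on a synthetic trace (the injected transients, noise level, smoothing window and scaled-mean threshold are all illustrative choices, not the chip's parameters):

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 25000                              # 25 kHz sampling covers the 0.2-5 kHz spike band
t = np.arange(0, 0.1, 1 / fs)           # 100 ms trace
x = 0.02 * rng.normal(size=t.size)      # background noise
for t0 in (0.02, 0.05, 0.08):           # inject three spike-like transients
    n = np.arange(25)
    i = int(t0 * fs)
    x[i:i + 25] += 2.0 * np.exp(-n / 5.0) * np.sin(2 * np.pi * 3000 * t[i:i + 25])

# Nonlinear Energy Operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]
psi = x[1:-1] ** 2 - x[:-2] * x[2:]
psi_s = np.convolve(psi, np.ones(8) / 8, mode="same")  # smooth the NEO output
thresh = 8 * psi_s.mean()               # scaled-mean threshold heuristic
above = psi_s > thresh
detections = int(np.sum(above[1:] & ~above[:-1]))      # count rising edges
```

The operator needs only two multiplies and a subtraction per sample, which is why a current-mode analog approximation of it can run in the sub-uW regime.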

Upcoming Events

Talk at IAS@NTU Discovery Science Seminar in Dec, 2020: "Neuromorphic AI: A new era for Robotics"

Recent Invited Talks/ Workshops/ Tutorials

Dec, 2020: Gave invited talk at US-Singapore AI workshop: "Neuromorphics 2.0: A new generation of brain inspired AI"

Nov, 2020: Gave invited talk at A*Star Edge Computing workshop for Microelectronics 2.0: "Neuromorphics@NTU"

Oct, 2020: Gave invited talk at Panel on Assistive Technology, Disability and Ageing, VAIBHAV summit organized by Govt. of India: "Neurological Disorders & Assistive Technology"

Oct, 2020: Gave invited talk at Panel on Full-custom design, VAIBHAV summit organized by Govt. of India: "Neuromorphic In-memory Computing for Low-power & Scalable AI"

July, 2020: Gave invited talk at IEEE International Conference on Signal Processing and Communications (SPCOM), Bangalore, India (held virtually): "Hybrid event-frame approaches to Efficient Object Tracking using Neuromorphic Vision Sensors"

Nov, 2019: Gave invited tutorial at IEEE CASS Seasonal School, Shanghai, China: "Low-power Adaptive Neuromorphic Systems: Devices, Circuits, Algorithms and Architectures"

Nov, 2019: Gave invited talk at IEEE CASS-SH AI4I forum with Alibaba, Shanghai, China: "Neuromorphic Engineering 2.0: AI for Edge Computing"

Nov, 2019: Gave invited talk at ASILOMAR 2019, California, USA: "Is my Neural Network Neuromorphic?"

June, 2018: Gave invited talk at Dendrites 2018, Heraklion, Greece on "Non-linear Dendrites and Structural Plasticity: A Recipe
for Efficient Neuromorphic Machine Learning".

May, 2018: Gave tutorial talk at ISCAS 2018, Florence, Italy on "Low-power Adaptive Neuromorphic Systems: Device, Circuits, Algorithms and Architectures".

April, 2018: Gave invited talk at COSADE 2018, Singapore on "When Physical Unclonable Function (PUF) meets Machine Learning".

Oct, 2017: Gave invited talk at Aristotle University of Thessaloniki, Greece on "Designing Low-Power "Intelligent" Chips in the face of Statistical Variations
of Nanoscale Devices" as part of the IEEE CASS Distinguished Lecturer Program.

Mar, 2017: Gave invited talk at INI, Zurich on "Designing Low-Power "Intelligent" Chips in the face of Statistical Variations of Nanoscale Devices"
as part of the IEEE CASS Distinguished Lecturer Program.

Jan, 2017: Gave talk at STEE advance on "Neuromorphic Engineering ... Past, Present and Future "

Oct, 2016: Gave talk at Cloud Expo Asia on "IoT: Intelligence of Things"

Sep, 2016: We hosted the 2016 IEEE CIS Summer School on Neuromorphic Systems for Machine Learning. More information can be found here.
Gave talk on "Designing Low-Power "Intelligent" Chips in the face of Statistical Variations of Nanoscale Devices" in the same.

Sep, 2016: Gave talk on "Designing Low-Power "Intelligent" Chips in the face of Statistical Variations of Nanoscale Devices"
in the NTU-Mediatek IC Design Workshop.

Aug, 2016: Gave talk at IoT SG on "Implantable Chips for Smart Healthcare"

July, 2016: Gave invited talks at Kolkata and Hyderabad on "Designing Low-Power "Intelligent" Chips in the face of Statistical Variations of Nanoscale Devices"
as part of the IEEE CASS Distinguished Lecturer Program.

May, 2016: Attended IEEE ISCAS 2016 in Montreal. Our group had 3 papers there!

Oct, 2015: Attended IEEE BioCAS 2015 in Atlanta. Our group had 3 papers there!
Also gave invited talks at Georgia Tech, Case Western Reserve and Purdue on "How to live with Statistical Variations the “Neuromorphic” way".

Aug, 2015: Delivered tutorial on "Spiking Neural Networks in Silicon: From Building Blocks to Architectures of Neuromorphic Systems" at IJCNN 2015
in Ireland. IJCNN is a flagship conference of the IEEE Computational Intelligence Society. Details are available here.

July, 2015: Delivered invited talk on "How can Dendritic Computation be useful in Neuromorphic Systems?" at the Asia-Pacific Summer School
on Bio-Inspired Systems and Prosthetic Devices, 2015 in Taiwan. We also ran a hands-on project. Details are available here.

June-July, 2015: Co-organized workgroup on "Spike-Based Cognitive Computing: Seeing, Hearing, and Thinking with Spikes" at the NSF-funded annual
neuromorphic workshop in Telluride, USA. Details are available here.

Jan, 2015, IEEE CASS, Singapore chapter talk: "Neuromorphic Circuits for Scalable Neuroprosthetics"

April, 2014, IEEE Symposium on Bioelectronics and Bioinformatics , Taiwan: "Neuro-inspired Circuits for Smart Sensors "

Dec, 2013, Symposium on Grand Challenges in Neural Technology, Singapore : "Neuromorphic Circuits for Scalable Neuroprosthetics"

Aug, 2013, Texas Instruments, Dallas: "Integrated Circuits for Neural Recording and Processing"

July, 2013, Indian Institute of Technology, Kharagpur: "Bio-inspired and Biomedical "Intelligent" Integrated Circuits"

Jan 2013, Indian Statistical Institute, Kolkata: "Bio-inspired and Biomedical "Intelligent" Integrated Circuits"

Dec 2012, National Taiwan University of Science and Technology, Taiwan: "Neuro-inspired Analog Circuits"

Aug 2012, SiNAPSE, Singapore: "Neuro-inspired Reconfigurable Processors"

June 2012, University of New South Wales, Sydney: "Biomedical and Bio-inspired Electronics"

Apr 2012, Duke NUS, Singapore: "Neuro-inspired Reconfigurable Processors"

July, 2011, Data Storage Institute, Singapore: "Reconfigurable Integrated Circuits for Neural Dynamics & Signal Processing "

Nov, 2010, IEEE BioCAS, Cyprus: "FPAA devices for Biological Modeling, Computing, and Interfacing Applications"


Collaborators

Keshab K. Parhi, University of Minnesota, Minneapolis, USA.
Pantelis Georgiou, Imperial College, London, UK.
Kaushik Roy, Purdue University, West Lafayette, USA.
Stephen P. DeWeerth, Georgia Inst. of Technology, Atlanta, USA.
Brandon Westover, Harvard Medical School, Boston, USA.
Ansuman Banerjee, Indian Statistical Institute, Kolkata, India.
Tobias Delbruck, Institute of Neuroinformatics, University and ETH Zurich, Switzerland.
Vincent J. Mooney, Georgia Inst. of Technology, Atlanta, USA.
Bhargab B. Bhattacharya, Indian Statistical Institute, Kolkata, India.
Shih-Chii Liu, Institute of Neuroinformatics, University and ETH Zurich, Switzerland.
Jennifer Hasler, Georgia Institute of Technology, Atlanta, USA.
Pamela Bhatti, Georgia Institute of Technology, Atlanta, USA.

1. TENURED!! Promoted to Associate Professor with Tenure at NTU, Sept 2016.

2. Serving as Publicity Co-Chair for ISCAS 2017 (May 2016)

3. Selected as IEEE CASS Distinguished Lecturer for 2016-17.

4. Prof. Basu was awarded MIT Technology Review's inaugural TR35@Singapore award in 2012, given to the top 12 innovators under the age of 35 in SE Asia, Australia and
New Zealand. The official news is available here.
This award was in recognition of his work on Neuro-inspired Reconfigurable Circuits.

5. Our work on smart chips to decode thoughts in brain implants got local and international media coverage (Channelnews Asia, Straits Times, Times of India etc).