Machine learning is one of the fastest-growing and most exciting fields out there, and deep learning represents its true bleeding edge. In this course, you’ll develop a clear understanding of the motivation for deep learning, and design intelligent systems that learn from complex and/or large-scale datasets.
I will show you how to train and optimize basic neural networks, convolutional neural networks, and long short-term memory (LSTM) networks. Complete learning systems in MatConvNet will be introduced through projects and assignments. You will learn to solve classes of problems that were once thought prohibitively challenging, and gain a deeper appreciation for the complexity of human intelligence as you solve those same problems effortlessly with deep learning methods.
I am a Scientist at IBM Research - Australia, working with the TrueNorth team on Machine Learning and its applications to mobile processors. I completed my Ph.D. at Nanyang Technological University, Singapore, on Neural Networks and Deep Learning. Like most researchers, I juggle research and teaching. My research involves developing novel algorithms to automatically program IBM's revolutionary neural network computer, the TrueNorth system. I have published over 20 papers in international journals and conferences; you can check out my most cited papers below.
I also love to teach my students the things I work on. After receiving excellent feedback from the students I teach at local universities, I decided to reach out to the huge online community. My motto is to teach Machine Learning simply, so that it no longer seems difficult and becomes accessible to everyone. I believe that behind every successful Machine Learning algorithm there is a physical significance or intuition that makes it work; mathematics is needed only to validate it. This belief set me on the path of creating and publishing courses on Machine Learning, Big Data and Neural Networks, where I discover and share those intuitions with you.
Please don't hesitate to drop me a message if you have a suggestion for a course topic or need help with something. I would love to talk to you.
My selected publications:
1. S. Roy and A. Basu, "An Online Structural Plasticity Rule for Generating Better Reservoirs," Neural Computation, MIT Press, 2016.
2. S. Roy and A. Basu, "An Online Unsupervised Structural Plasticity Algorithm for Spiking Neural Networks," IEEE Transactions on Neural Networks and Learning Systems, 2016.
3. S. Roy, P. P. San, S. Hussain, L. W. Wei and A. Basu, "Learning Spike Time Codes Through Morphological Learning With Binary Synapses," IEEE Transactions on Neural Networks and Learning Systems, 2015.
4. S. Roy, A. Banerjee and A. Basu, "Liquid State Machine With Dendritically Enhanced Readout for Low-Power, Neuromorphic VLSI Implementations," IEEE Transactions on Biomedical Circuits and Systems, vol. 8, pp. 681–695, Oct. 2014.
5. S. M. Islam, S. Das, S. Ghosh, S. Roy and P. N. Suganthan, "An Adaptive Differential Evolution Algorithm With Novel Mutation and Crossover Strategies for Global Numerical Optimization," IEEE Transactions on Systems, Man, and Cybernetics – Part B, vol. 42, no. 2, pp. 482–500, 2012.