July 31, 2016

Misha Mahowald Prize

Here is the press release.

Press Release: Inaugural Misha Mahowald Prize for Neuromorphic Engineering won by IBM TrueNorth Project

The Misha Mahowald Prize recognizes outstanding achievement in the field of neuromorphic engineering. Neuromorphic engineering is defined as the construction of artificial computing systems which implement key computational principles found in natural nervous systems. Understanding how to build such systems may enable a new generation of intelligent devices, able to interact in real-time in uncertain real-world conditions under severe power constraints, as biological brains do.

Misha Mahowald, for whom the prize is named, was a charismatic, talented and influential pioneer of neuromorphic engineering whose creative life unfortunately ended prematurely. Nevertheless, her novel designs of brain-inspired CMOS VLSI circuits for vision and computation have continued to influence a generation of engineers.

For the inaugural 2016 prize, the independent jury led by Prof. Terrence Sejnowski of the Salk Institute evaluated 21 entries worldwide. The jury selected the TrueNorth project, led by Dr. Dharmendra S. Modha at IBM Research – Almaden in San Jose, California, as the winner for 2016:

“For the development of TrueNorth, a neuromorphic CMOS chip that simulates 1 million spiking neurons with connectivity and dynamics that can be flexibly programmed while consuming only 70 milliwatts. This scalable architecture sets a new standard and brings us closer to achieving the high levels of performance in brains.”

The TrueNorth architecture is a milestone in the development of neuromorphic processors because it achieves a combination of scale, ultra-low power, and high performance never before demonstrated in a real neuromorphic system. It is the first neuromorphic system that can compete on an equal footing with conventional state-of-the-art von Neumann processors on real-world problems. In doing so, it opens the door to future orders-of-magnitude improvements in computing power that will no longer be possible with the von Neumann architecture, whose inherent bottlenecks are approaching physical limits.

The prize and certificate will be presented at the 30th anniversary celebration of the IBM Almaden Research Center in San Jose on 11 August 2016.

The Misha Mahowald Prize is sponsored and administered by iniLabs (www.inilabs.com) in Switzerland.

Demo at Conference on Computer Vision and Pattern Recognition

Guest Blog by Arnon Amir, Brian Taba, and Timothy Melano

The Conference on Computer Vision and Pattern Recognition (CVPR) is widely considered the preeminent conference for computer vision. This year the IBM Brain Inspired Computing team had the pleasure of demonstrating our latest technology at the CVPR 2016 Industry Expo, held in the air-conditioned conference halls of Caesars Palace, Las Vegas. The expo was co-located with the academic poster presentations, which created an excellent opportunity for us not only to meet interesting academics but also to see the latest demos from other amazing companies, both large and small.

Demo booth

We too were excited to demonstrate our new Runtime API for TrueNorth. To showcase it, we connected an event-based vision sensor, the DVS128 (made by iniLabs), over USB to our NS1e board.
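An event-based sensor like the DVS128 does not send frames; each change at a pixel arrives as a compact address-event word over USB. As an illustration only, here is a minimal sketch of unpacking such a word. The bit layout below is an assumption based on a common AER convention for a 128x128-pixel sensor, not taken from the demo code; check the sensor documentation for the actual format.

```python
def decode_dvs128_event(addr):
    """Unpack a 16-bit address-event word into (x, y, polarity).

    Assumed bit layout (hypothetical, one common AER convention
    for a 128x128 sensor; verify against the DVS128 docs):
      bit 0     : polarity (1 = ON, 0 = OFF)
      bits 1-7  : x coordinate (0..127)
      bits 8-14 : y coordinate (0..127)
    """
    polarity = addr & 0x1
    x = (addr >> 1) & 0x7F
    y = (addr >> 8) & 0x7F
    return x, y, polarity
```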

Hardware flow

We used our Eedn framework to train a convolutional neural network on hand and arm gestures collected from our team, including air-drums and air-guitar! This Eedn network was used to configure the TrueNorth chip on the NS1e board. Overall, the system received asynchronous pixel events from the DVS128 sensor and passed them to TrueNorth. A new classification was produced every millisecond, i.e., 1000 classifications per second.
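To make the 1 ms cadence concrete, one way to think about the data path is grouping the asynchronous sensor events into one-millisecond windows, with one classification per window. The sketch below is a simplification under that assumption; `bin_events` is a hypothetical helper, and the actual Runtime API and TrueNorth data path are not shown in this post.

```python
from collections import defaultdict

def bin_events(events, window_us=1000):
    """Group (timestamp_us, x, y, polarity) events into 1 ms windows,
    mimicking the demo's 1000-classifications-per-second cadence.

    Each window is returned as the set of active (x, y, polarity)
    pixels; each set would then be handed to the classifier.
    """
    frames = defaultdict(set)
    for t, x, y, p in events:
        frames[t // window_us].add((x, y, p))
    # Return windows in temporal order.
    return [frames[k] for k in sorted(frames)]
```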

The reaction to the real-time gesture classifications was very positive and drew large crowds (and other hardware vendors ;). People were blown away by the fact that we were running a convnet in real time at 1000 classifications per second while consuming only milliwatts of power. We invited anyone who was interested to come behind our table to play with the gesture recognition. With a little bit of adjustment, people were able to interact with TrueNorth and have their gestures recognized. To many in the audience, the entire concept of neuromorphic engineering was new. Their visit to our booth was a great opportunity to introduce them to the DVS128, a spiking sensor inspired by the human retina, and TrueNorth, a spiking neural network chip inspired by the human brain!

Gesture icons
A video can be seen here.

Previously, we demonstrated that TrueNorth can perform more than 1000 classifications per second on benchmark datasets. The new Runtime API now opens the interface to the NS1e board and the TrueNorth chip for many exciting real-time applications, processing complex data at very high rates while consuming very little power.
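Combining the figures quoted above, the 70 mW chip power from the prize citation and the demo's 1000 classifications per second, gives a back-of-the-envelope energy cost per classification. This is a rough estimate, not a measured figure from the demo:

```python
# Rough energy-per-classification estimate: energy = power / rate.
power_w = 0.070    # 70 milliwatts, from the prize citation
rate_hz = 1000.0   # classifications per second, from the demo
energy_j = power_w / rate_hz
print(round(energy_j * 1e6, 3), "microjoules per classification")
```

That is on the order of tens of microjoules per classification, which is what makes always-on, battery-powered use cases plausible.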

We give special thanks to our teammates David Berg, Carmelo di Nolfo and Michael Debole for leading efforts to develop the Runtime API, to Jeff Mckinstry for performing the Eedn training, to Guillaume Garreau for his help with data preparation, and to the entire Brain Inspired Computing team for volunteering to create the training data set!