Deep neural networks require a tremendous amount of power to be as effective as the human brain. Sure, your Tesla Model S might have a small NVIDIA-powered DNN inside, but it isn't nearly complex enough to provide a full, true-to-life artificial intelligence experience. A new breakthrough from MIT might be able to put full-on, human-brain-inspired AI on your phone.
Researchers at MIT have presented a new mobile chip designed specifically for neural networks, and it's 10 times more efficient than any mobile GPU currently in production. They're calling it "Eyeriss," and the researchers hope it can change small-device computing. Just imagine Siri or Cortana becoming that much more useful because the processing power lives locally on the device.
Beyond that, this innovation could help further develop the Internet of Things, where powerful AI programs communicate with other devices and coordinate tasks nearly invisibly to the user. The possibilities are endless when individual small machines no longer need to connect to the Internet for computing power itself, but merely for communication.
Samsung is certainly excited about this new direction for mobile computing. "This work is very important, showing how embedded processors for deep learning can provide power and performance optimizations that will bring these complex computations from the cloud to mobile devices," says Mike Polley, a senior vice president at Samsung's Mobile Processor Innovations Lab.
The work is in its early stages, but any innovations here are sure to reach the real world at some point, whether as a co-processor or as an entirely new architecture. When we'll see this technology in shipping devices is unknown at the moment. NVIDIA, however, has already laid considerable groundwork with its design win at Tesla.