If you’ve ever been nearly run off the road by some Sunday driver, consider this: Would you feel safer or less safe knowing you were flipping off a robot?
Yale’s website features a press release about a computer system being developed by Eugenio Culurciello of Yale and Yann LeCun of NYU, who presented their research yesterday at the High Performance Embedded Computing workshop in Lexington, Massachusetts. Dubbed NeuFlow, it’s a computer that processes information visually and, its creators hope, may someday drive a car.
Although this feat is a standard in science fiction, it has actually proven elusive for robot-builders. A three-dimensional world is, apparently, more than previous robot brains could handle. As the Yale release puts it:
Navigating our way down the street is something most of us take for granted; we seem to recognize cars, other people, trees and lampposts instantaneously and without much thought. In fact, visually interpreting our environment as quickly as we do is an astonishing feat requiring an enormous number of computations—which is just one reason that coming up with a computer-driven system that can mimic the human brain in visually recognizing objects has proven so difficult.
Now [Culurciello] has developed a supercomputer based on the human visual system that operates much more quickly and efficiently than ever before. Dubbed NeuFlow, the system takes its inspiration from the mammalian visual system, mimicking its neural network to quickly interpret the world around it….The system uses complex vision algorithms developed by [LeCun] to run large neural networks for synthetic vision applications. One idea, the one Culurciello and LeCun are focusing on, is a system that would allow cars to drive themselves. In order to be able to recognize the various objects encountered on the road, such as other cars, people, stoplights, sidewalks, not to mention the road itself, NeuFlow processes tens of megapixel images in real time.
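To give a flavor of the kind of arithmetic involved: the neural networks LeCun pioneered for vision are built largely on 2D convolutions, where a small filter slides over an image and produces a map of where a pattern (say, an edge) appears. The sketch below is purely illustrative and not NeuFlow's actual code or algorithms; it shows one tiny convolution in plain Python, whereas a system like NeuFlow earns its keep by running enormous numbers of these operations in parallel in hardware. The function and variable names here are made up for the example.

```python
# Illustrative sketch only: a minimal 2D convolution, the basic building
# block of the convolutional neural networks used in synthetic vision.
# (Not NeuFlow's actual implementation.)

def convolve2d(image, kernel):
    """Slide `kernel` over `image` (both lists of lists of numbers),
    with no padding, returning the resulting feature map."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            # Multiply-accumulate the kernel against this image patch.
            acc = 0
            for dy in range(kh):
                for dx in range(kw):
                    acc += image[y + dy][x + dx] * kernel[dy][dx]
            row.append(acc)
        out.append(row)
    return out

# A tiny "image" whose right half is bright, and a vertical-edge
# detector kernel: the output peaks where the brightness changes.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [
    [-1, 1],
    [-1, 1],
]
print(convolve2d(image, edge_kernel))  # peaks in the middle column
```

Even this toy example makes the scaling problem obvious: every output pixel costs kernel-height × kernel-width multiply-accumulates, and a real network applies many such filters across many layers of a multi-megapixel frame, which is why doing it fast enough to drive a car calls for specialized parallel hardware rather than a loop like this.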
Check out the video of NeuFlow in action above, or test your comprehension of wacky robot design schematics by looking at some of the details at Yale’s site.