How IBM's Cognitive Computer Works

A few days ago, I wrote about IBM's SyNAPSE project, under which IBM announced its development of a cognitive computer chip. In that piece, I expressed my frustration that, from the press materials, it was difficult to tell what IBM was actually doing, but that from what little information was there, it seemed to be a pretty exciting project.

Since then, IBM has provided me with some more technical information, and I had a chance to hop on the phone with Dr. Dharmendra S. Modha, the head of the SyNAPSE project. From the materials and my interview with Dr. Modha, I've gotten a much clearer picture of how the cognitive chip works. Suffice it to say, I was impressed. If IBM's team is able to scale up the architecture, I think it has the potential to truly change computing.

Right now, Dr. Modha says, "Our achievements are humble, but our aspirations are lofty." So far, the chip is fairly limited in what it can do. For example, if you show it (through programming) what a triangle looks like, then show it just part of a triangle, its algorithms are able to display the full triangle back. That is, it recognizes a whole triangle from just a part of one. You can also play Pong with it. "And sometimes it will actually win," laughs Dr. Modha.
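
IBM hasn't published the algorithm behind that demo, but the behavior - recalling a whole pattern from a fragment - is exactly what an associative memory does. Here is a minimal sketch in Python of one classic version, a Hopfield-style network; this is my stand-in for illustration, not necessarily what the chip runs, and the six-bit "shape" encoding is invented.

    # Pattern completion with a Hopfield-style associative memory.
    # Purely illustrative: the encoding and the network are stand-ins,
    # not the chip's actual algorithm.

    def train(patterns, n):
        """Store +1/-1 patterns in symmetric synapse weights."""
        w = [[0.0] * n for _ in range(n)]
        for p in patterns:
            for i in range(n):
                for j in range(n):
                    if i != j:
                        w[i][j] += p[i] * p[j] / len(patterns)
        return w

    def recall(w, partial, sweeps=5):
        """Repeatedly let each unit vote until the pattern settles."""
        s = list(partial)
        for _ in range(sweeps):
            for i in range(len(s)):
                total = sum(w[i][j] * s[j] for j in range(len(s)))
                s[i] = 1 if total >= 0 else -1
        return s

    triangle = [1, 1, 1, -1, -1, 1]         # made-up encoding of "triangle"
    w = train([triangle], n=6)
    fragment = [1, 1, -1, -1, -1, -1]       # only part of the shape
    print(recall(w, fragment) == triangle)  # True: the whole comes back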

So what would be the best use for such chips? Well, Dr. Modha wasn't kidding when he said his aspirations were lofty. His goal is to use cognitive computers as a way for machines to interact more directly with their environment. For example, a grocer might wear a glove wired with sensors that evaluates, in real time, whether a piece of fruit the grocer has picked up has gone bad. Another application might be instruments in the ocean that monitor temperature, pressure, humidity, tides, etc. and determine whether a tsunami warning should be issued.

But what will probably intrigue readers of Forbes the most is taking advantage of the chip's ability to recognize patterns from large amounts of information, such as commodity prices, stock prices, etc. That means better risk and opportunity evaluation. One of the goals of the project is to develop a computer that can help turn a profit in business sectors that right now are too complex to be profitable.

But to do that, the team first has to fundamentally change how computers are designed.

The Problem With Modern Computers

For the past half-century, most computers have run on what's known as von Neumann architecture, and the computer I'm writing this on and the computer you're reading this on definitely run on it. In a von Neumann system, the processing of information and the storage of information are kept separate. Data travels back and forth between the processor and memory - but the computer can't process and store at the same time. By the nature of the architecture, it's a linear process. That's why software is written as a set of instructions for a computer to follow - it's a linear sequence of events, built for a linear process. This is where clock speed comes in: the faster the clock speed (for example, the 4 cores on my processor run at 3.0 GHz), the faster the computer can process those linear instructions.
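
To make that linearity concrete, here is a toy von Neumann machine in Python - my sketch, not IBM's. A single program counter fetches one instruction per "clock tick," and every operand has to shuttle between memory and the processor. The instruction names are invented for illustration; real instruction sets differ.

    # A toy von Neumann machine: one memory, one processor, strictly
    # sequential fetch-decode-execute. Instruction names are invented.

    memory = {
        0: ("LOAD", 100),   # copy memory[100] into the accumulator
        1: ("ADD", 101),    # add memory[101] to the accumulator
        2: ("STORE", 102),  # write the accumulator back to memory[102]
        3: ("HALT", None),
        100: 2, 101: 3, 102: 0,
    }

    acc = 0   # the processor's single register
    pc = 0    # program counter: which instruction to fetch next

    while True:
        op, addr = memory[pc]   # fetch: data crosses the memory/CPU boundary
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break
        pc += 1                 # one instruction per "clock tick"

    print(memory[102])  # -> 5: nothing happened in parallel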

According to Dr. Modha, von Neumann architecture was essential for developing computers in the days of vacuum tubes and early transistors, but modern chip-building techniques have exposed its limitations. "We've gotten to the point where we can pack more transistors on a chip than we can actually power, because if we powered them all, they'd burn out" from the excess heat the current generates in the chip. Dr. Modha likens processing to transporting oranges. The trees are memory, the oranges are bits. The consumers are processors. The oranges have to travel, by highway, to get to the consumers - but the more oranges there are, the more tied up traffic gets. The same congestion happens on the chip.
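
A rough back-of-envelope shows why "more transistors than we can power" happens: the switching power of a chip scales with the standard formula P = a * C * V^2 * f multiplied across the transistor count. Every number below is an illustrative assumption, not a measurement of any real chip.

    # Back-of-envelope dynamic power: P = alpha * C * V**2 * f * N.
    # All values are illustrative assumptions, not real chip data.
    alpha = 0.1    # fraction of transistors switching each cycle
    C = 1e-15      # effective capacitance per transistor (farads)
    V = 1.0        # supply voltage (volts)
    f = 3.0e9      # clock frequency (hertz), like the 3.0 GHz above
    N = 1e9        # a billion transistors

    power_watts = alpha * C * V**2 * f * N
    print(f"{power_watts:.0f} W")  # ~300 W: past what a desktop chip can cool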

Solving this problem is the focus of computer scientists around the world. The SyNAPSE team's solution is to bypass von Neumann architecture entirely with cognitive computing. To keep with the orange grove analogy, the SyNAPSE team wants to, in Dr. Modha's words, "move people to the orange grove" so the processors can be integrated with the memory.

How the Cognitive Chip Works

Dr. Modha's team isn't the first group trying to practically bypass von Neumann architecture. But what makes their approach unique is that they're taking their inspiration from the way human neural architecture works. It's not emulation, per se - as Dr. Modha was quick to point out, "Comparing what we can do to what mother nature can do is quite humbling."

That said, I think their solution to computing is quite elegant. Here are the basics of how the chip works. What they've been able to achieve right now is a chip with 256 processors (which the team has dubbed "neurons") laid out in an array of rows and columns. The neurons process in parallel, rather than relying on linear structures, and are connected to 1,024 axons on the chip by synapses - which is where the memory is stored. The axons act to either excite or inhibit the signals going through the synapses to the processors. Depending on the power and information it's getting from the axons and synapses, the neuron determines whether it's reached its predetermined "threshold potential" - basically, whether it's found a solution to the problem, or part of the problem, put to it. If it has, it will "spike" - sending a signal back through the synapse - and reset itself.
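
IBM hasn't published the chip's internals beyond this description, but the threshold-and-spike behavior maps closely onto a classic integrate-and-fire neuron model. Here is a minimal sketch in Python; the weights, threshold value, and inputs are all illustrative assumptions.

    # Minimal integrate-and-fire neuron: a sketch of the spike-and-reset
    # behavior described above. Weights, threshold, and inputs are invented.

    class Neuron:
        def __init__(self, weights, threshold):
            self.weights = weights      # synapse strengths: + excites, - inhibits
            self.threshold = threshold  # the "threshold potential"
            self.potential = 0.0

        def step(self, axon_inputs):
            """Integrate one tick of axon activity; return True on a spike."""
            for axon_active, weight in zip(axon_inputs, self.weights):
                if axon_active:
                    self.potential += weight
            if self.potential >= self.threshold:
                self.potential = 0.0    # reset and await further input
                return True             # the spike travels back through the synapse
            return False

    neuron = Neuron(weights=[0.5, 0.5, -0.25], threshold=1.0)
    print(neuron.step([True, True, False]))   # 0.5 + 0.5 = 1.0 -> spikes: True
    print(neuron.step([True, False, True]))   # 0.25 after the reset -> False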

The synapse then carries the solution sent from the neuron, while the neuron goes into a state where it's awaiting further information. Now picture all 256 neurons acting at the same time, with their signals modulated by the actions of the synapses and axons, and you can see the potential. All 256 neurons work in parallel, rather than simply acting on a set of linear instructions. The goal, says Dr. Modha, is a chip that's able to better handle environmental feedback. "So far, our computers are left-brained, focusing on linearity and computation. What this is is a right-brained computer" capable of recognizing patterns and able to handle ambiguity. And because it operates in parallel, with integrated memory, it uses much less power than a traditional processor. When one neuron spikes, the total active energy used is only 45 picojoules - that's 4.5 x 10^-11 joules, a vanishingly small amount of energy.
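
For scale, the energy budget stays tiny even when the whole array is busy. A quick illustrative calculation - the 45 pJ figure is from the article, but the firing rate below is my assumption, not a figure from IBM:

    # Energy arithmetic for the 256-neuron array. The 45 pJ per spike is
    # from the article; the spike rate is an illustrative assumption.
    energy_per_spike = 45e-12   # joules (45 picojoules)
    neurons = 256
    spikes_per_second = 1000    # assumed firing rate per neuron

    power = energy_per_spike * neurons * spikes_per_second
    print(f"{power * 1e6:.1f} microwatts")  # ~11.5 uW for the whole array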

Applications and Challenges of Cognitive Computing

As I mentioned above, what IBM's chip can do right now is pretty limited. But that's because it only has 256 neurons. For its next phase, the SyNAPSE project hopes to scale the chip's design up to 1,000,000 neurons. I asked Dr. Modha what challenges scaling up would pose, and he laughed again. "Ask me again in a year. Right now, we're just moving fast trying to build a dream on a deadline."

Another thing I was curious about was programming. How do you write programs as a sequence of instructions when the hardware is running in parallel? That's still a bit up in the air. "It fundamentally changes what it means to program," says Dr. Modha. Instead of being handed a set of drafted instructions, the hardware would, ideally, be taught what it needs to do. "We're at such an early stage - this is a paradigm-changing technology that has the potential to open up a whole new industry."
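
To see what "taught rather than programmed" could look like in the small, here is a hypothetical sketch: a one-neuron perceptron that is never given the rule for logical AND as instructions, only labeled examples, and learns the weights itself. None of this is the SyNAPSE toolchain; it simply illustrates the shift from instructions to examples.

    # Teaching by example instead of writing instructions: a perceptron
    # learns the AND rule from labeled examples. Illustrative only;
    # not IBM's programming model.

    def predict(w, b, x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

    def teach(examples, lr=0.1, epochs=20):
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):
            for x, target in examples:
                err = target - predict(w, b, x)
                w[0] += lr * err * x[0]   # nudge weights toward the examples
                w[1] += lr * err * x[1]
                b += lr * err
        return w, b

    # No one writes the AND rule; the machine infers it from examples.
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = teach(examples)
    print([predict(w, b, x) for x, _ in examples])   # [0, 0, 0, 1]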

This doesn't mean the end of standard computing architecture, though. In Dr. Modha's vision, computers would ship with both a cognitive core and a traditional processor. The "right brain" of the computer would identify patterns in the environment, then ship the information over to the "left brain" for processing and computation. Of course, the first thing that sprang to my mind was robotics applications - better pattern recognition means faster cookie making. Dr. Modha demurred on the subject of robotics, but said that he wants to "change science, IT, and business in a fundamental way."

I'm fascinated to see if he can.
