

Artificial Intelligence and Manufacturing


American manufacturing has come a long way in automating factories with robots and computers. Over the last 40 years, palletizer systems and industrial robots have replaced humans in many back-breaking and repetitive jobs. Millions of clerical jobs have also been replaced by computers and software.

These advances in technology, along with movies like The Terminator and Star Wars, have led to a lot of speculation about how far artificial intelligence can be developed. In The Terminator, a microprocessor becomes so advanced that it makes machines self-aware; the machines then connect to all the computers on the internet and trigger a nuclear war. The suggestion is that microprocessors can become so sophisticated that they can think like humans.

In 1965, Dr. Herbert Simon, one of the founders of artificial intelligence (AI), said, "Machines will be capable in 20 years of doing any work a man can do." Marvin Minsky, another AI pioneer at MIT, said, "Within a generation ... the problem of creating artificial intelligence will be substantially solved." Moshe Vardi, a computer scientist at Rice University in Houston, said, "Everything that humans can do machines can do."

University professors and computer scientists also add to the excitement by promoting artificial intelligence's futuristic potential as they try to get their share of federal grant money. The big question that comes up is: when will computers be able to emulate humans and become self-aware and intelligent?

If you evaluate the speculative articles on artificial intelligence from the last decade, you could conclude that we are on the verge of building a C-3PO-style robot that is self-aware and can think as well as a human. But, contrary to popular belief, very little progress has been made in developing a robot brain that would approximate human intelligence and the capabilities of the brain, and it looks like we are a long way from doing it.

To get some idea of what it will take to design something that equals the human brain, it is helpful to review some basic facts about digital computers and the human brain.

Digital Computers

A digital computer system is one in which information has discrete values. The design includes transistors (on/off switches), a central processing unit (CPU), and some kind of operating system (like Windows), and it is based on binary logic (instructions coded as 0s and 1s).

Computers are linear designs and have continually grown in size, speed, and capacity. In 1971, the first Intel microprocessor (the 4004) had 2,300 transistors; by 2011, an Intel microprocessor had 2.3 billion. One of the fastest supercomputers today, built by Fujitsu, has 864 racks containing 88,128 individual CPUs and operates at 10.51 petaflops, or roughly 10.5 quadrillion calculations per second. Supercomputers have led many people to think that we must finally be approaching the capabilities of the human brain in speed and capability, and that we must be on the verge of creating a C-3PO-type robot that can think and converse just like a human. But the fact is that we still have a long way to go, which is the subject of this article.
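To put those growth figures in perspective, here is a short back-of-the-envelope calculation in Python that uses only the numbers quoted above; the implied doubling time is derived from those figures, not a published specification.

    # Back-of-the-envelope scaling math using the figures quoted above.
    import math

    t_1971 = 2_300           # transistors in the Intel 4004 (1971)
    t_2011 = 2_300_000_000   # transistors in a 2011 Intel microprocessor

    growth = t_2011 / t_1971                        # a million-fold increase
    doublings = math.log2(growth)                   # ~19.9 doublings
    years_per_doubling = (2011 - 1971) / doublings  # ~2 years (Moore's law)

    print(f"growth factor:      {growth:,.0f}x")
    print(f"doublings:          {doublings:.1f}")
    print(f"years per doubling: {years_per_doubling:.1f}")

    # The Fujitsu machine: 10.51 petaflops spread across 88,128 CPUs.
    peak_flops = 10.51e15
    cpus = 88_128
    print(f"per-CPU throughput: {peak_flops / cpus:.2e} flops")

The result works out to a doubling roughly every two years, which is exactly the pace Moore's law describes.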

The Human Brain

The human brain is not a digital computer. It is a kind of analog neural network that encodes information on a continuum. It does have its own important parts involved in the thinking process, such as the prefrontal cortex, amygdala, thalamus, hippocampus, and limbic system, which are interconnected by neurons. However, the way these parts communicate and work is totally different from a digital computer.

Neurons are the real key to how the brain learns, thinks, perceives, stores memory, and performs a host of other functions. The average brain has at least 100 billion neurons, which connect to one another through axons and dendrites (supported by glial cells), and each neuron forms roughly 1,000 synapses that transmit signals via electrochemical connections. It is the synapses that are most comparable to transistors, because they turn on or off. With 100 billion neurons times roughly 1,000 synapses each, the total comes to about 100 trillion switching points, compared with the 2.3 billion transistors in a modern Intel chip.
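As a quick sanity check on that comparison, the arithmetic below uses only the figures in the paragraph above.

    # Comparing the article's synapse count to a single chip's transistor count.
    neurons = 100_000_000_000      # ~100 billion neurons
    synapses_per_neuron = 1_000    # ~1,000 synapses each (figure from the text)
    transistors = 2_300_000_000    # 2.3 billion transistors

    total_synapses = neurons * synapses_per_neuron  # 100 trillion
    ratio = total_synapses / transistors

    print(f"total synapses: {total_synapses:,}")
    print(f"synapse-to-transistor ratio: {ratio:,.0f}x")  # ~43,478x

By this simple count, the brain has tens of thousands of times more switching points than the chip.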

Each neuron is a living cell and a computer in its own right, with the signal-processing power of thousands of transistors. A book by Stephen J. Gislason states: "Unlike transistors, neurons can modify their synapses and modulate the frequency of their signals. Each neuron has the capability to communicate with 10,000 other neurons."

Unlike digital computers with fixed architectures, the brain can constantly rewire its neurons to learn and adapt. Instead of running programs, neural networks learn by doing and remembering, and this vast web of connected neurons gives the brain excellent pattern recognition, as the sketch below illustrates.
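To make "learning by doing" concrete, here is a minimal Python sketch of a single artificial neuron (a perceptron) that adjusts its synapse-like weights from examples instead of following a fixed program. It illustrates the principle only; it is not a model of a biological neuron, and the OR pattern and learning rate are arbitrary choices for the demonstration.

    # A single artificial neuron that "learns by doing": it adjusts its
    # synapse-like weights from examples instead of following a fixed program.

    def train_perceptron(examples, epochs=20, lr=0.1):
        w = [0.0, 0.0]   # synaptic weights
        b = 0.0          # bias (firing threshold)
        for _ in range(epochs):
            for (x1, x2), target in examples:
                fired = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
                error = target - fired
                # Strengthen or weaken each "synapse" based on the error.
                w[0] += lr * error * x1
                w[1] += lr * error * x2
                b += lr * error
        return w, b

    # Teach it the OR pattern purely from examples.
    examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w, b = train_perceptron(examples)
    for (x1, x2), target in examples:
        fired = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        print(f"input ({x1},{x2}) -> {fired} (expected {target})")

No one wrote an OR program here: the neuron found the right weights by being corrected on examples, which is the basic idea behind neural-network pattern recognition.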

Limitations of Digital Computers

 

We have been so successful with large-scale integration (LSI), continuously shrinking microprocessor circuits and adding more transistors year after year, that people have begun to believe we might actually equal the human brain. But there are problems.

The first problem is that in a digital computer every calculation must pass through the CPU, a serial bottleneck that ultimately slows the program down. The human brain doesn't use a CPU; its neurons work in parallel, which is much more efficient.
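A rough calculation shows what that parallelism buys. The average firing rate below is an assumption added for illustration, not a figure from the article, and the comparison glosses over the fact that a synaptic event and a floating-point operation are very different things.

    # Serial funnel vs. massive parallelism (very rough comparison).
    synapses = 100_000_000_000_000  # ~100 trillion (from the text)
    firings_per_second = 100        # assumed average rate, not from the article

    parallel_events = synapses * firings_per_second  # ~1e16 events/sec
    supercomputer_flops = 10.51e15                   # Fujitsu peak, from the text

    print(f"brain, synaptic events/sec: {parallel_events:.2e}")
    print(f"supercomputer, flops:       {supercomputer_flops:.2e}")
    # The raw totals are comparable, but the brain does this on roughly
    # 20 watts (a widely cited ballpark, not from the article), while the
    # supercomputer draws megawatts, which is why the brain counts as far
    # more efficient.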

The second problem is the limit on shrinking circuits. In 1971, when Intel introduced the 4004 microprocessor, it held 2,300 transistors, each about 10 microns wide. Today an Intel chip has 1.4 billion transistors, and the transistors are down to 22 nanometers wide (a nanometer is one billionth of a meter). Chipmakers are currently working at 14 nanometers and hope to reach 10 nanometers in 2015.

The problem is that transistors are approaching the size of a few atoms, where they begin to run into quantum effects such as the uncertainty principle: you can no longer determine precisely where an electron is, and it can leak out of the wire. This could end size reduction for digital computers.
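A final bit of arithmetic shows how little room is left. The silicon atom size used below is a standard physics figure, not a number from the article.

    # How many more halvings before transistor features reach atomic scale?
    import math

    feature_nm = 10.0   # the 10 nm node the industry is targeting
    atom_nm = 0.2       # approx. diameter of a silicon atom (standard figure)

    halvings = math.log2(feature_nm / atom_nm)
    atoms_across = feature_nm / atom_nm

    print(f"a 10 nm feature is only ~{atoms_across:.0f} atoms wide")
    print(f"halvings left before one-atom features: {halvings:.1f}")  # ~5.6

At 10 nanometers a wire is only about 50 atoms across, leaving fewer than six halvings before features would be a single atom wide.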

Part Two will explore the idea of robots taking our jobs.

Michael Collins is the author of Saving American Manufacturing. His website is mpcmgtr.com.