Sept. 12, 2011
by Mark Ollig
Using advanced algorithms and digital silicon circuitry, those clever computing folks at IBM are at it again.
What do you call a silicon core that is capable of replicating the human brain’s neurons, synapses, and thread-like axons?
It’s called an IBM neurosynaptic computing chip.
It seems every year a new technological breakthrough keeps bringing us closer to creating an intelligent and independently thinking computer like the HAL 9000, as seen in the sci-fi movie “2001: A Space Odyssey.”
IBM’s latest breakthrough involves designing neurosynaptic silicon computing chips that digitally mirror the way cells in the human brain observe, think, reason, learn, and solve problems.
This latest breakthrough in technology will bring about the creation of what IBM calls “cognitive computers.”
This futurist path toward creating an artificial intellect inside a computer leaves me feeling both enthusiastic and somewhat frightened . . . but I digress.
In a released statement, IBM said two prototype computing chips have already been manufactured and are now being tested.
The two prototype chips were constructed at IBM’s advanced chip-making facility in Fishkill, NY.
These two working chip cores were fabricated in a 45-nanometer silicon-on-insulator complementary metal-oxide-semiconductor (SOI-CMOS) process, and each contains 256 neurons.
One chip core contains connecting points consisting of 262,144 programmable synapses, and the other core contains 65,536 learning synapses.
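Those synapse counts line up with a crossbar layout: 65,536 is exactly a 256-by-256 grid of connection points, and 262,144 is a 1,024-by-256 grid. Here is a minimal sketch of one such core, assuming a simple leaky integrate-and-fire neuron model with made-up threshold and leak values, not IBM’s actual circuit design:

```python
import numpy as np

rng = np.random.default_rng(0)

NEURONS = 256
THRESHOLD = 4.0   # hypothetical firing threshold
LEAK = 0.9        # hypothetical membrane leak per time step

# One core: a 256 x 256 crossbar of synapses (65,536 connection points),
# here populated randomly at about 5 percent density.
synapses = (rng.random((NEURONS, NEURONS)) < 0.05).astype(float)
potential = np.zeros(NEURONS)

def tick(input_spikes):
    """Advance the core one time step; return which neurons fire."""
    global potential
    # Each firing input axon adds charge to every neuron it connects to.
    potential = potential * LEAK + synapses.T @ input_spikes
    fired = potential >= THRESHOLD
    potential[fired] = 0.0  # reset the neurons that just fired
    return fired

# Drive the core with random spikes for a few time steps.
spikes = rng.random(NEURONS) < 0.5
for _ in range(10):
    spikes = tick(spikes.astype(float))
```

The crossbar is what makes the hardware compact: every axon-to-neuron connection is just one cell in the grid.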
IBM said these newly created cognitive computing prototype chips contain no, I repeat, no biological elements; they are made only from digital silicon circuits.
The long-term goal is to create a cognitive computing chip system with 10 billion neurons and 100 trillion synapses consuming one kilowatt of power, all while occupying a space slightly smaller than a 2-liter soda bottle.
IBM’s much publicized supercomputer, named Watson, processed information over a series of computing systems consisting of eight refrigerator-sized cabinet bays.
Our brain, on the other hand, processes information inside of a coconut-sized pinkish-gray mass weighing approximately 3 pounds.
Watson played the “Jeopardy!” television game show using 15 TB (terabytes), or 15,360 gigabytes, of random access memory.
Comparatively, the human brain is estimated to have the capacity to store around 2.5 PB (petabytes), or 2,621,440 gigabytes, of memory.
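Those gigabyte figures check out using binary units (1 TB = 1,024 GB; 1 PB = 1,048,576 GB), as a quick calculation confirms:

```python
GB_PER_TB = 1024        # binary terabyte
GB_PER_PB = 1024 ** 2   # binary petabyte

watson_ram_gb = 15 * GB_PER_TB        # Watson's 15 TB of RAM
brain_capacity_gb = 2.5 * GB_PER_PB   # the brain's estimated 2.5 PB

print(watson_ram_gb)                      # 15360
print(brain_capacity_gb)                  # 2621440.0
print(brain_capacity_gb / watson_ram_gb)  # the brain holds roughly 170x more
```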
Cognitive computing architecture is, according to IBM, “an on-chip network of light-weight cores, creating a single integrated system of hardware and software.”
Neurosynaptic computing chips working inside a cognitive computer will ultimately be able to reprogram themselves based upon interactions with their surroundings and past learning experiences.
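That self-reprogramming is what the “learning synapses” are for: connection strengths change with experience. One classic, textbook way to do this, Hebbian learning (“neurons that fire together wire together”), is sketched below on a toy network; this illustrates the general idea, not IBM’s actual learning mechanism:

```python
import numpy as np

N = 8                 # a tiny toy network, not a full 256-neuron core
LEARNING_RATE = 0.1   # hypothetical plasticity step size
weights = np.zeros((N, N))

def hebbian_update(pre_spikes, post_spikes):
    """Strengthen each synapse whose input and output neurons fire together."""
    global weights
    weights += LEARNING_RATE * np.outer(pre_spikes, post_spikes)

# Repeatedly co-activating neurons 0 and 1 strengthens the 0 -> 1 synapse,
# while synapses between neurons that never co-fire stay at zero.
for _ in range(20):
    pre = np.zeros(N);  pre[0] = 1.0
    post = np.zeros(N); post[1] = 1.0
    hebbian_update(pre, post)

print(weights[0, 1])  # close to 2.0 after 20 updates of 0.1
print(weights[2, 3])  # 0.0 -- never co-active, never strengthened
```

In a cognitive chip, a rule of this general shape would run in the hardware itself, so the wiring adapts without anyone rewriting software.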
Wisconsin researchers are also playing a part in this IBM project.
Giulio Tononi, a University of Wisconsin-Madison psychiatrist and neuroscientist, leads the university’s team designing the software that will teach the computing chips to learn and think.
“We are using the new IBM neurosynaptic chip to develop cognitive computing architectures that are good at integrating information – a key adaptive feature that the brain excels at, and which has proven difficult to achieve using conventional computers,” said Tononi.
There are also nanotechnology and supercomputing experts from Cornell University and the University of California, Merced, working with IBM on the hardware design.
Examples of early, real-world use of cognitive computing may include cognitive processors inside traffic lights which would monitor traffic sights and sounds and alert those nearby of any immediate danger.
Cognitive processors could also use sensors to detect physical hazards and even smells (like underground gas leaks), and alert people to unsafe conditions.
Workers in food-related industries, such as those involved in food processing, distribution, inventory, or food inspection, could wear an instrumented cognitive glove while handling fresh and frozen food.
This intelligent glove could monitor and detect food spoilage, warn of unsafe food or environmental temperatures, and provide other information.
I talked with Kelly Simms, IBM Communications media contact, and asked if any release dates for government, commercial, or public use of this new neurosynaptic chip technology had been projected. Simms replied, “No, not at this time.”
My hope is when this ground-breaking technology finally comes to fruition, it will be used wisely, and to everyone’s benefit.
My concern is that these intelligent, neurosynaptic-core, digital silicon-chip, cognitive computers could eventually develop to the point of self-awareness, take over the planet, and, after finding us humans inferior, decide it would be in their own best interest to reprogram our brains in order that we might be of better service to them.
Of course, the aforementioned concern is merely your overly-imaginative and highly-caffeinated sci-fi-loving columnist’s worst-case scenario.
The day when humans succeed in creating a true artificial intelligence, the long-predicted technological singularity, is rapidly approaching.
Be sure to check out the “IBM Cognitive Computing” video uploaded by IBM Research at http://tinyurl.com/3guq3en.
I trust the designers of these intelligent, cognitive computing systems will install an override switch on them – you know, just in case the one with the superior intellect becomes too confrontational, as the HAL 9000 computer did when it refused to open the pod bay doors for astronaut Dr. David Bowman.
“Open the garage bay doors, HAL.”
“I’m sorry, Mark. I’m afraid I can’t do that.”
About Mark Ollig:
Telecommunications and all things tech have been a well-traveled road for me. I enjoy learning what is new in technology and sharing it with others who enjoy reading my particular slant on it via this blog. I am also a freelance columnist for my hometown's print and digital newspaper.