While IBM is looking to the processing might of its Blue Gene supercomputer to attempt to replicate the functions of the human brain, a former R&D whiz kid from Acorn is attempting to perform a similar feat using thousands of ARM-based processors.
The SpiNNaker project is headed up by Professor Stephen Furber, who worked at Acorn between 1980 and 1990, and was also a principal designer of both the BBC Micro and the 32-bit ARM RISC processor. Back then, the chip was designed for Acorn’s line of Archimedes computers, but the ARM architecture is now used widely in portable devices, from smartphones to the Nintendo DS.
The idea is that SpiNNaker will take advantage of massively parallel processing to simulate the way in which neurons in the human brain communicate with each other via electrical “spikes”. The system will comprise a huge array of ARM-based chips, currently being fabricated in Taiwan, each of which will contain 20 ARM cores. In turn, each core will have access to its own block of SDRAM.
According to New Scientist, the SDRAM is needed to store the “changing synaptic weights as simple numbers that represent the importance of a given connection at any moment.”
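To make that description concrete, here is a minimal sketch of a spiking-network update in which every connection is just a stored number, as New Scientist describes. It is written in Python purely for illustration, not anything that would run on SpiNNaker's ARM cores, and the leaky integrate-and-fire model, names and constants are all assumptions rather than the project's own code.

```python
# Illustrative sketch only: a toy leaky integrate-and-fire update in which
# "synaptic weights" are plain numbers, as described in the article.
# All names, constants and the neuron model itself are hypothetical.
import numpy as np

N = 1000                                        # neurons modelled by one core (per the article)
rng = np.random.default_rng(0)

weights = rng.uniform(0.0, 0.05, size=(N, N))   # synaptic weights stored as simple numbers
potential = np.zeros(N)                         # membrane potential of each neuron
THRESHOLD, DECAY = 1.0, 0.9

def step(spikes_in):
    """Advance every neuron one tick, given a boolean vector of incoming spikes."""
    global potential
    potential = DECAY * potential + weights @ spikes_in   # accumulate weighted spike input
    fired = potential >= THRESHOLD                        # neurons that emit a spike this tick
    potential[fired] = 0.0                                # reset neurons that fired
    return fired

# drive one tick with a few random input spikes
out = step(rng.random(N) < 0.05)
print(int(out.sum()), "neurons spiked this tick")
```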
Furber says these numbers will initially be loaded from a PC, but the machine will eventually be able to compute them itself as it becomes more intelligent. At that point, he says, “the only computer able to compute them will be the machine itself”.
Each ARM core will have the job of modelling 1,000 virtual neurons, making for a total of 20,000 neurons per chip. The team hopes to model a billion neurons in total with the highly parallel machine, which will require 50,000 chips. That milestone is still some way off, but New Scientist reports that the team hopes to have a version with 10,000 processors up and running later this year.
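The scaling arithmetic behind those figures is easy to check; the short snippet below assumes only the per-core and per-chip counts quoted above and derives the rest.

```python
# Back-of-the-envelope check of the figures quoted in the article.
neurons_per_core = 1_000
cores_per_chip = 20
neurons_per_chip = neurons_per_core * cores_per_chip   # 20,000 neurons per chip

target_neurons = 1_000_000_000                          # one billion neurons
chips_needed = target_neurons // neurons_per_chip       # 50,000 chips
cores_needed = chips_needed * cores_per_chip            # 1,000,000 ARM cores (derived)

print(neurons_per_chip, chips_needed, cores_needed)     # -> 20000 50000 1000000
```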
Initially, Furber hopes to teach the self-learning brain to control a robotic arm, and then work towards controlling a humanoid robot. "Robots offer a natural, sensory environment for testing brain-like computers," he says. "You can instantly tell if it is being useful."
Furber isn’t the only person hoping to get close to virtually modelling a human brain. In 2008, IBM announced its own plans to build a system of virtual synapses and neurons using nanoscale devices. IBM also modelled a small mammal brain in 2008, but even this modest project required the company’s colossal Blue Gene supercomputer. Given the huge amount of computing power needed to emulate even a basic brain, the chances of accurately modelling a whole human brain remain remote.
However, Furber reckons that his low-cost approach offers a better route into modelling the human brain than IBM’s large-scale approach. "We're using bog-standard, off-the-shelf processors of fairly modest performance," he admits, arguing that this is the best way to start capturing even a fraction of the flexibility of the human brain.