Date: Wed, 24 Apr 2002 01:03:39 -0500
From: Jeff Wilson
Subject: Re: (urth) Mr Million v Oreb

> From: Grant Peacock
>
> --- Jeff Wilson wrote:
> > Measuring raw switching events per second is misleading. The 10^11
> > neurons in the brain might change state 20-30 times a second each,
> > for 2.5 x 10^12 switching events a second. However, almost 10 years
> > ago, Pentium processors with 3.3 million transistors were running at
> > 100 megahertz, for 3.3 x 10^14 switching events a second, and just
> > consuming a few watts.
>
> I'm not sure this number makes any sense. A 100 meg processor executes
> a maximum of 10^8 operations per second, no matter how much memory it
> has available.

Not true; many processors have multiple instruction pipelines inside that
let them do multiple operations each clock cycle. A Pentium I averaged
one instruction per clock cycle, but that average includes instances
where it can do two operations at once, each taking a couple of cycles to
complete. Later processors do even better, as they finish operations
sooner and are more likely to be able to handle dual instructions in
tandem.

Regardless of the external speed, the processor is still made up of
millions of interconnected transistors, each of which can change state
hundreds of millions, and lately billions, of times per second.

> My source of information is _I am Right, You are Wrong_ by Edward de
> Bono. I recommend this book. Unfortunately my copy is on loan so I
> might miss a lot of details here. The computing knowledge is also
> shaky. But here goes. A neuron has 2 states, firing and not firing, but
> don't let this fool you into thinking of the brain as binary. A neuron
> also has dendrites pointing to dozens of other neurons, and it also has
> a variable stimulation threshold. That is, it needs a certain amount of
> "juice" from neurons that point to it before it fires. This limit
> changes depending on the amount of time since it fired previously, and
> on levels of whatever chemicals to which it is sensitive. Finally, the
> amount of influence one neuron has over another can be increased over
> time. This happens when two connected neurons fire at the same time.

This sounds similar in scope to the interconnected transistor
architecture inside a microprocessor; their exact configuration and
levels of influence on one another are also adjustable nowadays, thanks
to the embarrassment over bugs publicly discovered in previous
generations of chips.

> So, without too much effort to be efficient about the coding, let's
> estimate how much computing power one would need to try to simulate a
> brain, using a standard style of computer. For each neuron, we need a
> list of other neurons to which this one has a dendrite, and the
> sensitivity of each. Here we're talking probably 3 bytes * 20 or 30
> neurons. This is the hard part, space-wise. Keeping it on a disk would
> be way too slow -- I think it has to be in memory. So we need about 100
> bytes per neuron, or 10^7 megs of RAM. This is 10,000 times the memory
> of today's PC but it is conceivable.
>
> Now, the real bottleneck is that we only have one processor. We can
> only look at one neuron at a time.

This is wrong; there's no need to limit the computers to one processor
apiece. Multiprocessor computing is old hat; in fact, the ancient ENIAC
had separate processing in each of its 64 interconnectable modules.
Hardwired multiprocessor designs run into the tens of thousands of
processors, letting us simulate physical processes at subatomic levels;
try googling for ASCI White.
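
Incidentally, the storage half of Grant's estimate is the easy part to
picture. Here's a rough sketch in C of the sort of per-neuron record he's
describing; every number in it is an assumption lifted from his figures
(about 3 bytes per dendrite entry, 20-30 dendrites per neuron, roughly
100 bytes in all), not a real design:

/* Purely illustrative: the per-neuron record Grant describes, with his
 * numbers plugged in.  Field sizes are schematic -- a real index into
 * 10^11 neurons would already need more than 3 bytes per dendrite. */

#include <stdio.h>
#include <stdint.h>

#define DENDRITES_PER_NEURON 30      /* upper end of his 20-30 guess */

struct dendrite {                    /* his "3 bytes" per connection  */
    uint16_t target;                 /* which neuron this one feeds   */
    uint8_t  sensitivity;            /* juice delivered when firing   */
};

struct neuron {
    uint8_t firing;                  /* firing or not on this pass    */
    uint8_t threshold;               /* juice needed before it fires  */
    struct dendrite out[DENDRITES_PER_NEURON];
};

int main(void)
{
    double bytes_per_neuron = 3.0 * DENDRITES_PER_NEURON + 10.0; /* ~100 */
    double neurons = 1e11;
    double megs = bytes_per_neuron * neurons / 1e6;
    printf("~%.0f bytes per neuron -> %.0e megs of RAM\n",
           bytes_per_neuron, megs);
    return 0;
}

Run it and it reports the same ~10^7 megs of RAM he arrives at; splitting
that total across many machines is where clusters come in.
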
You might also try reading about Beowulf clusters; freely downloadable
software exists to let you join arbitrary numbers of networked computers
into a virtual multiprocessing machine. A Beowulf cluster of 100
rackmounted PCs with 10,000 megs of RAM each would take up about as much
room as Mr. Million.

Even if you are supposing the brain is to be simulated on a
single-processor, general-purpose machine instead of a special-purpose
mechanism like Mr. Million, procs today are being designed with SIMD
instructions: Single Instruction, Multiple Data. A SIMD add instruction
could be defined that would quickly sum the adjacent neurons' juice x
sensitivity values and return true if they exceed the central neuron's
threshold. This would not be much different from the digital signal
processing used to de-blur photographs and reduce audio noise.

> The processor will have to make "passes" through the entire list of
> neurons. For each pass, on each neuron which is firing, it will go down
> the list of dendrites and add the right amount of juice to all the
> neighbors. Let's suppose that a maximum of 10% of a brain's 10^11
> neurons can be firing at any given time. Still, that's 10^10 * 20
> additions that must be performed in the slow part of the pass. (I'm
> assuming an average of 20 dendrites/neuron; this may be too low.)
> Adding probably takes more than 8 or 10 clock cycles, but let's say 10.
> So each pass takes on the order of 10^12 clock cycles.
>
> Now, when we consider that a neuron might go on and off 20 times per
> second, I'd say the passes I just described need to be performed at
> least 1000 times per second to achieve realistic brain activity. So our
> processor needs to run at around 10^16 hertz, that is, do the work of
> 100 million pentiums.
>
> The reason we're still behind by such a large factor is that each
> neuron is not just a memory cell, it also performs some of the
> functions assumed by the processor in a computer.

Good reasons for the multiprocessing and other options I mentioned above.
A neuron is much more complex than a transistor, but much less so than a
microprocessor. It would not be inconceivably difficult to design a
computing device with thousands or millions of subordinate computing
elements to simulate the effects of individual neurons, and to coordinate
their action in ways suitable to model the way real neurons are
connected. Hans Moravec and his ilk are keen on doing it. (There's a toy
version of Grant's pass loop tacked on below my sig.)

-- 
Jeff Wilson
How Am I Posting? 1-800-555-6789
"If your SecOp can see you, so can the enemy." -Cpt Law
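
PS -- Here's that toy version of the "pass" Grant describes, scaled down
to a thousand neurons so it will actually run on a desktop. The names,
constants, and random wiring are stand-ins of my own, but the inner loop
-- add a neighbor's juice x sensitivity, then compare against a threshold
-- is exactly the kind of repetitive arithmetic that SIMD units and racks
of cheap processors chew through:

#include <stdio.h>
#include <stdlib.h>

#define N_NEURONS 1000
#define DENDRITES 20                 /* Grant's assumed average */

struct neuron {
    int firing;                      /* state on the current pass         */
    int juice;                       /* stimulation accumulated this pass */
    int threshold;                   /* juice needed to fire next pass    */
    int target[DENDRITES];           /* neurons this one feeds            */
    int sensitivity[DENDRITES];      /* juice delivered to each target    */
};

static struct neuron brain[N_NEURONS];

static void one_pass(void)
{
    /* Deliver juice from every firing neuron to its neighbors. */
    for (int i = 0; i < N_NEURONS; i++) {
        if (!brain[i].firing)
            continue;
        for (int d = 0; d < DENDRITES; d++)
            brain[brain[i].target[d]].juice += brain[i].sensitivity[d];
    }
    /* Then decide who fires on the next pass and reset the juice. */
    for (int i = 0; i < N_NEURONS; i++) {
        brain[i].firing = (brain[i].juice >= brain[i].threshold);
        brain[i].juice = 0;
    }
}

int main(void)
{
    /* Wire things up at random just so the loop has something to do. */
    for (int i = 0; i < N_NEURONS; i++) {
        brain[i].threshold = 1 + rand() % 40;
        brain[i].firing = (rand() % 10 == 0);   /* ~10% firing, as assumed */
        for (int d = 0; d < DENDRITES; d++) {
            brain[i].target[d] = rand() % N_NEURONS;
            brain[i].sensitivity[d] = 1 + rand() % 5;
        }
    }
    for (int pass = 0; pass < 1000; pass++)     /* ~1000 passes = 1 "second" */
        one_pass();
    int firing = 0;
    for (int i = 0; i < N_NEURONS; i++)
        firing += brain[i].firing;
    printf("after 1000 passes, %d of %d neurons are firing\n",
           firing, N_NEURONS);
    return 0;
}

Scale the same loop up to 10^11 neurons running 1000 passes a second and
you get his 10^12-clock-cycles-per-pass figure, which is why you want
many of these loops running side by side rather than one heroic
processor.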