I've been doing some thinking about neural networks lately, because I find them an interesting subject. By neural networks I don't mean artificial neural networks but real ones, like the human brain and its peripheral nervous system. Starting from there, I'm moving in the direction of ANNs (the artificial version) or, the way I like to think about them, Neural Network Models (NNMs).
Tonight I wrote down about six pages of thoughts about them while trying to sleep, and today I was trying to make sense of my nightly notes. That's how I came to make some calculations of the brain's processing power.
Now first, let's have some statistics.
I've always been taught that the brain has about 10 billion neurons, and according to , each neuron connects to about 10 thousand other neurons. Each of these connections goes through an axon, a synapse, and a dendrite. The brain contains about 4 kilometres of axon according to ! Neurons communicate over these connections via impulses, or electric spikes. A spike is binary: it's either there or it isn't. The bottleneck on information transfer is probably the refractory period of a synapse, which is the time it needs to regenerate after transferring a spike. This part is a bit shaky, because unlike in a computer processor, spikes do not all arrive within a certain clock tick, so there is no such thing as an atomic event in the brain, which my calculations implicitly assume there is. I suspect that because of this my calculations can be off by no more than a factor of two, though more complexity could show up in the synaptic cleft.
So this refractory period of a synapse is about 10 milliseconds, which means a synapse cannot fire more than 100 times per second and thus cannot convey more than 100 bits per second.
Every neuron connects to about 10 thousand other neurons, which implies about 10 thousand synapses per neuron, so every neuron processes about a million bits per second.
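As a sanity check on the arithmetic so far, here is the per-neuron bandwidth estimate in Python. The refractory period and synapse count are the rough ballpark figures from this post, not measured values:

```python
# Rough per-neuron bandwidth, using the post's ballpark figures.
refractory_period_s = 0.01                           # ~10 ms synaptic refractory period
spikes_per_synapse_per_s = 1 / refractory_period_s   # at most ~100 spikes/s
bits_per_synapse_per_s = spikes_per_synapse_per_s    # a spike is binary: 1 bit each
synapses_per_neuron = 10_000

bits_per_neuron_per_s = bits_per_synapse_per_s * synapses_per_neuron
print(int(bits_per_neuron_per_s))  # 1000000, i.e. about a million bits/s
```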
Right about now you might think: "Hey, that's not so bad! My computer's memory pipeline does about 8.5 GB/s, which is about 8.5 billion bits per second."
Bad news: you still have to multiply by 10 billion, the number of neurons in the brain. That puts the brain at about 10 quadrillion (a 1 followed by 16 zeros) bits per second. And this is not the worst part: so far we are only talking about information transfer, not processing power. (The neuronal environment also influences every synapse, which matters a great deal for the processing done in the brain, but it probably contributes a negligible number of bits per synapse.)
My guess is that one neuronal communication "cycle" could be emulated on a current processor in about 5 instructions per synapse per bit of information, which brings the grand total of the brain's processing power to 50 quadrillion instructions per second, or roughly 50 petaflops. To put this in context, the world's fastest supercomputer reaches about 1 petaflop.
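Carrying the same numbers through to the whole brain (the 5-instructions-per-bit emulation cost is my guess from above, not an established figure):

```python
neurons = 10**10                 # ~10 billion neurons
synapses_per_neuron = 10**4      # ~10 thousand synapses each
bits_per_synapse_per_s = 100     # limited by the ~10 ms refractory period
instructions_per_bit = 5         # guessed emulation cost per synapse per bit

bits_per_s = neurons * synapses_per_neuron * bits_per_synapse_per_s
instructions_per_s = bits_per_s * instructions_per_bit
print(bits_per_s)          # 10000000000000000, i.e. 10 quadrillion bits/s
print(instructions_per_s)  # 50000000000000000, i.e. 50 quadrillion instructions/s
```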
Okay, there is some good news. These values assume a brain communicating at its theoretical limit, which is probably totally infeasible: it would take too much oxygen and energy and would probably produce way too much heat. Even with reuptake at the synapses, the brain would probably poison itself in the process with too many neurotransmitters and too many other waste products. My guess is that the brain's real peak performance is about a tenth of that, and its everyday performance about a tenth of that again. So my guess would be that the brain can probably be simulated by a computer running at 500 teraflops. Which means there is a computer out there that could probably simulate a human brain. Cool, huh?
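The two factor-of-ten reductions look like this in code (both factors are guesses, as stated above):

```python
theoretical_limit = 5 * 10**16   # instructions/s at the theoretical limit
peak = theoretical_limit // 10   # guessed realistic peak performance
everyday = peak // 10            # guessed everyday performance
print(everyday)  # 500000000000000, i.e. 500 teraflops
```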
Oh yeah, one obstacle: we don't know how all these neurons are connected inside the brain, and every synapse also has its own characteristics (kinds of receptors, types of neurotransmitters, reuptake speeds, etc.). So we still have a few problems here... Still, in the meantime, there are simpler brains, like those of cats, which comprise about 10 million neurons and about a trillion synapses according to , and thus by my estimates would require about 5 teraflops of computing power.
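The same everyday-performance estimate for a cat brain, using the synapse count quoted above (note that the result scales with the number of synapses, not neurons):

```python
cat_synapses = 10**12            # ~a trillion synapses
bits_per_synapse_per_s = 100     # same ~10 ms refractory-period limit
instructions_per_bit = 5         # same guessed emulation cost

# Divide by 100 for the two guessed factor-of-ten reductions
# (peak vs. theoretical limit, everyday vs. peak).
cat_instr_per_s = cat_synapses * bits_per_synapse_per_s * instructions_per_bit // 100
print(cat_instr_per_s)  # 5000000000000, i.e. about 5 teraflops
```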
Oh well, it's so interesting though!
Maass, W. Computing with Spikes. Technische Universität Graz (2002).
Vreeken, J. Spiking Neural Networks, an Introduction. Institute for Information and Computing Sciences, Utrecht University (2003).