Friday, February 20, 2009

Comparing the brain and the computer: parallel versus serial.

Von Neumann's summary of the comparison between the human brain and the computer contains, astonishingly enough, a few insights that still apply to today's technology:
First: in terms of the number of actions that can be performed by active organs of the same total size (defined by volume or by energy dissipation) in the same interval, the natural componentry is a factor 10⁴ ahead of the artificial one. This is the quotient of the two factors obtained above, i.e. of 10⁸ to 10⁹ by 10⁴ to 10⁵.

Second: the same factors show that the natural componentry favors automata with more, but slower, organs, while the artificial one favors the reverse arrangement of fewer, but faster, organs. Hence it is to be expected that an efficiently organized large natural automaton (like the human nervous system) will tend to pick up as many logical (or informational) items as possible simultaneously, and process them simultaneously, while an efficiently organized large artificial automaton (like a large modern computing machine) will be more likely to do things successively —one thing at a time, or at any rate not so many things at a time. That is, large and efficient natural automata are likely to be highly parallel, while large and efficient artificial automata will tend to be less so, and rather to be serial. (...)

Third: it should be noted, however, that parallel and serial operation are not unrestrictedly substitutable for each other —as would be required to make the first remark above completely valid, with its simple scheme of dividing the size-advantage factor by the speed-disadvantage factor in order to get a single (efficiency) "figure of merit." More specifically, not everything serial can be immediately paralleled —certain operations can only be performed after certain others, and not simultaneously with them (i.e. they must use the results of the latter). In such a case, the transition from a serial scheme to a parallel one may be impossible, or it may be possible but only concurrently with a change in the logical approach and organization of the procedure. Conversely, the desire to serialize a parallel procedure may impose new requirements on the automaton. Specifically, it will almost always create new memory requirements, since the results of the operations that are performed first must be stored while the operations that come after these are performed. Hence the logical approach and structure in natural automata may be expected to differ widely from those in artificial automata. Also, it is likely that the memory requirements of the latter will turn out to be systematically more severe than those of the former.

(von Neumann: pp. 50-52)
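To spell out the arithmetic behind the first point: the natural componentry's advantage in size and energy dissipation (the 10⁸ to 10⁹ factor) gets divided by its disadvantage in speed (the 10⁴ to 10⁵ factor), which is where the net factor of 10⁴ comes from. Labeling the two factors this way is my gloss on the surrounding chapters, not part of the quoted passage:

\[ \frac{10^{8}\ \text{to}\ 10^{9}}{10^{4}\ \text{to}\ 10^{5}} \approx 10^{4} \]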

It speaks to the excellence of von Neumann's work that at least parts of it are just as relevant today as they were back in 1956, in spite of all the changes that have taken place in a field as dynamic as this one. In particular, his reflections on parallel versus serial computing are still plenty relevant, as the little sketch below tries to show.
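As a toy illustration of his third point, here is a minimal Python sketch (my own, not anything from von Neumann; the function names are invented for this example). An independent per-element map parallelizes trivially; a chain where each step needs the previous result does not; and doing the parallel job serially forces the early results to be stored, which is exactly the extra memory requirement he describes:

from multiprocessing import Pool

def square(x):
    # Independent per-element work: no element's result depends on
    # another's, so all of them can be computed simultaneously.
    return x * x

def parallel_map(xs):
    # The "many slow organs" scheme: a pool of workers processes
    # the elements at the same time.
    with Pool() as pool:
        return pool.map(square, xs)

def serialized_map(xs):
    # The same job done serially: the results "performed first" must
    # be stored while the remaining operations run -- the new memory
    # requirement von Neumann points out.
    stored = []
    for x in xs:
        stored.append(square(x))
    return stored

def dependent_chain(xs):
    # A chain where each step must use the result of the previous
    # one: this cannot be parallelized without changing the logical
    # approach and organization of the procedure.
    acc = 0
    for x in xs:
        acc = acc * acc + x
    return acc

if __name__ == "__main__":
    data = list(range(8))
    print(parallel_map(data))    # parallel; map() preserves order
    print(serialized_map(data))  # same answer, plus buffering
    print(dependent_chain(data)) # inherently serial

Swapping the process pool for threads or a GPU would change the machinery, not the constraint: the dependency structure of the procedure is what decides how parallel it can be.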
