
Fact sheet:
Title: El nombre de la rosa.
Author: Umberto Eco.
Publisher: Debolsillo.
Edition: Seventh pocket-format edition, Barcelona, October 2006 (1980).
Pages: 781.
ISBN: 84-9759-258-1
First: in terms of the number of actions that can be performed by active organs of the same total size (defined by volume or by energy dissipation) in the same interval, the natural componentry is a factor 10⁴ ahead of the artificial one. This is the quotient of the two factors obtained above, i.e. of 10⁸ to 10⁹ by 10⁴ to 10⁵.
Second: the same factors show that the natural componentry favors automata with more, but slower, organs, while the artificial one favors the reverse arrangement of fewer, but faster, organs. Hence it is to be expected that an efficiently organized large natural automaton (like the human nervous system) will tend to pick up as many logical (or informational) items as possible simultaneously, and process them simultaneously, while an efficiently organized large artificial automaton (like a large modern computing machine) will be more likely to do things successively —one thing at a time, or at any rate not so many things at a time. That is, large and efficient natural automata are likely to be highly parallel, while large and efficient artificial automata will tend to be less so, and rather to be serial. (...)
Third: it should be noted, however, that parallel and serial operation are not unrestrictedly substitutable for each other —as would be required to make the first remark above completely valid, with its simple scheme of dividing the size-advantage factor by the speed-disadvantage factor in order to get a single (efficiency) "figure of merit." More specifically, not everything serial can be immediately paralleled —certain operations can only be performed after certain others, and not simultaneously with them (i.e. they must use the results of the latter). In such a case, the transition from a serial scheme to a parallel one may be impossible, or it may be possible but only concurrently with a change in the logical approach and organization of the procedure. Conversely, the desire to serialize a parallel procedure may impose new requirements on the automaton. Specifically, it will almost always create new memory requirements, since the results of the operations that are performed first must be stored while the operations that come after these are performed. Hence the logical approach and structure in natural automata may be expected to differ widely from those in artificial automata. Also, it is likely that the memory requirements of the latter will turn out to be systematically more severe than those of the former.
(von Neumann: pp. 50-52)
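A quick sanity check of the arithmetic in the first remark, as a minimal Python sketch (the figures are von Neumann's order-of-magnitude estimates; the variable names are mine, not his):

```python
# Von Neumann's "figure of merit": the size (or energy) advantage of natural
# componentry divided by its speed disadvantage.  Pairing the low estimates
# and the high estimates gives the same factor of about 10^4 either way.
size_advantage = [1e8, 1e9]      # actions per unit size/energy, natural over artificial
speed_disadvantage = [1e4, 1e5]  # time per basic operation, natural over artificial

for s, d in zip(size_advantage, speed_disadvantage):
    print(f"{s:.0e} / {d:.0e} = {s / d:.0e}")
# 1e+08 / 1e+04 = 1e+04
# 1e+09 / 1e+05 = 1e+04
```

His third remark is the caveat: this single quotient only makes sense if serial and parallel schemes were freely interchangeable, which they are not.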
Should we simply press past the obvious limitations of biological systems (limitations mostly of speed and reliability) and pursue the dazzling potential of electronic systems, systems that can, in principle and even with a von Neumann architecture, implement or simulate any possible computational activities? Or should we attempt instead, for whatever reasons, to mimic the computational organization displayed in the brains of insects, fishes, birds, and mammals? And what organization is that, anyway? Is it importantly or interestingly different from what goes on in our artificial machines?
Here, the reader may be surprised to discover, John von Neumann weighs in with a prescient, powerful, and decidedly nonclassical answer. He spends the first half of the book leading the reader stepwise through the classical conceptions for which he is so famously responsible, and as he turns finally to address the brain, he hazards the initial conclusion that "its functioning is prima facie digital." But this initial take on the neuronal data is also prima facie procrustean, a fact that von Neumann acknowledges immediately and subsequently turns to pursue at length.
The first problem he notes is that the connections between neurons do not show the telltale "two lines in and one line out" configuration that classical and-gates and or-gates display. Though each cell typically projects exactly one output axon, as the classical take would require, each cell receives more than a hundred, even more than several thousand, inputs from many other neurons. This fact is not decisive —there are, for example, multivalent logics. But it does give him pause.
The plot thickens as von Neumann pursues a point-by-point comparison between the fundamental dimensions of the brain's "basic active organs" (presumably, the neurons) and the computer's "basic active organs" (the various logic gates). Spatially, he observes, neurons have the advantage of being at least 10² times smaller than their presumptive electronic counterparts. (At the time, this estimate was exactly right, but with the unforeseen advent of photo-etched microchips, this size advantage has simply disappeared, at least where two-dimensional sheets are concerned. We can forgive von Neumann this one.)
More important, neurons have a major disadvantage where the speed of their operations is concerned. Neurons are, he estimates, perhaps 10⁵ times slower than vacuum tubes or transistors in the time required to complete a basic logical operation. Here he is portentously correct, in ways about to emerge. If anything, he underestimates the neuron's very considerable disadvantage. If we assume that a neuron can have a "clock frequency" of no better than roughly 10² Hz, then the clock frequencies of almost 1,000 MHz (that is, 10⁹ basic operations per second) now displayed in the most recent generation of desktop machines push the neuron's disadvantage closer to a factor of 10⁷. The conclusion is inescapable. If the brain is a digital computer with a von Neumann architecture, it is doomed to be a computational tortoise by comparison.
Additionally, the accuracy with which the biological brain can represent any variable is also many orders of magnitude below the accuracies available to a digital computer. Computers, von Neumann observes, can easily use and manipulate eight, ten, or twelve decimal places of representation, while the neuron's presumed mode of representation —the frequency of the spike train it sends down its axon— appears limited to a representational accuracy of at most two decimal places (specifically, plus or minus perhaps 1 percent of a frequency maximum of roughly 100 Hz). This is troubling because, in the course of any computation that involves a great many steps, small errors of representation in the early steps regularly accumulate into larger errors at the closing steps. Worse, he adds, for many important classes of computation, even tiny errors in the early steps get exponentially amplified in subsequent steps, which inevitably leads to wildly inaccurate final outputs. Thus, if the brain is a digital computer with only two decimal places of representational accuracy, it is doomed to be a computational dunce.
Conjointly, these two severe limitations —one on speed, and the other on accuracy— drive von Neumann to the conclusion that whatever computational regime the brain is using, it must be one that somehow involves a minimum of what he calls "logical depth". That is, whatever the brain is doing, it cannot be sequentially performing thousands upon thousands of sequentially orchestrated computational steps, as in the super-high frequency, recursive activity of a digital machine's central processor. Given the slowness of its neuronal activities, there isn't enough time for the brain to complete any but the most trivial of computations. And given the low accuracy of its typical representations, it would be computationally incompetent even if it did have enough time.
(Paul & Patricia Churchland: pp. XV - XVIII, foreword)