Economics Dictionary of Arguments


Hardware: Hardware is the physical components of a computer system. This includes the internal components, such as the CPU, motherboard, RAM, and hard drive, as well as external devices (peripherals). See also Software.
Annotation: The above characterizations of concepts are neither definitions nor exhaustive presentations of the problems related to them. Instead, they are intended to give a short introduction to the contributions below. – Lexicon of Arguments.


Nick Bostrom on Hardware - Dictionary of Arguments

I 71
Hardware/superintelligence/Bostrom: advantages for digital intelligences:
- Speed of computational elements. Biological neurons operate at a peak speed of about 200 Hz, a full seven orders of magnitude slower than a modern
I 72
microprocessor (~2 GHz).(*) As a consequence, the human brain is forced to rely on massive parallelization and is incapable of rapidly performing any computation that requires a large number of sequential operations.
- Internal communication speed. Axons carry action potentials at speeds of 120 m/s or less, whereas electronic processing cores can communicate optically at the speed of light (300,000,000 m/s).(3)
- Number of computational elements. The human brain has somewhat fewer than 100 billion neurons.
I 339
The number of neurons in an adult human male brain has been estimated at 86.1 ± 8.1 billion, a number arrived at by dissolving brains and fractionating out the cell nuclei, counting the ones stained with a neuron-specific marker. In the past, estimates in the neighborhood of 75–125 billion neurons were common. These were typically based on manual counting of cell densities in representative small regions (Azevedo et al. 2009(4)).
I 72
By contrast, computer hardware is indefinitely scalable up to very high physical limits.
I 339
The ultimate physical limits to computation set by quantum mechanics, general relativity, and thermodynamics are, however, far beyond this “Jupiter brain” level (Sandberg 1999(5); Lloyd 2000(6)).
- Storage capacity. Human working memory is able to hold no more than some four or five chunks of information at any given time.
I 340
The number of chunks working memory can maintain is both information- and task-dependent; however, it is clearly limited to a small number of chunks. See Miller (1956)(7) and Cowan (2001)(8).
I 73
- Reliability, lifespan, sensors, etc. Machine intelligences might have various other hardware advantages.
I 340
For example, biological neurons are less reliable than transistors. Channel noise can trigger action potentials, and synaptic noise produces significant variability in the strength of transmitted signals. Nervous systems appear to have evolved to make numerous trade-offs between noise tolerance and costs (mass, size, time delays); see Faisal et al. (2008)(9). For example, axons cannot be thinner than 0.1 µm lest random opening of ion channels create spontaneous action potentials (Faisal et al. 2005)(10).
>Superintelligence, >Artificial intelligence, >Artificial neural networks, >Machine learning.
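The quantitative comparisons above can be checked with a few lines of arithmetic. The sketch below is illustrative only: the neuron, clock, and conduction figures are the ones quoted in the text, and the last computation applies the bound ops/s ≤ 2E/(πħ) that Lloyd (2000) uses for a hypothetical 1 kg "ultimate laptop" (the 1 kg mass is Lloyd's example, not a claim from the text).

```python
import math

# Figures quoted by Bostrom (I 71-72)
NEURON_HZ = 200          # peak biological neuron firing rate (~200 Hz)
CPU_HZ = 2e9             # modern microprocessor clock (~2 GHz)
AXON_M_S = 120           # fastest axonal conduction (~120 m/s)
LIGHT_M_S = 299_792_458  # speed of light in vacuum, m/s

# "Seven orders of magnitude" between neuron and CPU clock rates:
clock_ratio = CPU_HZ / NEURON_HZ           # 1e7

# Optical signaling vs. axonal conduction: roughly 2.5 million times faster.
signal_ratio = LIGHT_M_S / AXON_M_S

# Lloyd's bound on computation: ops/s <= 2 E / (pi * hbar), with E = m c^2.
HBAR = 1.054_571_817e-34                   # reduced Planck constant, J*s
mass_kg = 1.0                              # Lloyd's example mass (assumption)
energy_j = mass_kg * LIGHT_M_S**2
max_ops_per_s = 2 * energy_j / (math.pi * HBAR)  # ~5.4e50 ops/s

print(f"clock-speed ratio:  {clock_ratio:.0e}")
print(f"signal-speed ratio: {signal_ratio:.1e}")
print(f"Lloyd bound (1 kg): {max_ops_per_s:.1e} ops/s")
```

The first two ratios reproduce the orders of magnitude cited in the text; the third illustrates how far the ultimate physical limits on computation lie beyond any biological or current artificial system.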

* This mainly occurs in short bursts in a subset of neurons—most have more sedate firing rates (Gray and McCormick 1996(1); Steriade et al. 1998(2)).

1. Gray, C. M., and McCormick, D. A. 1996. “Chattering Cells: Superficial Pyramidal Neurons Contributing to the Generation of Synchronous Oscillations in the Visual Cortex.” Science 274 (5284): 109–13.
2. Steriade, M., Timofeev, I., Durmuller, N., and Grenier, F. 1998. “Dynamic Properties of Corticothalamic Neurons and Local Cortical Interneurons Generating Fast Rhythmic (30–40 Hz) Spike Bursts.” Journal of Neurophysiology 79 (1): 483–90.
3. Kandel, Eric R., Schwartz, James H., and Jessell, Thomas M., eds. 2000. Principles of Neural Science. 4th ed. New York: McGraw-Hill.
4. Azevedo, F. A. C., Carvalho, L. R. B., Grinberg, L. T., Farfel, J. M., Ferretti, R. E. L., Leite, R. E. P., Jacob, W., Lent, R., and Herculano-Houzel, S. 2009. “Equal Numbers of Neuronal and Nonneuronal Cells Make the Human Brain an Isometrically Scaled-up Primate Brain.” Journal of Comparative Neurology 513 (5): 532–41.
5. Sandberg, Anders. 1999. “The Physics of Information Processing Superobjects: Daily Life Among the Jupiter Brains.” Journal of Evolution and Technology 5.
6. Lloyd, Seth. 2000. “Ultimate Physical Limits to Computation.” Nature 406 (6799): 1047–54.
7. Miller, George A. 1956. “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information.” Psychological Review 63 (2): 81–97.
8. Cowan, Nelson. 2001. “The Magical Number 4 in Short-Term Memory: A Reconsideration of Mental Storage Capacity.” Behavioral and Brain Sciences 24 (1): 87–114.
9. Faisal, A. A., Selen, L. P., and Wolpert, D. M. 2008. “Noise in the Nervous System.” Nature Reviews Neuroscience 9 (4): 292–303.
10. Faisal, A. A., White, J. A., and Laughlin, S. B. 2005. “Ion-Channel Noise Places Limits on the Miniaturization of the Brain’s Wiring.” Current Biology 15 (12): 1143–1149.

Explanation of symbols: Roman numerals indicate the source, Arabic numerals indicate the page number. The corresponding books are listed below. ((s)…): Comment by the sender of the contribution. Translations: Dictionary of Arguments
The note [Concept/Author], [Author1]Vs[Author2] or [Author]Vs[term] resp. "problem:"/"solution:", "old:"/"new:" and "thesis:" is an addition from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to this edition.

Bostrom I
Nick Bostrom
Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press, 2017.
