|Brockman I 148
Neural networks/Chris Anderson: [the optimization within the network] consists of the following steps:
1. Define a "cost function" that determines how well the network solved the problem.
2. Run the network once and measure how well it did on that cost function.
3. Change the values of the connections slightly and run it again. The difference between those two results gives the direction, or "slope," of the cost between the two trials.
4. If the slope is pointed “downhill,” change the connections more in that direction. If it’s “uphill,” change them in the opposite direction.
5. Repeat until there is no improvement in any direction. That means that you’re in a minimum.
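The five steps above can be sketched as a minimal gradient descent in Python. The function names, the probe-and-undo scheme, and the toy quadratic cost are illustrative assumptions, not Anderson's own code; the slope is estimated numerically by nudging each connection, as in steps 3 and 4:

```python
import random

def gradient_descent(cost, weights, step=0.01, probe=1e-4, max_iters=10_000):
    """Follow the steps above: probe each connection, estimate the slope
    of the cost, move downhill, stop when no direction improves."""
    weights = list(weights)
    for _ in range(max_iters):
        improved = False
        for i in range(len(weights)):
            base = cost(weights)                 # step 2: run once, note the cost
            weights[i] += probe                  # step 3: change a connection
            slope = (cost(weights) - base) / probe
            weights[i] -= probe                  # restore the connection
            if slope != 0:
                weights[i] -= step * slope       # step 4: move against the slope
                if cost(weights) < base:
                    improved = True
                else:
                    weights[i] += step * slope   # undo a move that made things worse
        if not improved:                         # step 5: a (possibly local) minimum
            break
    return weights

# Illustrative cost: (w0 - 3)^2 + (w1 + 1)^2, with its minimum at (3, -1).
w = gradient_descent(lambda w: (w[0] - 3) ** 2 + (w[1] + 1) ** 2, [0.0, 0.0])
```

Because the stopping rule is only "no direction improves," this sketch halts at whatever minimum it reaches first, which sets up the problem discussed next.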
Problem: that might be only a local minimum. >Local minimum/Anderson, >Universe/Anderson.
Brockman I 149
Solution: a random walk. Unlike humans and most other life-forms, which often get stuck in local minima,
Brockman I 150
AI systems can consider many moves ahead. AI can find solutions in a few years that 7 million years of evolution never found.
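One common concrete form of the random-walk remedy is random restarts: descend from many randomly chosen starting points and keep the best result. This is a standard technique offered here as an illustration, not necessarily Anderson's exact proposal; the one-dimensional cost below is an assumed example with a shallow local minimum and a deeper global one:

```python
import random

def descend(cost, w, step=0.01, probe=1e-5, iters=2000):
    """Plain one-dimensional gradient descent using a finite-difference slope."""
    for _ in range(iters):
        slope = (cost(w + probe) - cost(w)) / probe
        w -= step * slope
    return w

def restart_search(cost, lo, hi, tries=20, seed=0):
    """Escape local minima by descending from many random starting points
    and keeping the best endpoint found."""
    rng = random.Random(seed)
    best = None
    for _ in range(tries):
        w = descend(cost, rng.uniform(lo, hi))
        if best is None or cost(w) < cost(best):
            best = w
    return best

# w**4 - 3*w**2 + w has a local minimum near w = 1.13 and a deeper
# global minimum near w = -1.30.
f = lambda w: w ** 4 - 3 * w ** 2 + w
w_local = descend(f, 2.0)               # greedy descent from w=2 gets stuck near 1.13
w_best = restart_search(f, -3.0, 3.0)   # random restarts reach the deeper minimum
```

Greedy descent alone commits to whichever basin it starts in; the random restarts trade extra runs for a chance to land in the deeper basin.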
Anderson, Chris. "Gradient Descent", in: Brockman, John (ed.) 2019. Possible Minds: Twenty-Five Ways of Looking at AI. New York: Penguin Press.
_____________
Explanation of symbols: Roman numerals indicate the source, arabic numerals indicate the page number. The corresponding books are indicated on the right hand side. ((s)…): Comment by the sender of the contribution. The note [Author1]Vs[Author2] or [Author]Vs[term] is an addition from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to this edition.
The Long Tail: Why the Future of Business is Selling Less of More. New York 2006.
Possible Minds: Twenty-Five Ways of Looking at AI. New York 2019.