Theory of neural networks as an explanation for mental states.
I 128 ff - 145
Neural Networks/Pinker: Learning/problem: incorrect reinforcement with "xor" (Sheffer stroke) - solution: interpose an internal representation.
Rumelhart: feed back all errors - "hidden layers": several statements that can be true or false can be assembled into a complex logical function; the values then vary continuously - the system can set the correct weights itself if input and output are given - as long as similar inputs lead to similar outputs, no additional training is required ->Homunculi.
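The "xor" problem and its hidden-layer solution can be made concrete. A minimal sketch (my illustration, not an example from the text): a single-layer perceptron cannot compute XOR, but interposing an internal representation of two hidden units makes it trivial. All weights here are chosen by hand for illustration.

```python
def step(x):
    # threshold activation: fires (1) if the weighted input exceeds 0
    return 1 if x > 0 else 0

def unit(inputs, weights, bias):
    # one artificial neuron: weighted sum plus bias, then threshold
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

def xor_net(a, b):
    # internal representation interposed between input and output
    h_or   = unit([a, b], [1, 1], -0.5)        # hidden unit: a OR b
    h_nand = unit([a, b], [-1, -1], 1.5)       # hidden unit: NOT (a AND b)
    return unit([h_or, h_nand], [1, 1], -1.5)  # output: h_or AND h_nand

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

In Rumelhart's scheme the weights are not fixed by hand but adjusted by feeding the output error back through the hidden layer.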
Connectionism/Rumelhart: the mind is one large neural network - rats merely have smaller networks - PinkerVsConnectionism: networks alone are not sufficient for handling symbols - the networks have to be structured into programs - even the past tense overstretches a single network - precursor: "association of ideas": Locke/Hume/Berkeley/Hartley/Mill - 1) contiguity (context): frequently co-experienced ideas are associated in the mind - 2) similarity: similar ideas activate each other.
Computer variant: a statistical calculation with multiple layers.
VsConnectionism: units with the same representations are indistinguishable - the individual should not be construed as the smallest subclass.
Connectionism cannot explain the compositionality of representations.
I 158 ~
Recursion/Recursive/Neural Networks/Memory/Pinker: recursion as a solution to the problem of an infinite number of possible thoughts: separation of short-term and long-term memory - the whole sentence is not comprehended at once; words are processed individually in loops.
The networks themselves have to be seen as recursive processors for thoughts to be well-formed.
Neural Networks/Pinker: the networks do not reach down to the rules - they only interpolate between examples that have been put in.
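The claim that networks "only interpolate" can be sketched with a stand-in model (my illustration, using plain interpolation rather than an actual trained network): the model reproduces its training examples and blends between them, but it never recovers the underlying rule, so it fails outside the training range.

```python
def interp_model(x, xs, ys):
    # memorize-and-blend: clamp outside the training range
    # (no rule is extracted, only examples are stored)
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

train_x = [0, 1, 2, 3]
train_y = [2 * x for x in train_x]   # underlying rule: y = 2x

print(interp_model(1.5, train_x, train_y))  # between examples: works
print(interp_model(10, train_x, train_y))   # outside them: rule y = 2x is not recovered
```

Inside the training range the model looks as if it had learned the rule; at x = 10 it returns the last memorized value instead of 20.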
Steven Pinker: Wie das Denken im Kopf entsteht ("How the Mind Works"), München 1998