Martin Butz, [email protected]
A Computational Perspective On Mind and Brain
1. From Body and Soul to the Embodied, Anticipatory Mind
Institute of Computer Science & Institute of Psychology
2 | Martin Butz | A Computational Perspective on Mind and Brain
Goal
• Understand how the Mind Comes Into Being
- Functional / Computational Perspective
• David Marr: 3 Levels of Understanding
- "Computational"
- "Algorithmic"
- "Hardware"
• All three levels and their interactions! This is the functional & computational perspective!
- Abstraction
- Modeling
- Building it, i.e., cognitive systems / cognitive architectures
What …
• … is intelligence ?
• … is understanding ?
• … does a word "mean"?
• … do we actually "see" in our world?
• … do we do
- "consciously"
- sub-consciously
- automatically?
True Intelligence
• To understand what is actually happening in our world
- Typically primarily on a human-relevant level:
 Intuitive physics
 Folk psychology (social competence)
• To be able to manipulate objects and use tools.
• To interact with the world flexibly and adaptively, given the current circumstances.
• Doing all this for the purpose of …
- satisfying "bodily needs"?
- contentment?
- true happiness?
https://www.biomotionlab.ca/Demos/BMLwalker.html
Book: How the Mind Comes into Being: Introducing Cognitive Science from a Functional and Computational Perspective
December 2016 | Paperback
Martin V. Butz and Esther F. Kutter
How is it that we can think highly abstract thoughts, seemingly fully detached from the actual, physical reality? It is puzzling how our mind controls our body, and vice versa, how our body shapes our mind.
This book offers an interdisciplinary introduction to embodied cognitive science, addressing the question of how the mind comes into being while actively interacting with and learning from the environment by means of its own body.
CURRENT MAJOR PROBLEMS IN COGNITIVE SCIENCE
"Frame Problem" · "Symbol & Grammar Grounding" · "Binding Problem" & "Mind-Body Problem"
Problem 1: The Frame Problem
More info: http://plato.stanford.edu/entries/frame-problem/
• Central problem: How do we know what is actually relevant (for accomplishing something)?
• Which effects of a behavior / a manipulation are actually relevant?
- Generally, a myriad, near-infinite number of effects are caused by any behavior!
• Prediction problem: what to predict / consider as consequences?
• Relevancy problem: what is actually relevant for accomplishing a task / an action / a goal?
• Qualification problem: did we really consider all relevant circumstances / potential situations?
The Frame Problem
• R1: "robot"
• R1D1: "robot deducer" (considers changes)
• R2D1: "robot relevant-deducer" (considers relevancies)
• R2D2: always knows "intuitively" what to consider.
Daniel Dennett, American philosopher ("philosophy of mind")
(Illustration by Isabelle Follath)
Problem 2: The "Symbol Grounding" Problem
• How do symbols (e.g. words) encode the world?
The "Symbol Grounding" Problem: How do symbols encode the world?
• Even more pressing:
- The grammar grounding problem.
• Poverty of the stimulus argument:
- Grammar is so complex …
- … but as children, we learn it nearly effortlessly and develop an intuitive grammatical understanding.
- How?
• Noam Chomsky's answer: It must be innate!
• Embodied cognitive science:
- We develop a grammar-ready, language-ready cognitive system from our sensorimotor interactions with and experiences of our environment.
- Neural attractors (encoding, e.g., an object, a particular behavior, etc.) are linked to symbols.
- Environmental interactions are associated with / contain grammatical structures, reflecting the systematics of our world.
Problem 3: The Binding Problem
• Navon figures (many small letters forming one big letter; cf. simultanagnosia)
• Multiple perspectives (by M.C. Escher)
• Nuns or Voltaire (by S. Dali)
Binding Problem II
How does our brain bind multiple sources of stimulus information into one percept or "Gestalt"?
[Slide figures: color-word examples ("gold" / "black", "read" / "silent") and "Flora" by Giuseppe Arcimboldo (1526-1593), a portrait composed of flowers]
Simultanagnosia (literally: failing to see the forest for all the trees)
Problem 4: The Mind-Body Problem
• How does an immaterial, immortal soul control a material, mortal body?
Plato's Question: What is Knowledge? (424/423 BC - 348/347 BC)
• Mind and Body
- Plato believes that the mind does not reflect the organization of the body (somewhat in disagreement with embodiment) (in "Phaedo").
- He favors the belief in an immortal, immaterial soul.
http://lssacademy.com/2008/01/14/shadows-or-reality/
The Cave Allegory (in “The Republic”).
• SOCRATES: [...] Imagine human beings living in an underground, cavelike dwelling, [...]. They have been there since childhood, with their necks and legs fettered, so that they are fixed in the same place, able to see only in front of them, because their fetter prevents them from turning their heads around. Light is provided by a fire burning far above and behind them. Between the prisoners and the fire, there is an elevated road stretching. Imagine that along this road a low wall has been built, like the screen in front of people that is provided by puppeteers, and above which they show their puppets.
• [...] Also imagine, then, that there are people alongside the wall carrying multifarious artifacts that project above it: statues of people and other animals, made of stone, wood, and every material. And as you would expect, some of the carriers are talking and some are silent.
• GLAUCON: It is a strange image you are describing, and strange prisoners.
• SOCRATES: They are like us. I mean, in the first place, do you think these prisoners have ever seen anything of themselves and one another besides the shadows that the fire casts on the wall of the cave in front of them?
- [...] What about the things carried along the wall? Isn’t the same true where they are concerned?
- [...] All in all, then, what the prisoners would take for true reality is nothing other than the shadows of those artifacts.
Descartes: The Mind-Body "Problem"
• Descartes (1596 - 1650)
- "Je pense, donc je suis" (Discours de la méthode pour bien conduire sa raison, et chercher la vérité dans les sciences, 1637)
- "Dubito, ergo cogito, ergo sum" (Principles of Philosophy (Principia philosophiae), 1644)
Sets the stage for philosophical developments over the next centuries.
Particularly sets the stage for the "Mind-Body Problem" (Dualism): How can an immaterial soul control a material body?
http://www.optics.arizona.edu/Nofziger/UNVR195a/Class3/Descartes3.jpg
The Homunculus Problem (Cartesian Theatre - Daniel Dennett, 1991)
• This perspective emphasizes the problem of "re-representations"
- Including symbolic representations
• The homunculus (i.e., the soul) observes sensory inputs and issues motor commands.
• How could this explain (the functionality of) our mind, our feelings, our consciousness!?
Alan Turing (1912-1954): The TURING TEST
"I propose to consider the question, 'Can machines think?' This should begin with definitions of the meaning of the terms 'machine' and 'think.'
The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words 'machine' and 'think' are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, 'Can machines think?' is to be sought in a statistical survey such as a Gallup poll. But this is absurd.
Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words. The new form of the problem can be described in terms of a game which we call the 'imitation game'." (p. 433)
Turing (1950) Computing machinery and intelligence. Mind 59, 433-460.
Alan Turing
The "Chinese Room" Problem (John Searle, 1980)
http://www.iep.utm.edu/chineser
• The Chinese Room:
- A room with a huge database of Chinese symbols
- A huge set of rules (in English) for how to manipulate the symbols
- A human who uses the rules to manipulate the symbols (producing "answers" to incoming "questions" via the symbol manipulations)
- An external person probes the "Chinese Room":
 Is there a thinking, intelligent person in the room?
 Can this person speak / understand Chinese?
• Implications:
- The human in the room does not understand Chinese.
- The symbols remain meaningless, despite the appearance to the external observer.
• Argument for "weak AI" systems: Seemingly intelligent, human-like behavior does not imply that there is a sentient human being behind it.
Searle's Dualism
• Problem of the Chinese Room: "SEMANTICS" is missing
- Note the close relation to the "Symbol Grounding" Problem
- There are no meaningful encodings, states, representations …
- Neither the room nor the human inside is truly "intelligent".
• John Searle's (1980) argumentation:
- "Brains cause minds"
- "Syntax does not suffice for semantics"
- There is more to the brain than symbols and syntax.
• His arguments are
- in favor of "Weak AI" (intelligent machines have no semantics)
- against "Strong AI" (seemingly intelligent machines are like humans)
John Searle
Consequence: The "Embodiment Turn"
(Figures: Pfeifer & Bongard (2006). How the Body Shapes the Way We Think: A New View of Intelligence.)
• Cognitive intelligence needs a body!
- Semantics cannot emerge from symbols and rules alone.
- It is the sensorimotor interactions generated by the body, through which the environment is experienced, from which semantics develops!
The Development of Semantics (from an embodiment perspective)
• Environment, body, and brain continuously interact with each other.
• Intelligence develops from these interactions, that is, from the gathered sensorimotor experiences.
• These experiences are thus inevitably pre-structured by the bodily morphology and the properties of the environment (including physics, biology, social and cultural factors, etc.) - they essentially reflect reality.
• Cognitive development results from the analysis of these experiences, driven by self-motivation.
• Self-motivation is inevitably evolutionarily grounded.
• To understand how this actually works, scientific research on Embodied Cognition developed.
• Main implications with respect to the four problems:
- Symbols become semantically meaningful by associating them with sensorimotor-grounded experiences (and suitably abstracted and generalized encodings thereof). (Symbol grounding)
- Entities and behaviors are flexibly bound into units of thought. (Binding problem)
- The analysis of embodied sensorimotor experiences directly encodes beliefs about relevancies. (Frame problem)
- Grammar reflects the semantics found in our world. (Grammar grounding)
- The mind-body problem disappears.
EMERGENCE
Morphological Computations
Emergent Behaviors
Emergent Attractors
Phenomenon: Emergent Behavior
• Emergent behavior is behavior that was not programmed explicitly or directly, but rather emerges from control-body-sensorimotor-environment interactions.
• Types of emergent behaviors:
- Individual behavioral phenomena via body (i.e., sensorimotor)-environment interactions:
 Passive walker
 Attractors in four-legged motion
 Insect eyes, flying and landing
 Frog
- Globally collective behavior, where individual behavior is determined by simple sensorimotor couplings:
 Swiss robots
 Ants & pheromones
- Ontogenetic phenomena such as the development of the self, of consciousness, of language, learning to walk, etc.
- Evolutionary development of body morphology and developmental predispositions ("biases")
Behavior-Oriented Artificial Intelligence / Cognition
• Marvin Minsky (1927-2016)
- Society of Mind, 1987: a modular, interactive approach to intelligence built from small processing experts.
• Rodney A. Brooks (*1954)
- Behavior-oriented artificial intelligence
- Intelligence without representation / without reason
- Subsumption architecture (increasingly complex, interdependent behavioral modules)
• Rolf Pfeifer
- Understanding Intelligence, 1999
- How the Body Shapes the Way We Think: A New View of Intelligence, 2007 (with J. Bongard)
Importance of Bodily Morphology
• The morphology of the body is itself rather intelligent (shaped by evolution!)
- Self-stabilizing (e.g., agonist / antagonist muscles)
- Morphological computations (automatic sensory normalization, e.g., when entering a dark room)
• The sensor and motor systems are strongly interactive
- The body is a dynamic, complex, interactive system
- Sensorimotor interactions result in morphological computations and yield useful morphological attractors.
 Attractors essentially encode dynamic, stable patterns.
 Such a pattern can be encoded into a "symbolizable" attractor state.
Passive Walker (http://ruina.tam.cornell.edu/research/topics/robots/index.php)
Morphological Algorithms
Figure 4.1 in Pfeifer & Bongard (2006): Morphological computation. (a) Sprawl robot exploiting the material properties of its legs for rapid locomotion. The elasticity in the linear joint provided by the air pressure system allows for automatic adaptivity of locomotion over uneven ground, thus reducing the need for computation. (b) An animal exploiting the material properties of its legs (the elasticity of its muscle-tendon system) thus also reducing computation. (c) A robot built from stiff materials must apply complex control to adjust to uneven ground and will therefore be very slow.
Morphological Attractors
• Imagine an attractor surface
• Example: different types of four-legged motion:
- Walking
- Trotting
- Running
- Jumping
- Skipping
Example of Morphological Computation: Cheap, Simple, Flexible Locomotion I
• Robot "Stumpy" (Iida et al., 2002; Paul et al., 2002) http://www.ifi.unizh.ch/ailab/people/iida/stumpy/
Example of Morphological Computation: Simple, Flexible Locomotion II
• Robot "Puppy" http://www.ifi.unizh.ch/ailab/teaching/semi2004/
Example of a Biological System that uses Morphological Computations
• Task: Fly past an object / above the ground at a safe distance.
• Classical AI approach: complete computation
- Compute object / ground encodings.
- Determine the exact distance to the object / ground.
- Compute the desired heading direction dependent on distance.
- … but how can an insect compute all this?
 Isn't this way too complicated?
 Is it possible to encode this in an evolutionary manner in neurons?
 Does an insect learn all this?
Morphological Computations by the Eyes of an Insect
• Systematic morphology of insect eyes
• Optical flow is approximately constant when passing an object / the floor
• The facets of the eyes are very suitably distributed
• Modeling approach:
- "Eyebot" by Lichtensteiger & Eggenberger: an evolutionary algorithm evolved a similar eye structure in robots.
• (Lichtensteiger, L. & Eggenberger, P. (1999). Evolving the morphology of a compound eye on a robot. Advanced Mobile Robots (Eurobot '99), Third European Workshop, 127-134. 10.1109/EURBOT.1999.827631.)
Sensorimotor coupling
• Keep the optical flow at a constant rate:
- As a result, objects / the floor are kept at a certain distance.
• Analyze the optical flow expansion:
- When the expansion reaches a very high rate, landing is imminent (project the legs towards the point of expansion).
• Similar, more advanced couplings in the frog:
- Food = small, quickly, irregularly moving optical flow (e.g., flies)
- Danger = large, dark, expanding optical flow
Braitenberg Vehicles
Valentino Braitenberg (1984) Vehicles: Experiments in Synthetic Psychology
• Direct coupling of sensory signals to motor activities.
• Consequence (depending on the wiring):
- Move towards a light source
- Avoid a light source
Example: Ants
• Probabilistically:
- Put down pheromones
- Follow pheromones
• Carry food back to the nest with a homing algorithm.
• The shortest path receives more pheromones faster.
• Result: the shortest paths to food sources develop in an emergent manner.
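The emergent shortest-path effect can be sketched in a toy simulation. This is an illustrative model only (names and parameters are made up for the example), not the full ant-colony-optimization algorithm: ants pick a path in proportion to its pheromone level, shorter round trips deposit pheromone more often per unit time, and evaporation keeps the system adaptive.

```python
import random

def simulate_ants(steps=2000, evaporation=0.01, seed=0):
    """Two paths to one food source; return final pheromone levels."""
    rng = random.Random(seed)
    pher = {"short": 1.0, "long": 1.0}
    length = {"short": 1, "long": 3}   # round-trip duration in steps
    for _ in range(steps):
        # Choose a path in proportion to its pheromone level.
        total = pher["short"] + pher["long"]
        path = "short" if rng.random() < pher["short"] / total else "long"
        # A shorter round trip means more deposits per unit time.
        pher[path] += 1.0 / length[path]
        # Evaporation slowly forgets old trails.
        for p in pher:
            pher[p] *= (1.0 - evaporation)
    return pher

result = simulate_ants()
print(result["short"] > result["long"])  # True: the short path dominates
```

No ant computes a path length; the preference for the shortest path emerges from the positive feedback between choice and deposit rate.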
MIND THE “FRAME OF REFERENCE PROBLEM”!
The "Frame of Reference Problem"
• Example: Simon's ant on a beach:
- An ant moves about on the beach.
• Different perspectives:
- Observer perspective: The ant follows a complex path, avoiding obstacles / seemingly wanting to go somewhere.
→ automatic (false) inference of a complex (planning) mechanism.
- Perspective of the ant: It follows simple sensorimotor couplings and a few heuristics.
→ The ant does not really plan at all!
Robots are Pushing the Blocks Together (duration 20min)
"Swiss Robots" (Didabot)
ftp://ftp.ifi.unizh.ch/pub/institute/ailab/techreports/html/95.09/index.html
• Observer thinks: The robots are pushing the blocks together.
• Robot thinks: Nothing!
• Robot control:
- Braitenberg coupling between distance sensors and wheels.
Human Interpretation of "Reality"
• Based on Fritz Heider & Marianne Simmel (1944, an experimental study of attributions when observing the movements of simple geometric figures)
• http://www.youtube.com/watch?v=cYAMPsALf-E&feature=player_embedded
Frame of Reference Problem: Consequence
• Beware of your interpretation of reality!
• We humans are complex, anticipatory, actively interpreting systems!
• These interpretations depend on our (physical, biological, bodily, social, & cultural) experiences of reality!
So what about the Mind ?
• Reflexes / sensorimotor couplings are not enough!
• Nonetheless, cognition (particularly predictions & anticipations) must develop from sensorimotor experiences.
[Slide diagram: human cognition layered on top of reflexes, which couple sensors to motors]