CS 416 Artificial Intelligence
Lecture 17: First-Order Logic
Chapter 9
Guest Speaker
Topics in Optimal Control, Minimax Control, and Game Theory
March 28th, 2 p.m., OLS 005
Onesimo Hernandez-Lerma
Department of Mathematics, CINVESTAV-IPN, Mexico City
This is a nontechnical introduction, mainly through examples, to some recent topics in control and game theory, including adaptive control, minimax control (a.k.a. "worst-case control" or "games against nature"), partially observable systems (a.k.a. controlled "hidden Markov models"), cooperative and noncooperative game equilibria, etc.
Inference in first-order logic
Our goal is to prove that the KB entails a fact, α. We use logical inference:
• Forward chaining
• Backward chaining
• Resolution
All three inference systems rely on search to find a sequence of inference steps that derives the query (in resolution's case, by deriving the empty clause)
Search and forward chaining
Start with a KB full of first-order definite clauses
• A disjunction of literals with exactly one positive literal
– Equivalent to an implication with a conjunction of positive literals on the left (antecedent / body / premise) and one positive literal on the right (consequent / head / conclusion)
– Propositional logic used Horn clauses, which permit zero or one positive literal
• Look for rules whose premises are satisfied (using substitution to make matches) and add their conclusions to the KB
Search and forward chaining
• Which rules have premises that are satisfied (modus ponens)?
– A ^ E => C… nope
– B ^ D => E… yes
– E ^ C ^ G ^ H => I… nope
– A ^ E => C… yes
– E ^ C ^ G ^ H => I… nope; one more try… yes
Breadth First
• A, B, D, G, H
• A ^ E => C
• B ^ D => E
• E ^ C ^ G ^ H => I
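The breadth-first run above can be sketched in code. This is a minimal propositional sketch of my own (the function name and rule encoding are illustrative, not from the slides): facts are strings, and each definite clause is a (premises, conclusion) pair.

```python
def forward_chain(facts, rules, query):
    """Forward chaining: repeatedly fire any rule whose premises are all
    known, adding its conclusion to the KB, until the query is derived
    or no rule adds anything new."""
    known = set(facts)
    changed = True
    while changed:
        if query in known:
            return True
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and premises <= known:
                known.add(conclusion)
                changed = True
    return query in known

# The slide's example: B^D fires E, then A^E fires C, then E^C^G^H fires I.
rules = [({"A", "E"}, "C"), ({"B", "D"}, "E"), ({"E", "C", "G", "H"}, "I")]
print(forward_chain({"A", "B", "D", "G", "H"}, rules, "I"))  # True
```

This version simply sweeps the rule list repeatedly, which behaves breadth-first over derivable facts.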
Search and forward chaining
Would other search methods work?
• Yes; this technique falls in the standard domain of all searches
Search and backward chaining
Start with a KB full of implications
• Find all implications whose conclusion matches the query
• Add the unknown premises to the fringe list
– Premises can be added to the front or rear of the fringe (depth-first or breadth-first)
Search and backward chaining
• Are all the premises of I satisfied? No
– For each of (C E G H), are their premises satisfied?
C? No; put its premises on the fringe
– For each of (A and E), are their premises satisfied?
A… yes
E… no; add its premises, B and D
B… yes
D… yes
– E, G, H… yes
Depth First
• A, B, D, G, H
• A ^ E => C
• B ^ D => E
• C ^ E ^ G ^ H => I
Search and backward chaining
• Are all the premises of I satisfied? No
– For each of (C E G H), are their premises satisfied?
C? No; put its premises on the end of the fringe
– E? No; put its premises on the end of the fringe
– G, H… yes
– Are C’s premises (A E) satisfied? A… yes; E… no, add its premises
– Are E’s premises (B D) satisfied? Yes
– Return to C and I
Breadth First
• A, B, D, G, H
• A ^ E => C
• B ^ D => E
• C ^ E ^ G ^ H => I
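The two walkthroughs above can be condensed into a recursive backward-chaining check; this is an illustrative propositional sketch (names and encoding are mine), where a goal holds if it is a known fact or some rule concluding it has provable premises.

```python
def backward_chain(query, facts, rules, seen=frozenset()):
    """Backward chaining: prove the query by finding a rule that concludes
    it and recursively proving that rule's premises."""
    if query in facts:
        return True
    if query in seen:              # guard against circular rule chains
        return False
    for premises, conclusion in rules:
        if conclusion == query and all(
            backward_chain(p, facts, rules, seen | {query}) for p in premises
        ):
            return True
    return False

facts = {"A", "B", "D", "G", "H"}
rules = [({"A", "E"}, "C"), ({"B", "D"}, "E"), ({"C", "E", "G", "H"}, "I")]
print(backward_chain("I", facts, rules))  # True
```

Whether unproven premises are explored immediately (depth-first, as this recursion does) or queued (breadth-first) is exactly the fringe-ordering choice the slides mention.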
Backward/forward chaining
Don’t explicitly tie the search method to the chaining direction
Inference with resolution
• We put each first-order sentence into conjunctive normal form
– We remove quantifiers
– We make each sentence a disjunction of literals (each literal is implicitly universally quantified)
• We show KB ^ ~α is unsatisfiable by deriving the empty clause
– The resolution inference rule is our method: keep resolving until the empty clause is reached
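The refutation loop just described can be sketched for the propositional case (the clause encoding and function names are my own illustration): clauses are sets of string literals with `~` marking negation, and we resolve until the empty clause appears or no new clauses are generated.

```python
def resolve(c1, c2):
    """All resolvents of two clauses: for each complementary literal pair,
    drop the pair and union what remains."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return resolvents

def unsatisfiable(clauses):
    """Saturate the clause set under resolution; True iff the empty
    clause (a contradiction) is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:           # empty clause derived
                        return True
                    new.add(r)
        if new <= clauses:              # no progress: satisfiable
            return False
        clauses |= new

# KB = {A, A => B}; query B. Add the negated query ~B and refute:
kb = [frozenset({"A"}), frozenset({"~A", "B"})]
print(unsatisfiable(kb + [frozenset({"~B"})]))  # True: KB entails B
```

In the first-order case, finding complementary literals additionally requires unification, as the following slides illustrate.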
Resolution
Look for matching sentences
• A shared literal with opposite sign
– Substitution may be required
• [Animal(F(x)) V Loves(G(x), x)] and [~Loves(u, v) V ~Kills(u, v)]
– F(x) = an animal unloved by x
– G(x) = someone who loves x
Resolution
What does this mean in English?
• [Animal(F(x)) V Loves(G(x), x)]
– F(x) = an animal unloved by x
– G(x) = someone who loves x
– For all people, either a person fails to love some animal or someone loves that person
• [~Loves(u, v) V ~Kills(u, v)]
– For all u and v, either u does not love v or u does not kill v: nobody kills anyone they love
Resolution
• [Animal(F(x)) V Loves(G(x), x)] and [~Loves(u, v) V ~Kills(u, v)]
– Loves and ~Loves cancel with the substitution u/G(x), v/x
• Resolvent clause
– [Animal(F(x)) V ~Kills(G(x), x)]
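The substitution u/G(x), v/x is found by unification. A bare-bones unifier of my own (convention: lowercase strings are variables, tuples like ("G", "x") stand for G(x), and the occur check is omitted for brevity) reproduces it:

```python
def is_var(t):
    # Convention for this sketch: lowercase strings are variables,
    # tuples are compound terms like ("G", "x"), i.e. G(x).
    return isinstance(t, str) and t[0].islower()

def unify(x, y, s):
    """Return a substitution (dict) unifying x with y, extending s;
    None on failure. No occur check, for brevity."""
    if s is None:
        return None
    if x == y:
        return s
    if is_var(x):
        return unify_var(x, y, s)
    if is_var(y):
        return unify_var(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            s = unify(a, b, s)
        return s
    return None

def unify_var(v, t, s):
    if v in s:
        return unify(s[v], t, s)
    return {**s, v: t}

# Loves(u, v) against Loves(G(x), x):
print(unify(("Loves", "u", "v"), ("Loves", ("G", "x"), "x"), {}))
# {'u': ('G', 'x'), 'v': 'x'} -- i.e. the substitution u/G(x), v/x
```

Applying the resulting substitution to the remaining literals yields exactly the resolvent [Animal(F(x)) V ~Kills(G(x), x)].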
Resolution example
Inference with resolution
What resolves with what for a proof?
• Unit preference
– Start with single-literal sentences (unit clauses) and resolve them with more complicated sentences
– Every such resolution reduces the size of the resolvent by one, consistent with our goal of finding a sentence of size 0 (the empty clause)
– Resembles forward chaining
Inference with resolution
What resolves with what for a proof?
• Set of support
– Build a special set of sentences
– Every resolution includes one sentence from the set; the new resolvent is added to the set
– Resembles backward chaining if the set of support is initialized with the negated query
Theorem provers
Logical inference is a powerful way to “reason” automatically
• A prover should be independent of KB syntax
• A prover should use a control strategy that is fast
• A prover can support a human by
– Checking a proof and filling in the voids
– The person can kill off the search even though the problem is only semi-decidable
Practical theorem provers
• Boyer-Moore
– First rigorous proof of Gödel’s Incompleteness Theorem
• OTTER
– Solved several open questions in combinatorial logic
• EQP
– Solved the Robbins algebra problem, proving that a proposed set of axioms suffices for Boolean algebra
– Posed in 1933 and solved in 1997 after eight days of computation
Practical theorem provers
Verification and synthesis of hardware and software
• Software
– Verification: a program’s output is correct for all inputs
– Synthesis: there exists a program, P, that satisfies a specification
• Hardware
– Verification: the interactions between signals and circuits are robust (will the CPU work in all conditions?)
– Synthesis: there exists a circuit, C, that satisfies a specification
Statistical Learning Methods
Chapter 20
• Statistical learning (Bayes, maximum likelihood)
• Hidden variables (expectation maximization, Markov models)
• Instance-based (nearest neighbor)
• Neural networks
Rational agents
Up until now
• Many rules were available, and rationality was piecing rules together to accomplish a goal
– Inference and deduction
Now
• Lots of data is available (cause/effect pairs), and rationality is improving performance with data
– Model building, generalization, prediction
How early will my son be born?
Logic from first principles
• I think he will be born tomorrow
– 20 literals corresponding to 20 dates
– Well-fed(mom(x)) => late(x)
– late(x) ^ impatient(father(x)) => thisWeekend(x)
– late(x) ^ impatient(mother(x)) => tomorrow(x)
– …
How early will my son be born?
Statistical Learning
• Histogram of births
• Data from the family tree
• Multidimensional correlations between earliness and ethnicity
• …
Function Approximator
Build a function that maps input to output
• Start with a model of the function
• Use statistics to set the values of its coefficients
– Pick m and b such that the line f(x) = mx + b minimizes the sum of distances between each observed (x, y) and (x, f(x))
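For the line f(x) = mx + b, the standard closed form chooses m and b to minimize the sum of squared vertical distances between the observed y values and f(x) (a slightly sharper criterion than "sum of distances"). A sketch:

```python
def fit_line(xs, ys):
    """Least-squares line fit: m is cov(x, y) / var(x), and b makes the
    line pass through the mean point (mean_x, mean_y)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

print(fit_line([0, 1, 2, 3], [1, 3, 5, 7]))  # (2.0, 1.0): data lie on y = 2x + 1
```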
Slightly more complicated
Parabola
• Select a, b, c
• Goal is y – ax^2 – bx – c = 0
– If we have three points and three unknowns, we can solve exactly
– If we have more points, we must use another technique
f(x) = ax^2 + bx + c
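With exactly three points, the fit is a 3x3 linear system in the unknowns a, b, c; one way to sketch the exact solve is Cramer's rule (the helper names here are mine):

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def fit_parabola(points):
    """Solve a*x^2 + b*x + c = y exactly for three (x, y) points."""
    A = [[x * x, x, 1.0] for x, _ in points]   # coefficient matrix
    y = [py for _, py in points]               # right-hand side
    D = det3(A)
    def replaced(j):                           # A with column j swapped for y
        return [[y[i] if k == j else A[i][k] for k in range(3)] for i in range(3)]
    return tuple(det3(replaced(j)) / D for j in range(3))

a, b, c = fit_parabola([(0, 1), (1, 2), (2, 5)])
print(a, b, c)  # recovers the coefficients of y = x^2 + 1
```

With more than three points, no parabola passes through them all exactly, which is why an error-minimizing technique such as least squares is needed instead.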
Mappings
These function approximators are mappings
• They map inputs to outputs
– We hope the outputs match similar observations
• The mappings become better with more information
This is what neural networks do
• But the beauty of neural networks is in how they do what they do
Neural Networks
• Biologically inspired
– We have neurons in our bodies that transmit signals based on their inputs
Internal dynamics depend on chemical gradients
Connections between neurons are important
– They tolerate noisy input
– They tolerate partial destruction
– They perform distributed computation
Neural Networks
Synthetic
• A neural network unit accepts a vector as input and generates a scalar output dependent on an activation function
• Links within the network are controlled through weights
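A single unit as just described, sketched with a sigmoid activation function (one common choice; the slides do not fix a particular activation):

```python
import math

def unit(inputs, weights, bias):
    """One neural-network unit: a weighted sum of the input vector plus a
    bias, squashed to a scalar output by a sigmoid activation function."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

print(unit([1.0, 0.5], [2.0, -1.0], 0.0))  # sigmoid(1.5) ≈ 0.8176
```

The weights on the links are what learning adjusts; chaining such units into layers gives the networks of Chapter 20.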