Artificial Neural Network

Neural Networks and Fuzzy Logic

Transcript of Artificial Neural Network

Page 1: Artificial Neural Network

Neural Networks and Fuzzy Logic

Page 2: Artificial Neural Network

Neural networks and fuzzy logic are two complementary technologies

Neural networks can learn from data and feedback

– It is difficult to develop insight into the meaning associated with each neuron and each weight

– Viewed as a “black box” approach (we know what the box does, but not how it is done conceptually!)

Page 3: Artificial Neural Network

There are two ways to adjust the weights using backpropagation:

– Online/pattern mode: adjusts the weights based on the error signal of one input-output pair in the training data.

• Example: for a training set containing 500 input-output pairs, this mode of BP adjusts the weights 500 times each time the algorithm sweeps through the training set. If the algorithm converges after 1000 sweeps, each weight is adjusted a total of 500,000 times.

Page 4: Artificial Neural Network

– Batch mode (off-line): adjusts weights based on the error signal of the entire training set.

• Weights are adjusted only once, after all the training data have been processed by the neural network.

• From the previous example, each weight in the neural network is adjusted 1000 times.
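As a rough sketch (all data, parameter values, and function names below are illustrative, not from the slides), the difference between the two modes can be shown for a single linear neuron trained by gradient descent on squared error:

```python
import numpy as np

# Illustrative sketch: online (per-pattern) vs. batch weight updates for a
# single linear neuron y = w * x with squared-error loss. The data, learning
# rate, and epoch count are made up for the example.
rng = np.random.default_rng(0)
X = rng.normal(size=20)
T = 3.0 * X                        # targets generated by a true weight of 3

def online_epoch(w, lr=0.05):
    # Online mode: one weight adjustment per input-output pair
    # (20 updates per sweep through this training set).
    for x, t in zip(X, T):
        w += lr * (t - w * x) * x
    return w

def batch_epoch(w, lr=0.05):
    # Batch mode: error signals accumulated over the entire training set,
    # then a single weight adjustment per sweep.
    grad = sum((t - w * x) * x for x, t in zip(X, T))
    return w + lr * grad / len(X)

w_on = w_b = 0.0
for _ in range(200):
    w_on = online_epoch(w_on)
    w_b = batch_epoch(w_b)
print(w_on, w_b)   # both approach the true weight 3
```

Both modes converge here; online mode simply performs many more (smaller) adjustments per sweep.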

Page 5: Artificial Neural Network

Fuzzy rule-based models are easy to comprehend (they use linguistic terms and the structure of if-then rules)

Unlike neural networks, fuzzy logic does not come with a learning algorithm

– Learning and identification of fuzzy models need to adopt techniques from other areas

Since neural networks can learn, it is natural to marry the two technologies.

Page 6: Artificial Neural Network

Neuro-fuzzy systems can be classified into three categories:

1. A fuzzy rule-based model constructed using a supervised NN learning technique

2. A fuzzy rule-based model constructed using reinforcement-based learning

3. A fuzzy rule-based model constructed using a NN to construct its fuzzy partition of the input space

Page 7: Artificial Neural Network

A class of adaptive networks that are functionally equivalent to fuzzy inference systems.

ANFIS architectures representing both the Sugeno and Tsukamoto fuzzy models

Page 8: Artificial Neural Network
Page 9: Artificial Neural Network
Page 10: Artificial Neural Network

Assume two inputs x and y and one output z.

Rule 1: If x is A1 and y is B1, then f1 = p1x + q1y + r1

Rule 2: If x is A2 and y is B2, then f2 = p2x + q2y + r2

Page 11: Artificial Neural Network

Every node i in this layer is an adaptive node with node function

O1,i = μAi(x), for i = 1, 2, or
O1,i = μBi−2(y), for i = 3, 4

where x (or y) is the input to node i and Ai (or Bi−2) is a linguistic label. O1,i is the membership grade of a fuzzy set; it specifies the degree to which the given input x (or y) satisfies the quantifier Ai (or Bi−2).

Page 12: Artificial Neural Network

Typically, the membership function of a fuzzy set can be any parameterized membership function, such as the triangular, trapezoidal, Gaussian, or generalized bell function.

Parameters in this layer are referred to as antecedent parameters.
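A minimal sketch of two such parameterized membership functions (the function names and parameter values below are illustrative; the parameters c, sigma, a, b play the role of the antecedent parameters tuned during learning):

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    # Gaussian membership function: peaks at 1 when x == c,
    # with spread controlled by sigma.
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def gbell_mf(x, a, b, c):
    # Generalized bell membership function: width a, slope b, center c.
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

print(gaussian_mf(0.0, 0.0, 1.0))   # 1.0 at the center
print(gbell_mf(0.0, 2.0, 4.0, 0.0)) # 1.0 at the center
print(gaussian_mf(1.0, 0.0, 1.0))   # membership grade falls off away from c
```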

Page 13: Artificial Neural Network

Every node i in this layer is a fixed node labeled Π, whose output is the product of all the incoming signals:

O2,i = wi = μAi(x) · μBi(y), i = 1, 2

(more generally, any T-norm operator, such as min, can model the AND). Each node output represents the firing strength of a rule.

Page 14: Artificial Neural Network

Every node in this layer is a fixed node labeled N. The ith node calculates the ratio of the ith rule’s firing strength to the sum of all rules’ firing strengths:

O3,i = w̄i = wi / (w1 + w2), i = 1, 2 (normalized firing strengths)

Page 15: Artificial Neural Network

Every node i in this layer is an adaptive node with node function

O4,i = w̄i fi = w̄i (pi x + qi y + ri)

where w̄i is the normalized firing strength from layer 3 and {pi, qi, ri} is this node’s parameter set, referred to as the consequent parameters.

Page 16: Artificial Neural Network

The single node in this layer is a fixed node labeled Σ, which computes the overall output as the summation of all incoming signals:

O5,1 = Σi w̄i fi = (Σi wi fi) / (Σi wi)
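The five layers above can be chained into a single forward pass. The sketch below assumes Gaussian membership functions and made-up antecedent and consequent parameter values; in ANFIS these would be tuned by learning:

```python
import numpy as np

# Illustrative forward pass through the five ANFIS layers for the
# two-rule first-order Sugeno model above. All parameter values are
# invented for the example.

def gaussian_mf(x, c, sigma):
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

x, y = 3.0, 5.0

# Layer 1: membership grades (antecedent parameters c, sigma assumed).
mA = [gaussian_mf(x, 2.0, 1.5), gaussian_mf(x, 6.0, 1.5)]
mB = [gaussian_mf(y, 4.0, 2.0), gaussian_mf(y, 8.0, 2.0)]

# Layer 2: firing strengths via the product T-norm.
w = [mA[0] * mB[0], mA[1] * mB[1]]

# Layer 3: normalized firing strengths (they sum to 1).
wn = [wi / (w[0] + w[1]) for wi in w]

# Layer 4: weighted rule outputs f_i = p_i*x + q_i*y + r_i
# (consequent parameters p, q, r assumed).
p, q, r = [1.0, 0.5], [2.0, 1.0], [0.0, 1.0]
f = [p[i] * x + q[i] * y + r[i] for i in range(2)]

# Layer 5: overall output as the sum of the weighted rule outputs.
z = wn[0] * f[0] + wn[1] * f[1]
print(z)   # a convex combination of f1 and f2
```

Because the normalized firing strengths sum to 1, the output z always lies between the two rule outputs f1 and f2.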

Page 17: Artificial Neural Network

ANFIS architecture for the Sugeno fuzzy model in which weight normalization is performed at the very last layer

Page 18: Artificial Neural Network

Equivalent ANFIS architecture using the Tsukamoto fuzzy model

Page 19: Artificial Neural Network