Back to Basics: Classification and Inference Based on Input Feedback Structure


Back to Basics: Classification and Inference Based on Input Feedback Structure

Tsvi Achler, Eyal Amir

Department of Computer Science

University of Illinois at Urbana-Champaign

AI -> AGI

• Ability to generalize
  – Even if only learned basics
  – Training distribution ≠ test distribution

• Avoid Combinatorial Explosion
  – Allows complex networks

New Basic Computational Structure

• Based on massive feedback to inputs

• No emphasis on weight parameters

• Input Feedback during testing

[Diagram: output nodes Y1–Y4 and input nodes I1–I4 with their connections; feedforward connections to the output nodes are positive, feedback connections back onto the inputs are negative]

Avoids Combinatorial Explosion via Simple Connectivity

[Diagram: node Y2 and its connections to inputs x1–x4]
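To make the connectivity claim concrete, here is a minimal sketch (not from the talk; only Y2 with inputs x1–x4 appears on the slide, the other node and input names are illustrative): each output node is specified only by the set of inputs it uses, and the same sets serve as both the feedforward and the feedback connections, so the number of connections grows with the listed node–input pairs rather than with combinations of inputs.

```python
# Illustrative connectivity for an input-feedback network: each output node
# is defined solely by the set of inputs it covers. The same sets are used
# in both directions (feedforward and feedback), so no extra parameters are
# needed. Y1 and Y3 are hypothetical extras added for illustration.
connections = {
    "Y1": {"x1", "x2"},
    "Y2": {"x1", "x2", "x3", "x4"},
    "Y3": {"x3", "x4"},
}

# The connection count grows linearly with node-input pairs,
# not combinatorially with subsets of inputs.
n_connections = sum(len(inputs) for inputs in connections.values())
print(n_connections)  # 8
```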

Iterative

[Diagram sequence: alternating forward passes (inputs I1, I2 with values x1, x2 driving output nodes Y1, Y2) and back passes (Y1, Y2 feeding back onto I1, I2); panels mark inputs and cells (e.g. C2) as active (1) or inactive (0)]

Graph of Dynamics

[Plot: activity of Y1 and Y2 versus simulation time T (0–5); both curves settle to a steady state]
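To make the iterative forward/back passes and the dynamics plot concrete, here is a minimal Python sketch of the update rule given on the Equations slide at the end (Y_a(t+Δt) = Y_a(t)/n_a · Σ_{i∈N_a} X_i / Σ_{b∈M_i} Y_b(t)). The two-node configuration (Y1 uses x1; Y2 uses x1 and x2), the initial activations, and the epsilon guard are my own illustrative choices.

```python
# Minimal sketch of the input-feedback (regulatory feedback) iteration.
# Connectivity, initial activations, and the epsilon guard are illustrative.
EPS = 1e-9

def simulate(connections, x, steps=200, y0=0.5):
    """Iterate Y_a <- (Y_a / n_a) * sum_{i in N_a} x_i / sum_{b in M_i} Y_b."""
    y = {node: y0 for node in connections}
    # M_i: the output nodes that feed back onto each input i
    feedback = {}
    for node, inputs in connections.items():
        for i in inputs:
            feedback.setdefault(i, set()).add(node)
    for _ in range(steps):
        # back pass: total output activity projected onto each input
        q = {i: sum(y[b] for b in nodes) for i, nodes in feedback.items()}
        # forward pass: each node averages its shunted (inhibited) inputs
        y = {node: (y[node] / len(inputs))
                   * sum(x.get(i, 0.0) / max(q[i], EPS) for i in inputs)
             for node, inputs in connections.items()}
    return y

# Two-node example from the diagrams: Y1 uses x1; Y2 uses x1 and x2.
net = {"Y1": {"x1"}, "Y2": {"x1", "x2"}}
print(simulate(net, {"x1": 1.0, "x2": 0.0}))  # approaches Y1 = 1, Y2 = 0
print(simulate(net, {"x1": 1.0, "x2": 1.0}))  # approaches Y1 = 0, Y2 = 1
```

With both inputs active, the node that accounts for both x1 and x2 suppresses the one that accounts for x1 alone, which is the behaviour the following slides tabulate.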

Resolving Pattern Interactions

Network Configuration
[Diagram: nodes 1, 2 over inputs A, B]

Inputs     Results (Node → Value)
A          1 → 1
A, B       2 → 1

Steady State: (0, ½), (½, ½)

Half Activation Half Response

Cells: 2, 3    Inputs: A, B, C

Inputs     Results (Node → Value)
A          2 → ½
A, B       2 → 1
A, B, C    2, 3 → ¾
B          2, 3 → ¼
B, C       3 → 1
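Individual rows of this table can be checked numerically with the simulate() sketch from the dynamics section, using my reading of the diagram (cell 2 uses inputs A, B; cell 3 uses B, C):

```python
# Uses the simulate() sketch defined after the Graph of Dynamics slide.
# Cell-to-input sets are my reading of the slide's diagram.
cells = {"2": {"A", "B"}, "3": {"B", "C"}}
print(simulate(cells, {"A": 1.0}))                      # cell 2 settles near 1/2
print(simulate(cells, {"A": 1.0, "B": 1.0, "C": 1.0}))  # cells 2 and 3 near 3/4
```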

Based on Available Representations

Cells: 1, 2, 3    Inputs: A, B, C

Inputs     Results (Cell → Value)
A          1 → 1
A, B       2 → 1
A, B, C    1, 3 → 1
B, C       3 → 1

‘Binding’: the most efficient configuration of the available representations
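The ‘binding’ row can be checked the same way, again reusing the simulate() sketch and my reading of the diagram (cell 1: input A; cell 2: inputs A, B; cell 3: inputs B, C):

```python
# Uses the simulate() sketch defined after the Graph of Dynamics slide.
# Cell-to-input sets are my reading of the slide's diagram.
cells = {"1": {"A"}, "2": {"A", "B"}, "3": {"B", "C"}}
# Presenting A, B and C together should settle on cells 1 and 3 ('binding'),
# the most efficient covering of the inputs, with cell 2 suppressed.
print(simulate(cells, {"A": 1.0, "B": 1.0, "C": 1.0}, steps=5000))
# settles gradually toward roughly {'1': ~1, '2': ~0, '3': ~1}
```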

Can be Chained Ad Infinitum

[Diagram: the same configuration chained to arbitrarily many nodes (1, 2, 3, …, N) and inputs (A, B, C, …), indicating the scheme can be extended indefinitely]

New data: Recognize Scene When Trained on Individuals

• Teach single letters

• Test multiple simultaneous letters

• A scene is beyond the training distribution

Feature Extraction

• Bag-of-features

[Diagram: example features 1 … n, each mapped to an input value x1 … xn; 512 features in total]
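The transcript does not say how the 512 features were computed, so the following is only a generic, hypothetical bag-of-features placeholder: each image is summarized by a fixed-length vector of feature activations x1 … xn, and a multi-letter scene is (hypothetically) formed by superimposing the vectors of the individual letters.

```python
import numpy as np

N_FEATURES = 512  # feature count mentioned on the slide

def bag_of_features(feature_ids):
    """Hypothetical placeholder: count occurrences of each feature type 0..511."""
    x = np.zeros(N_FEATURES)
    for fid in feature_ids:  # feature_ids would come from some extractor (unspecified)
        x[fid] += 1
    return x

def superimpose(*letter_vectors):
    """Hypothetical scene composition: element-wise max of the letters' vectors."""
    return np.maximum.reduce(letter_vectors)
```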

Two Stimuli Simultaneously

[Bar chart: percentage of two-letter combinations (A, B) by number of letters correctly classified (0/2, 1/2, 2/2), comparing IFN, NN, KNN, and SVM; y-axis 0–100%]

Four Stimuli Simultaneously

[Bar chart: percentage of four-letter combinations (A, B, C, D) by number of letters correctly classified (0/4 through 4/4), comparing IFN, NN, KNN, and SVM; y-axis 0–100%]

Difficulty

• Nonlinear equations
  – Can’t mathematically prove general properties

Steps Towards AGI

• Generalize Outside Training Distribution

• Structure Avoids Combinatorial Explosion

Acknowledgements

Cyrus Omar

National Geospatial-Intelligence Agency HM1582-06--BAA-0001

Equations

Feedback (total output activity projected back onto input b):

Q_b(t) = \sum_{j \in M_b} Y_j(t)

Inhibition (the raw input value X_b is shunted, i.e. divided, by its feedback):

I_b(t) = \frac{X_b}{Q_b(t)}

Activation (each output node a averages its n_a inputs):

Y_a(t + \Delta t) = \frac{Y_a(t)}{n_a} \sum_{i \in N_a} I_i(t)

Combined:

Y_a(t + \Delta t) = \frac{Y_a(t)}{n_a} \sum_{i \in N_a} \frac{X_i}{\sum_{j \in M_i} Y_j(t)}

Here N_a is the set of inputs connected to node a, n_a = |N_a|, and M_i is the set of output nodes connected to input i.
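As a worked check of the combined equation (my own derivation, not on the slides), take the two-node configuration from the Resolving Pattern Interactions slide: node 1 uses input A (n_1 = 1), node 2 uses inputs A and B (n_2 = 2). At steady state Y_a(t + \Delta t) = Y_a(t). Presenting A alone (X_A = 1, X_B = 0):

Y_1 = Y_1 \cdot \frac{X_A}{Y_1 + Y_2} \;\Rightarrow\; Y_1 + Y_2 = 1 \quad (\text{for } Y_1 > 0)

Y_2 = \frac{Y_2}{2}\left(\frac{X_A}{Y_1 + Y_2} + \frac{X_B}{Y_2}\right) = \frac{Y_2}{2} \;\Rightarrow\; Y_2 = 0,\; Y_1 = 1

which matches the table row A → node 1 → 1. With both inputs present (X_A = X_B = 1), the same substitution forces Y_1 = 0 and Y_2 = 1, matching A, B → node 2 → 1.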