
PATTERN RECOGNITION

Talal A. Alsubaie

SFDA


OUTLINE

• What is a pattern?
• What is a pattern class?
• What is pattern recognition?
• Human perception
• Examples of applications
• The statistical way
• Human and machine perception
• Pattern recognition
• Pattern recognition process
• Case study

WHAT IS A PATTERN?

• A pattern is an abstract object, or a set of measurements describing a physical object.

WHAT IS A PATTERN CLASS?

• A pattern class (or category) is a set of patterns sharing common attributes.

• A collection of "similar" (not necessarily identical) objects.

• During recognition, given objects are assigned to prescribed classes.

WHAT IS PATTERN RECOGNITION?

• Theory, algorithms, and systems for putting patterns into categories.

• Relating a perceived pattern to previously perceived patterns.

• Learning to distinguish patterns of interest from their background.

HUMAN PERCEPTION

• Humans have developed highly sophisticated skills for sensing their environment and taking actions according to what they observe, e.g.:
  – Recognizing a face.
  – Understanding spoken words.
  – Reading handwriting.
  – Distinguishing fresh food by its smell.

• We would like to give similar capabilities to machines.

EXAMPLES OF APPLICATIONS

• Optical Character Recognition (OCR):
  – Handwriting: sorting letters by postal code.
  – Printed text: reading machines for blind people, digitization of text documents.

• Biometrics:
  – Face recognition, verification, and retrieval.
  – Fingerprint recognition.
  – Speech recognition.

• Diagnostic systems:
  – Medical diagnosis: X-ray and EKG (electrocardiograph) analysis.

• Military applications:
  – Automated Target Recognition (ATR).
  – Image segmentation and analysis (recognition from aerial or satellite photographs).

THE STATISTICAL WAY

GRID BY GRID COMPARISON

[Figure: pixel grids of a stored letter "A" compared cell by cell with an input "A" and an input "B".]

Letter "A": 0 0 1 0 0 0 1 0 0 1 1 1 1 0 0 1 1 0 0 1
Letter "A": 0 1 1 0 0 1 1 0 0 1 1 0 1 0 0 1 1 0 0 1
Number of mismatches = 3

Letter "A": 0 0 1 0 0 0 1 0 0 1 1 1 1 0 0 1 1 0 0 1
Letter "B": 1 1 1 0 0 1 0 1 0 1 1 1 0 1 0 1 1 1 1 0
Number of mismatches = 9
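The comparison above can be written in a few lines of code. A minimal Python sketch, assuming each pattern is stored as a flat list of 0/1 cell values; the grids and labels below are illustrative, not the slide's exact data:

```python
# Minimal sketch of grid-by-grid (template) comparison: count the cells
# where a stored template and the input grid disagree, then pick the
# stored pattern with the fewest mismatches.

def mismatches(template, grid):
    """Number of cells where the two binary grids differ."""
    return sum(t != g for t, g in zip(template, grid))

def recognize(grid, templates):
    """Return the label of the stored template closest to the input grid."""
    return min(templates, key=lambda label: mismatches(templates[label], grid))

# Illustrative 4x5 grids flattened to 20 cells (made up for this example).
templates = {
    "A": [0,0,1,0,0, 0,1,0,1,0, 0,1,1,1,0, 1,0,0,0,1],
    "B": [1,1,1,0,0, 1,0,1,0,0, 1,1,1,0,0, 1,1,1,0,0],
}
noisy_a = [0,0,1,0,0, 0,1,0,1,0, 0,1,1,1,0, 1,0,0,1,1]  # an "A" with one flipped cell

print(mismatches(templates["A"], noisy_a))   # -> 1
print(recognize(noisy_a, templates))         # -> A
```

Note that recognize compares the input against every stored template, which is why the cost grows with the number of stored patterns, as the next slide points out.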

PROBLEM WITH GRID BY GRID COMPARISON

• The time to recognize a pattern is proportional to the number of stored patterns (too costly as the number of stored patterns grows).

Solution: Artificial Intelligence

[Figure: example characters to recognize: A-Z, a-z, 0-9, */-+1@#.]

HUMAN AND MACHINE PERCEPTION

• We are often influenced by the knowledge of how patterns are modeled and recognized in nature when we develop pattern recognition algorithms.

• Research on machine perception also helps us gain a deeper understanding of, and appreciation for, pattern recognition systems in nature.

• Yet we also apply many techniques that are purely numerical and do not have any correspondence in natural systems.

PATTERN RECOGNITION

• Two phases: learning and detection.

• The time to learn is higher, e.g., driving a car:
  – Difficult to learn, but once learned it becomes natural.

• Can use AI learning methodologies such as:
  – Neural networks.
  – Machine learning.

LEARNING

• How can a machine learn the rule from data?
  – Supervised learning: a teacher provides a category label or cost for each pattern in the training set.
  – Unsupervised learning: the system forms clusters or natural groupings of the input patterns.

CLASSIFICATION VS. CLUSTERING

• Classification (known categories).
• Clustering (creation of new categories).

[Figure: classification (supervised) assigns samples to the known categories "A" and "B"; clustering (unsupervised classification) forms its own groups.]
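As a rough illustration of the difference, here is a small Python sketch. It assumes scikit-learn and NumPy are available (the slides do not name any library), and the toy feature vectors and labels are made up: a nearest-neighbour classifier is given the category labels, while k-means has to form its own groups.

```python
# Classification (supervised) vs. clustering (unsupervised) on toy 2-D data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans

# Toy feature vectors, e.g. (lightness, width); values are invented.
X = np.array([[1.0, 2.0], [1.2, 1.8], [0.9, 2.1],    # category "A"
              [3.0, 4.0], [3.2, 3.8], [2.9, 4.1]])   # category "B"

# Classification: category labels for the training set are known.
y = ["A", "A", "A", "B", "B", "B"]
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([[1.1, 1.9]]))      # -> ['A']

# Clustering: no labels are given; the algorithm creates its own groups.
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(groups)                         # e.g. [0 0 0 1 1 1]
```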

PATTERN RECOGNITION PROCESS

[Figure: the pattern recognition pipeline: input → sensing → segmentation → feature extraction → classification → post-processing → decision.]

PATTERN RECOGNITION PROCESS (CONT.)

• Data acquisition and sensing:
  – Measurement of physical variables.
  – Important issues: bandwidth, resolution, etc.

• Pre-processing:
  – Removal of noise in the data.
  – Isolation of patterns of interest from the background.

• Feature extraction:
  – Finding a new representation in terms of features.

• Classification:
  – Using features and learned models to assign a pattern to a category.

• Post-processing:
  – Evaluation of confidence in decisions.
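Put together, the process is a chain of stages, each consuming the previous stage's output. The Python sketch below only shows the shape of that chain; every stage body is a placeholder (a real system would contain actual sensing, segmentation, feature extraction, and classification code).

```python
# Skeleton of the pattern recognition process; the stage bodies are stubs.

def sense():                    # data acquisition: read an image, a signal, ...
    return "raw measurement"

def preprocess(raw):            # remove noise, isolate the pattern of interest
    return f"segmented({raw})"

def extract_features(segment):  # describe the pattern with a feature vector
    return [0.7, 4.2]           # e.g. (lightness, width); placeholder values

def classify(features):         # assign the feature vector to a category
    return "salmon" if features[0] > 0.5 else "sea bass"

def postprocess(decision):      # evaluate confidence in the decision, report it
    return {"decision": decision, "confidence": 0.9}

print(postprocess(classify(extract_features(preprocess(sense())))))
```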

CASE STUDY

• Fish classification: sea bass vs. salmon.

• Problem: sorting incoming fish on a conveyor belt according to species.

• Assume that we have only two kinds of fish:
  – Sea bass.
  – Salmon.

[Figure: photographs of a salmon and a sea bass.]

CASE STUDY (CONT.)

• What can cause problems during sensing?
  – Lighting conditions.
  – Position of the fish on the conveyor belt.
  – Camera noise.
  – etc.

• What are the steps in the process?
  1. Capture the image.
  2. Isolate the fish.
  3. Take measurements.
  4. Make a decision.

CASE STUDY (CONT.)

[Figure: the fish classification pipeline: pre-processing → feature extraction → classification → "sea bass" or "salmon".]

CASE STUDY (CONT.)

• Pre-processing:
  – Image enhancement.
  – Separating touching or occluding fish.
  – Finding the boundary of the fish.

HOW TO SEPARATE SEA BASS FROM SALMON?

• Possible features to be used:
  – Length.
  – Lightness.
  – Width.
  – Number and shape of fins.
  – Position of the mouth.
  – Etc.

• Assume a fisherman told us that a "sea bass" is generally longer than a "salmon".

• Even though "sea bass" is longer than "salmon" on average, there are many fish for which this observation does not hold.
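The fisherman's advice amounts to a one-feature decision rule. A minimal Python sketch, where the threshold and the sample lengths are invented for illustration:

```python
# Single-feature rule: decide by length alone.

def classify_by_length(length_cm, threshold_cm=60.0):
    """Call a fish 'sea bass' if it is longer than the threshold, else 'salmon'."""
    return "sea bass" if length_cm > threshold_cm else "salmon"

# Because the length distributions of the two species overlap, any single
# threshold misclassifies some fish, whichever value we pick for it.
for length in (45.0, 58.0, 63.0, 80.0):
    print(length, "->", classify_by_length(length))
```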

HOW TO SEPARATE SEA BASS FROM SALMON?

• To improve recognition, we might have to use more than one feature at a time.
  – Single features might not yield the best performance.
  – Combinations of features might yield better performance.

$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}, \qquad x_1:\ \text{lightness}, \quad x_2:\ \text{width}$$
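With two features the decision rule works on the vector x rather than on a single number. The sketch below separates the classes with a straight line in the (lightness, width) plane; the weights and the direction of the rule are illustrative assumptions, not values from the slides:

```python
# Two-feature rule: a linear boundary w1*x1 + w2*x2 + b = 0 over
# x = [x1 (lightness), x2 (width)]. The weights are made up.

def classify(x, w=(2.0, 1.0), b=-5.0):
    """Return one label on each side of the line."""
    score = w[0] * x[0] + w[1] * x[1] + b
    return "sea bass" if score > 0 else "salmon"

print(classify([1.8, 2.5]))   # score  1.1 -> "sea bass"
print(classify([0.6, 1.2]))   # score -2.6 -> "salmon"
```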

FEATURE SELECTION

[Figure: examples of "good" features vs. "bad" features.]

DECISION BOUNDARY


DECISION BOUNDARY (CONT.)

• More complex models result in more complex boundaries.

DECISION BOUNDARY (CONT.)

• Different criteria lead to different decision boundaries.

DECISION BOUNDARY (CONT.)

• What if a customer finds "sea bass" in their "salmon" can?

• We should also consider the costs of the different errors we make in our decisions.

DECISION BOUNDARY (CONT.)

• For example, suppose the fish packing company knows that:
  – Customers who buy salmon will object vigorously if they see sea bass in their cans.
  – Customers who buy sea bass will not be unhappy if they occasionally see some expensive salmon in their cans.

• How does this knowledge affect our decision?
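One way to make that concrete is to weight the two kinds of error differently and pick the label with the lower expected cost. The sketch below assumes the classifier outputs a probability that the fish is a salmon; the probabilities and cost values are made-up placeholders:

```python
# Cost-sensitive decision: putting sea bass into a "salmon" can is the
# expensive mistake, so we only say "salmon" when we are quite sure.

def decide(p_salmon, cost_bass_in_salmon_can=10.0, cost_salmon_in_bass_can=1.0):
    """Pick the label whose expected cost of being wrong is lower."""
    cost_if_we_say_salmon = (1.0 - p_salmon) * cost_bass_in_salmon_can
    cost_if_we_say_bass = p_salmon * cost_salmon_in_bass_can
    return "salmon" if cost_if_we_say_salmon < cost_if_we_say_bass else "sea bass"

print(decide(p_salmon=0.70))   # -> "sea bass": 70% sure is not sure enough here
print(decide(p_salmon=0.95))   # -> "salmon"
```

With these illustrative costs the decision boundary effectively shifts: "salmon" is chosen only when the probability of salmon exceeds about 10/11.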

CASE STUDY (CONT.)

• Issues with feature extraction:
  – Correlated features do not necessarily improve performance.
  – It might be difficult to extract certain features.
  – It might be computationally expensive to extract many features.
  – Missing features.
  – Domain knowledge.

THE DESIGN CYCLE

• Collect data: collect training and testing data.

• Choose features: domain dependence.

• Choose model: domain dependence.

• Train: supervised or unsupervised learning.

• Evaluate: performance with future data.
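A compact sketch of one pass through this cycle. It assumes scikit-learn and NumPy (not named in the slides), generates synthetic (lightness, width) data in place of real measurements, and picks a decision tree arbitrarily as the model:

```python
# One pass through the design cycle: collect data, choose features,
# choose a model, train, and evaluate on held-out data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# "Collect data": synthetic (lightness, width) measurements for two species.
salmon   = rng.normal(loc=[0.4, 1.0], scale=0.2, size=(50, 2))
sea_bass = rng.normal(loc=[0.8, 1.6], scale=0.2, size=(50, 2))
X = np.vstack([salmon, sea_bass])
y = ["salmon"] * 50 + ["sea bass"] * 50

# "Train" on part of the data, "evaluate" on data the model has never seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```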


DEMO

• Online face detector demo:
  – http://demo.pittpatt.com/index.php

DEMO (CONT.)

• With my friend "Albert Einstein".

VIDEO DEMO


Q & A

THANK YOU