Formal Languages Abcd

From Wikipedia, the free encyclopedia

Contents

1 Abstract family of acceptors
   1.1 Formal definitions
      1.1.1 AFA Schema
      1.1.2 Abstract family of acceptors
   1.2 Informal discussion
      1.2.1 AFA Schema
      1.2.2 Abstract family of acceptors
   1.3 Results from AFL theory
   1.4 Origins
   1.5 References

2 Abstract family of languages
   2.1 Formal definitions
   2.2 Some families of languages
   2.3 Origins
   2.4 Notes
   2.5 References

3 Abstract rewriting system
   3.1 Definition
   3.2 Example 1
   3.3 Basic notions
   3.4 Normal forms and the word problem
   3.5 Joinability and the Church-Rosser property
   3.6 Notions of confluence
   3.7 Termination and convergence
   3.8 Notes
   3.9 Further reading
   3.10 External links

4 Abstract semantic graph
   4.1 See also
   4.2 References
   4.3 External links

5 Abstract syntax tree
   5.1 Application in compilers
      5.1.1 Motivation
      5.1.2 Design
      5.1.3 Design patterns
      5.1.4 Usage
   5.2 See also
   5.3 References
   5.4 Further reading
   5.5 External links

6 Action algebra
   6.1 Definition
   6.2 Examples
   6.3 See also
   6.4 References

7 Adaptive grammar
   7.1 Overview
      7.1.1 Early history
      7.1.2 Collaborative efforts
      7.1.3 Terminology and taxonomy
   7.2 Adaptive formalisms in the literature
      7.2.1 Adaptive grammar formalisms
      7.2.2 Adaptive machine formalisms
   7.3 See also
   7.4 References and notes
   7.5 External links

8 Affix grammar
   8.1 Example
   8.2 Types of affix grammars
   8.3 See also
   8.4 References

9 Agent Communications Language
   9.1 References

10 Algorithmic learning theory
   10.1 Distinguishing Characteristics
   10.2 Learning in the limit
   10.3 Other Identification Criteria

   10.4 See also
   10.5 References
   10.6 External links

11 Alphabet (formal languages)
   11.1 See also
   11.2 References
   11.3 Literature

12 Ambiguous grammar
   12.1 Examples
      12.1.1 Trivial language
      12.1.2 Unary string
      12.1.3 Addition and subtraction
      12.1.4 Dangling else
   12.2 Recognizing ambiguous grammars
   12.3 Inherently ambiguous languages
   12.4 See also
   12.5 References
   12.6 External links

13 Antimatroid
   13.1 Definitions
   13.2 Examples
   13.3 Paths and basic words
   13.4 Convex geometries
   13.5 Join-distributive lattices
   13.6 Supersolvable antimatroids
   13.7 Join operation and convex dimension
   13.8 Enumeration
   13.9 Applications
   13.10 Notes
   13.11 References

14 Aperiodic finite state automaton
   14.1 Properties
   14.2 References

15 Arden's Rule
   15.1 Background
   15.2 Statement of Arden's rule
   15.3 See also
   15.4 Notes
   15.5 References

16 Attribute grammar
   16.1 Example
   16.2 Synthesized attributes
   16.3 Inherited attributes
   16.4 Special types of attribute grammars
   16.5 See also
   16.6 External links

17 Augmented Backus-Naur Form
   17.1 Introduction
   17.2 Terminal values
   17.3 Operators
      17.3.1 White space
      17.3.2 Comment
      17.3.3 Concatenation
      17.3.4 Alternative
      17.3.5 Incremental alternatives
      17.3.6 Value range
      17.3.7 Sequence group
      17.3.8 Variable repetition
      17.3.9 Specific repetition
      17.3.10 Optional sequence
      17.3.11 Operator precedence
      17.3.12 Core rules
   17.4 Example
   17.5 Pitfalls
   17.6 See also
   17.7 References

18 Backus-Naur Form
   18.1 History
   18.2 Introduction
   18.3 Example
   18.4 Further examples
   18.5 Variants
   18.6 See also
      18.6.1 Software using BNF
   18.7 References
   18.8 External links
      18.8.1 Language grammars

19 Bigram
   19.1 Applications
   19.2 Bigram frequency in the English language
   19.3 See also
   19.4 References

20 Boolean grammar
   20.1 References
   20.2 See also
   20.3 External links

21 Brzozowski derivative
   21.1 Derivative of a regular expression
   21.2 Properties
   21.3 References

22 Categorial grammar
   22.1 Basics
   22.2 Lambek calculus
      22.2.1 Relation to context-free grammars
      22.2.2 Notation
   22.3 Historical notes
   22.4 Some definitions
   22.5 Refinements of categorial grammar
      22.5.1 Features and subcategories
      22.5.2 Function composition
      22.5.3 Conjunction
      22.5.4 Discontinuity
   22.6 See also
   22.7 References
   22.8 Further reading
   22.9 External links

23 Chomsky hierarchy
   23.1 Formal grammars
   23.2 The hierarchy
      23.2.1 Summary
   23.3 References

24 Chomsky normal form
   24.1 Converting a grammar to Chomsky normal form
      24.1.1 START: Eliminate the start symbol from right-hand sides
      24.1.2 TERM: Eliminate rules with nonsolitary terminals

      24.1.3 BIN: Eliminate right-hand sides with more than 2 nonterminals
      24.1.4 DEL: Eliminate ε-rules
      24.1.5 UNIT: Eliminate unit rules
      24.1.6 Order of transformations
   24.2 Example
   24.3 Alternative definition
      24.3.1 Chomsky reduced form
      24.3.2 Floyd normal form
   24.4 Application
   24.5 See also
   24.6 Notes
   24.7 References
   24.8 Further reading

25 Chomsky-Schützenberger enumeration theorem
   25.1 Statement
   25.2 Usage
      25.2.1 Asymptotic estimates
      25.2.2 Inherent ambiguity
   25.3 References

26 Chomsky-Schützenberger representation theorem
   26.1 References

27 Closest string
   27.1 Formal definition
   27.2 Motivation
   27.3 Simplifications and data reductions
      27.3.1 Normalizing the input
   27.4 Approximability
   27.5 Fixed-parameter tractability
   27.6 Relations to other problems
   27.7 References

28 Compact semigroup
   28.1 Examples
   28.2 Properties
   28.3 Varieties
   28.4 References

29 Compiler Description Language
   29.1 Design
   29.2 Use

   29.3 References
   29.4 Further reading

30 Concatenation
   30.1 Syntax
   30.2 Implementation
   30.3 Concatenation of sets of strings
   30.4 Algebraic properties
   30.5 Applications
      30.5.1 Audio/telephony
      30.5.2 Database theory
   30.6 References

31 Cone (formal languages)
   31.1 Definition
   31.2 Relation to Transducers
   31.3 See also
   31.4 Notes
   31.5 References
   31.6 External links

32 Conference on Implementation and Application of Automata
   32.1 Topics of the Conference
   32.2 Recent History of the Conference
   32.3 See also
   32.4 References
   32.5 External links

33 Conjunctive grammar
   33.1 References
   33.2 External links

34 Context change potential
   34.1 See also
   34.2 Notes and references
   34.3 Bibliography
   34.4 External links

35 Context-free grammar
   35.1 Background
   35.2 Formal definitions
      35.2.1 Production rule notation
      35.2.2 Rule application
      35.2.3 Repetitive rule application
      35.2.4 Context-free language
      35.2.5 Proper CFGs
      35.2.6 Example
   35.3 Examples
      35.3.1 Well-formed parentheses
      35.3.2 Well-formed nested parentheses and square brackets
      35.3.3 A regular grammar
      35.3.4 Matching pairs
      35.3.5 Algebraic expressions
      35.3.6 Further examples
      35.3.7 Derivations and syntax trees
   35.4 Normal forms
   35.5 Closure properties
   35.6 Decidable problems
   35.7 Undecidable problems
      35.7.1 Universality
      35.7.2 Language equality
      35.7.3 Language inclusion
      35.7.4 Being in a lower or higher level of the Chomsky hierarchy
      35.7.5 Grammar ambiguity
      35.7.6 Language disjointness
   35.8 Extensions
   35.9 Subclasses
   35.10 Linguistic applications
   35.11 See also
      35.11.1 Parsing algorithms
   35.12 Notes
   35.13 References

36 Context-free language
   36.1 Examples
   36.2 Languages that are not context-free
   36.3 Closure properties
      36.3.1 Nonclosure under intersection, complement, and difference
   36.4 Decidability properties
   36.5 Parsing
   36.6 See also
   36.7 Notes
   36.8 References

37 Context-sensitive grammar
   37.1 Formal definition
   37.2 Examples
   37.3 Kuroda normal form
   37.4 Properties and uses
      37.4.1 Equivalence to linear bounded automaton
      37.4.2 Closure properties
      37.4.3 Computational problems
      37.4.4 As model of natural languages
   37.5 See also
   37.6 Notes
   37.7 References
   37.8 Further reading
   37.9 External links

38 Context-sensitive language
   38.1 Computational properties
   38.2 Examples
   38.3 Properties of context-sensitive languages
   38.4 See also
   38.5 References

39 Controlled grammar
   39.1 Control by prescribed sequences
      39.1.1 Language controlled grammars
      39.1.2 Matrix grammars
      39.1.3 Vector grammars
      39.1.4 Programmed grammars
   39.2 Control by context conditions
      39.2.1 Conditional grammars
      39.2.2 Semi-conditional grammars
      39.2.3 Random context grammars
      39.2.4 Ordered grammars
   39.3 Grammars with parallelism
      39.3.1 Indian parallel grammars
      39.3.2 K-grammars
      39.3.3 Russian parallel grammars
      39.3.4 Scattered context grammars
   39.4 References

40 Convolution (computer science)
   40.1 Example
   40.2 Definition
   40.3 In programming languages

  • x CONTENTS

    40.4 Language comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13540.5 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13540.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135

    41 Critical exponent of a word 13741.1 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13741.2 Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13741.3 Repetition threshold . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13741.4 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13741.5 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13841.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138

    42 Cross-serial dependencies 13942.1 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13942.2 Why languages containing cross-serial dependencies are non-context-free . . . . . . . . . . . . . . 14042.3 Treatment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14042.4 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140

    43 Definite clause grammar 14143.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14143.2 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14243.3 Translation into definite clauses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142

    43.3.1 Difference lists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14243.4 Non-context-free grammars . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14243.5 Representing features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14243.6 Parsing with DCGs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14343.7 Other uses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14443.8 Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14443.9 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14443.10Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14443.11External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145

    44 DershowitzManna ordering 14644.1 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146

    45 Descriptional Complexity of Formal Systems 14745.1 Topics of the workshop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14745.2 Significance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14845.3 History of the workshop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14845.4 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14945.5 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14945.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14945.7 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150

  • CONTENTS xi

    46 Descriptive interpretation 15146.1 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15146.2 Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152

    47 Deterministic context-free grammar 15347.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15347.2 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15347.3 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153

    48 Deterministic context-free language 15548.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15548.2 Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15548.3 Importance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15548.4 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15648.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156

    49 Deterministic pushdown automaton 15749.1 Formal definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15749.2 Languages recognized . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15849.3 Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158

    49.3.1 Closure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15849.3.2 Equivalence problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158

    49.4 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15849.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15849.6 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159

    50 diff utility 16050.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16050.2 Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16150.3 Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16150.4 Variations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161

    50.4.1 Edit script . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16150.4.2 Context format . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16250.4.3 Unified format . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16250.4.4 Others . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163

    50.5 Free file comparison tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16450.6 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16450.7 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16550.8 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166

    51 Discontinuous-constituent phrase structure grammar 16751.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16751.2 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167

  • xii CONTENTS

    51.3 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168

    52 Dyck language 16952.1 Formal definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16952.2 Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16952.3 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17052.4 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171

    53 Semigroup with involution 17253.1 Formal definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17253.2 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17253.3 Basic concepts and properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173

    53.3.1 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17353.4 Notions of regularity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173

    53.4.1 Regular *-semigroups (Nordahl & Scheiblich) . . . . . . . . . . . . . . . . . . . . . . . . 17453.4.2 *-regular semigroups (Drazin) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174

    53.5 Free semigroup with involution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17553.6 Baer *-semigroups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176

    53.6.1 Examples and applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17653.7 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17653.8 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17653.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17753.10Text and image sources, contributors, and licenses . . . . . . . . . . . . . . . . . . . . . . . . . . 178

    53.10.1 Text . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17853.10.2 Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18253.10.3 Content license . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183

  • Chapter 1

    Abstract family of acceptors

    An abstract family of acceptors (AFA) is a grouping of generalized acceptors. Informally, an acceptor is a device with a finite state control, a finite number of input symbols, and an internal store with a read and write function. Each acceptor has a start state and a set of accepting states. The device reads a sequence of symbols, transitioning from state to state for each input symbol. If the device ends in an accepting state, the device is said to accept the sequence of symbols. A family of acceptors is a set of acceptors with the same type of internal store. The study of AFA is part of AFL (abstract families of languages) theory.[1]

    1.1 Formal definitions

    1.1.1 AFA Schema

    An AFA Schema is an ordered 4-tuple (Γ, I, f, g), where

    1. Γ and I are nonempty abstract sets.

    2. f is the write function: f : Γ* × I → Γ* ∪ {∅} (N.B. * is the Kleene star operation).

    3. g is the read function, a mapping from Γ* into the finite subsets of Γ*, such that g(ε) = {ε} and ε is in g(γ) if and only if γ = ε. (N.B. ε is the empty word).

    4. For each γ in g(Γ*), there is an element 1_γ in I satisfying f(γ′, 1_γ) = γ′ for all γ′ such that γ is in g(γ′).

    5. For each u in I, there exists a finite set Γᵤ ⊆ Γ, such that if Γ₁ ⊆ Γ, γ is in Γ₁*, and f(γ, u) ≠ ∅, then f(γ, u) is in (Γ₁ ∪ Γᵤ)*.

    1.1.2 Abstract family of acceptors

    An abstract family of acceptors (AFA) is an ordered pair (Ω, D) such that:

    1. Ω is an ordered 6-tuple (K, Σ, Γ, I, f, g), where

    (a) (Γ, I, f, g) is an AFA schema; and
    (b) K and Σ are infinite abstract sets.

    2. D is the family of all acceptors D = (K₁, Σ₁, δ, q₀, F), where

    (a) K₁ and Σ₁ are finite subsets of K and Σ respectively, F ⊆ K₁, and q₀ is in K₁; and
    (b) δ (called the transition function) is a mapping from K₁ × (Σ₁ ∪ {ε}) × g(Γ*) into the finite subsets of K₁ × I such that the set G_D = {γ | δ(q, a, γ) ≠ ∅ for some q and a} is finite.


    For a given acceptor, let ⊢ be the relation on K₁ × Σ₁* × Γ* defined by: for a in Σ₁ ∪ {ε}, (p, aw, γ) ⊢ (p′, w, γ′) if there exist γ̄ and u such that γ̄ is in g(γ), (p′, u) is in δ(p, a, γ̄), and f(γ, u) = γ′. Let ⊢* denote the transitive closure of ⊢.

    Let (Ω, D) be an AFA and D = (K₁, Σ₁, δ, q₀, F) be in D. Define L(D) to be the set {w ∈ Σ₁* | ∃q ∈ F. (q₀, w, ε) ⊢* (q, ε, ε)}. For each subset E of D, let L(E) = {L(D) | D ∈ E}.

    Define Lf(D) to be the set {w ∈ Σ₁* | (∃q ∈ F)(∃γ ∈ Γ*). (q₀, w, ε) ⊢* (q, ε, γ)}. For each subset E of D, let Lf(E) = {Lf(D) | D ∈ E}.

    1.2 Informal discussion

    1.2.1 AFA Schema

    An AFA schema defines a store, or memory, with read and write functions. The symbols in Γ are called storage symbols and the symbols in I are called instructions. The write function f returns a new storage state given the current storage state and an instruction. The read function g returns the current state of memory. Condition (3) ensures the empty storage configuration is distinct from other configurations. Condition (4) requires there be an identity instruction that allows the state of memory to remain unchanged while the acceptor changes state or advances the input. Condition (5) ensures that the set of storage symbols for any given acceptor is finite.

    1.2.2 Abstract family of acceptors

    An AFA is the set of all acceptors over a given pair of state and input alphabets which have the same storage mechanism defined by a given AFA schema. The ⊢ relation defines one step in the operation of an acceptor. Lf(D) is the set of words accepted by acceptor D by having the acceptor enter an accepting state. L(D) is the set of words accepted by acceptor D by having the acceptor simultaneously enter an accepting state and have an empty storage. The abstract acceptors defined by AFA are generalizations of other types of acceptors (e.g. finite state automata, pushdown automata, etc.). They have a finite state control like other automata, but their internal storage may vary widely from the stacks and tapes used in classical automata.
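    To make the abstract description concrete, the following sketch models one classical instance of such an acceptor: a finite state control plus a stack as the internal store, accepting balanced parentheses by simultaneously reaching an accepting state with empty storage, in the spirit of L(D) above. All names here are illustrative, not part of the formal definition; the control needs only a single state for this language.

    ```python
    def accepts_balanced(word):
        """Accept balanced parentheses: accepting state + empty storage."""
        state = "q0"   # start state; also the sole accepting state here
        store = []     # internal store (a stack of storage symbols)
        for symbol in word:
            if symbol == "(":
                store.append("A")    # write: push a storage symbol
            elif symbol == ")":
                if not store:        # read finds empty storage: reject
                    return False
                store.pop()          # write: pop the top symbol
            else:
                return False         # symbol outside the input alphabet
        # accept iff in an accepting state with empty storage (cf. L(D))
        return state == "q0" and not store

    print(accepts_balanced("(()())"))  # True
    print(accepts_balanced("(()"))     # False: storage not empty at the end
    ```

    Dropping the empty-storage requirement in the final test would give acceptance by final state alone, mirroring the distinction between L(D) and Lf(D).
    
    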

    1.3 Results from AFL theory

    The main result from AFL theory is that a family of languages L is a full AFL if and only if L = L(D) for some AFA (Ω, D). Equally important is the result that L is a full semi-AFL if and only if L = Lf(D) for some AFA (Ω, D).

    1.4 Origins

    Seymour Ginsburg of the University of Southern California and Sheila Greibach of Harvard University first presented their AFL theory paper at the IEEE Eighth Annual Symposium on Switching and Automata Theory in 1967.[2]

    1.5 References

    [1] Seymour Ginsburg, Algebraic and automata theoretic properties of formal languages, North-Holland, 1975, ISBN 0-7204-2506-9.

    [2] IEEE conference record of 1967 Eighth Annual Symposium on Switching and Automata Theory: papers presented at the Eighth Annual Symposium, University of Texas, October 18–20, 1967.

  • Chapter 2

    Abstract family of languages

    In computer science, in particular in the field of formal language theory, the term abstract family of languages refers to an abstract mathematical notion generalizing characteristics common to the regular languages, the context-free languages, the recursively enumerable languages, and other families of formal languages studied in the scientific literature.

    2.1 Formal definitions

    A formal language is a set L for which there exists a finite set of abstract symbols Σ such that L ⊆ Σ*, where * is the Kleene star operation.

    A family of languages is an ordered pair (Σ, Λ), where

    1. Σ is an infinite set of symbols;

    2. Λ is a set of formal languages;

    3. For each L in Λ there exists a finite subset Σ₁ ⊆ Σ such that L ⊆ Σ₁*; and

    4. L ≠ ∅ for some L in Λ.

    A trio is a family of languages closed under ε-free homomorphism, inverse homomorphism, and intersection with a regular language.

    A full trio, also called a cone, is a trio closed under arbitrary homomorphism.

    A (full) semi-AFL is a (full) trio closed under union.

    A (full) AFL is a (full) semi-AFL closed under concatenation and the Kleene plus.
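    The closure operations in these definitions can be illustrated on small finite languages, represented below as Python sets of strings. This is only a sketch: real AFLs contain infinite languages, so the Kleene plus here is truncated at a length bound, and the homomorphism is given as an explicit symbol-to-string table.

    ```python
    def concat(L1, L2):
        """Concatenation of two languages: {uv | u in L1, v in L2}."""
        return {u + v for u in L1 for v in L2}

    def kleene_plus(L, max_len):
        """Kleene plus of L, truncated to words of length <= max_len."""
        result, layer = set(L), set(L)
        while True:
            layer = {w for w in concat(layer, L) if len(w) <= max_len}
            if layer <= result:
                return result
            result |= layer

    def homomorphism(L, h):
        """Apply a symbol-to-string homomorphism h to every word of L."""
        return {"".join(h[c] for c in w) for w in L}

    L = {"ab", "c"}
    print(L | {"d"})                    # union (semi-AFL closure)
    print(concat(L, L))                 # concatenation (AFL closure)
    print(kleene_plus({"a"}, 3))        # {'a', 'aa', 'aaa'}
    print(homomorphism(L, {"a": "x", "b": "yy", "c": "z"}))  # {'xyy', 'z'}
    ```
    
    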

    2.2 Some families of languages

    The following are some simple results from the study of abstract families of languages.[1][2]

    Within the Chomsky hierarchy, the regular languages, the context-free languages, and the recursively enumerable languages are all full AFLs. However, the context-sensitive languages and the recursive languages are AFLs, but not full AFLs, because they are not closed under arbitrary homomorphisms.

    The family of regular languages is contained within any cone (full trio). Other categories of abstract families are identifiable by closure under other operations such as shuffle, reversal, or substitution.[3]


    2.3 Origins

    Seymour Ginsburg of the University of Southern California and Sheila Greibach of Harvard University presented the first AFL theory paper at the IEEE Eighth Annual Symposium on Switching and Automata Theory in 1967.[4]

    2.4 Notes

    [1] Ginsburg (1975)

    [2] Mateescu, A.; Salomaa, A. (2001), "Abstract family of languages", in Hazewinkel, Michiel, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

    [3] Păun, Gh. (2001), "AFL operations", in Hazewinkel, Michiel, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

    [4] Ginsburg & Greibach (1967)

    2.5 References

    Ginsburg, Seymour; Greibach, Sheila (1967). "Abstract Families of Languages". Conference Record of 1967 Eighth Annual Symposium on Switching and Automata Theory, 18–20 October 1967, Austin, Texas, USA. IEEE. pp. 128–139.

    Seymour Ginsburg, Algebraic and automata theoretic properties of formal languages, North-Holland, 1975, ISBN 0-7204-2506-9.

    John E. Hopcroft and Jeffrey D. Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley Publishing, Reading, Massachusetts, 1979. ISBN 0-201-02988-X. Chapter 11: Closure properties of families of languages.

    Mateescu, Alexandru; Salomaa, Arto (1997). "Chapter 4: Aspects of Classical Language Theory". In Rozenberg, Grzegorz; Salomaa, Arto. Handbook of Formal Languages. Volume I: Word, language, grammar. Springer-Verlag. pp. 175–252. ISBN 3-540-61486-9.

  • Chapter 3

    Abstract rewriting system

    In mathematical logic and theoretical computer science, an abstract rewriting system (also (abstract) reduction system or abstract rewrite system; abbreviation ARS) is a formalism that captures the quintessential notion and properties of rewriting systems. In its simplest form, an ARS is simply a set (of "objects") together with a binary relation, traditionally denoted with →; this definition can be further refined if we index (label) subsets of the binary relation. Despite its simplicity, an ARS is sufficient to describe important properties of rewriting systems like normal forms, termination, and various notions of confluence.

    Historically, there have been several formalizations of rewriting in an abstract setting, each with its idiosyncrasies. This is due in part to the fact that some notions are equivalent; see below in this article. The formalization that is most commonly encountered in monographs and textbooks, and which is generally followed here, is due to Gérard Huet (1980).[1]

    3.1 Definition

    An abstract reduction system (abbreviated ARS) is the most general (unidimensional) notion about specifying a set of objects and rules that can be applied to transform them. More recently, authors use the term abstract rewriting system as well.[2] (The preference for the word "reduction" here instead of "rewriting" constitutes a departure from the uniform use of "rewriting" in the names of systems that are particularizations of ARS. Because the word "reduction" does not appear in the names of more specialized systems, in older texts reduction system is a synonym for ARS.)[3]

    An ARS is a set A, whose elements are usually called objects, together with a binary relation on A, traditionally denoted by →, and called the reduction relation, rewrite relation[4] or just reduction.[5] This (entrenched) terminology using "reduction" is a little misleading, because the relation is not necessarily reducing some measure of the objects.

    In some contexts it may be beneficial to distinguish between some subsets of the rules, i.e. some subsets of the reduction relation →, e.g. the entire reduction relation may consist of associativity and commutativity rules. Consequently, some authors define the reduction relation → as the indexed union of some relations; for instance if → = →₁ ∪ →₂, the notation used is (A, →₁, →₂).

    As a mathematical object, an ARS is exactly the same as an unlabeled state transition system, and if the relation is considered as an indexed union, then an ARS is the same as a labeled state transition system with the indices being the labels. The focus of the study, and the terminology, are different however. In a state transition system one is interested in interpreting the labels as actions, whereas in an ARS the focus is on how objects may be transformed (rewritten) into others.[6]

    3.2 Example 1

    Suppose the set of objects is T = {a, b, c} and the binary relation is given by the rules a → b, b → a, a → c, and b → c. Observe that these rules can be applied to both a and b to get c. Note also that c is, in a sense, a "simplest" object in the system, since nothing can be applied to c to transform it any further. Such a property is clearly an important one.
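    The system of Example 1 is small enough to encode directly; a minimal sketch, with the relation represented as a set of (source, target) pairs:

    ```python
    # The ARS from Example 1: rules a -> b, b -> a, a -> c, b -> c.
    rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

    def successors(x):
        """All objects reachable from x in exactly one rewrite step."""
        return {t for (s, t) in rules if s == x}

    print(successors("a"))  # {'b', 'c'}: a rewrites to b or to c
    print(successors("c"))  # set(): nothing applies to c
    ```

    An object with no successors, like c here, is exactly the "simplest" kind of object the text describes; the next sections name it a normal form.
    
    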


    3.3 Basic notions

    Example 1 leads us to define some important notions in the general setting of an ARS. First we need some basic notions and notations.[7]

    →* is the transitive closure of → ∪ =, where = is the identity relation, i.e. →* is the smallest preorder (reflexive and transitive relation) containing →. It is also called the reflexive transitive closure of →.

    ↔ is → ∪ →⁻¹, that is, the union of the relation → with its inverse relation, also known as the symmetric closure of →.

    ↔* is the transitive closure of ↔ ∪ =, that is, ↔* is the smallest equivalence relation containing →. It is also known as the reflexive transitive symmetric closure of →.
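    For finite relations these closures can be computed directly; a sketch that builds the reflexive transitive closure →* by iterating relational composition to a fixpoint, and obtains ↔* by first taking the symmetric closure (this naive fixpoint loop is only meant for small finite relations):

    ```python
    def reflexive_transitive_closure(rel, universe):
        """Smallest preorder on `universe` containing `rel` (i.e. ->*)."""
        closure = set(rel) | {(x, x) for x in universe}  # add identity =
        while True:
            # compose the relation with itself and keep any new pairs
            new = {(x, z) for (x, y) in closure for (y2, z) in closure if y == y2}
            if new <= closure:
                return closure
            closure |= new

    def symmetric_closure(rel):
        """Union of `rel` with its inverse relation (i.e. <->)."""
        return set(rel) | {(y, x) for (x, y) in rel}

    A = {"a", "b", "c"}
    R = {("a", "b"), ("b", "c")}
    print(reflexive_transitive_closure(R, A))
    # the reflexive transitive symmetric closure <->* combines both:
    print(reflexive_transitive_closure(symmetric_closure(R), A))
    ```

    On R = {a → b, b → c} the first call adds the identity pairs and (a, c); the second, applied to the symmetric closure, yields the full equivalence relation relating all three objects.
    
    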

    3.4 Normal forms and the word problem

    Main article: Normal form (abstract rewriting)

    An object x in A is called reducible if there exists some other y in A such that x → y; otherwise it is called irreducible or a normal form. An object y is called a normal form of x if x →* y and y is irreducible. If x has a unique normal form, then this is usually denoted with x↓. In Example 1 above, c is a normal form, and c = a↓ = b↓. If every object has at least one normal form, the ARS is called normalizing.

    One of the important problems that may be formulated in an ARS is the word problem: given x and y, are they equivalent under ↔*? This is a very general setting for formulating the word problem for the presentation of an algebraic structure. For instance, the word problem for groups is a particular case of an ARS word problem. Central to an "easy" solution for the word problem is the existence of unique normal forms: in this case, if two objects have the same normal form, then they are equivalent under ↔*. The word problem for an ARS is undecidable in general.
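    In a finite system like Example 1, normal forms can be computed by collecting every object reachable under →* and keeping the irreducible ones. A sketch for finite systems only; note the a → b → a cycle is handled because reachability is tracked as a set rather than by naive recursion:

    ```python
    # The Example 1 system again: a -> b, b -> a, a -> c, b -> c.
    rules = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

    def reachable(x):
        """All objects y with x ->* y, by breadth-first search."""
        seen, frontier = {x}, {x}
        while frontier:
            frontier = {t for (s, t) in rules if s in frontier} - seen
            seen |= frontier
        return seen

    def normal_forms(x):
        """Irreducible objects reachable from x."""
        reducible = {s for (s, t) in rules}
        return {y for y in reachable(x) if y not in reducible}

    print(normal_forms("a"))  # {'c'}
    print(normal_forms("b"))  # {'c'}: a and b share the unique normal form c
    ```

    Since a and b have the same unique normal form c, they are equivalent under ↔*, matching c = a↓ = b↓ in the text.
    
    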

    3.5 Joinability and the Church–Rosser property

    A related, but weaker, notion than the existence of normal forms is that of two objects being joinable: x and y are said to be joinable if there exists some z with the property that x →* z *← y. From this definition, it's apparent one may define the joinability relation as →* ∘ *←, where ∘ is the composition of relations. Joinability is usually denoted, somewhat confusingly, also with ↓, but in this notation the down arrow is a binary relation, i.e. we write x↓y if x and y are joinable.

    An ARS is said to possess the Church–Rosser property if and only if x ↔* y implies x↓y for all objects x, y. Equivalently, the Church–Rosser property means that the reflexive transitive symmetric closure is contained in the joinability relation. Alonzo Church and J. Barkley Rosser proved in 1936 that lambda calculus has this property;[8] hence the name of the property.[9] (The fact that lambda calculus has this property is also known as the Church–Rosser theorem.) In an ARS with the Church–Rosser property the word problem may be reduced to the search for a common successor. In a Church–Rosser system, an object has at most one normal form; that is, the normal form of an object is unique if it exists, but it may well not exist. In lambda calculus for instance, the expression (λx.xx)(λx.xx) does not have a normal form because there exists an infinite sequence of beta reductions (λx.xx)(λx.xx) → (λx.xx)(λx.xx) → ...[10]

    3.6 Notions of confluence

    See also: Confluence (abstract rewriting)

    Various properties, simpler than Church-Rosser, are equivalent to it. The existence of these equivalent propertiesallows one to prove that a system is Church-Rosser with less work. Furthermore, the notions of confluence can be

    https://en.wikipedia.org/wiki/Transitive_closurehttps://en.wikipedia.org/wiki/Identity_relationhttps://en.wikipedia.org/wiki/Preorderhttps://en.wikipedia.org/wiki/Reflexive_relationhttps://en.wikipedia.org/wiki/Transitive_relationhttps://en.wikipedia.org/wiki/Reflexive_transitive_closurehttps://en.wikipedia.org/wiki/Inverse_relationhttps://en.wikipedia.org/wiki/Symmetric_closurehttps://en.wikipedia.org/wiki/Symmetric_closurehttps://en.wikipedia.org/wiki/Transitive_closurehttps://en.wikipedia.org/wiki/Equivalence_relationhttps://en.wikipedia.org/wiki/Reflexive_transitive_symmetric_closurehttps://en.wikipedia.org/wiki/Normal_form_(abstract_rewriting)https://en.wikipedia.org/wiki/Word_problem_(mathematics)https://en.wikipedia.org/wiki/Word_problem_(mathematics)https://en.wikipedia.org/wiki/Word_problem_for_groupshttps://en.wikipedia.org/wiki/Undecidable_problemhttps://en.wikipedia.org/wiki/Composition_of_relationshttps://en.wikipedia.org/wiki/Alonzo_Churchhttps://en.wikipedia.org/wiki/J._Barkley_Rosserhttps://en.wikipedia.org/wiki/Lambda_calculushttps://en.wikipedia.org/wiki/Church-Rosser_theoremhttps://en.wikipedia.org/wiki/Church-Rosser_theoremhttps://en.wikipedia.org/wiki/Beta_reductionhttps://en.wikipedia.org/wiki/Confluence_(abstract_rewriting)

Solving the word problem: deciding if x ↔* y usually requires heuristic search (red, green), while deciding x = y is straightforward (grey). For term rewriting systems the Knuth–Bendix completion algorithm enlarges → to establish unique normal forms, if possible.

defined as properties of a particular object, something that is not possible for Church–Rosser. An ARS (A, →) is said to be,

confluent if and only if for all w, x, and y in A, x *← w →* y implies x↓y. Roughly speaking, confluence says that no matter how two paths diverge from a common ancestor (w), the paths join at some common successor. This notion may be refined as a property of a particular object w, and the system called confluent if all its elements are confluent.

semi-confluent if and only if for all w, x, and y in A, x ← w →* y implies x↓y. This differs from confluence by the single-step reduction from w to x.

locally confluent if and only if for all w, x, and y in A, x ← w → y implies x↓y. This property is sometimes called weak confluence.

Theorem. For an ARS the following three conditions are equivalent: (i) it has the Church–Rosser property, (ii) it is confluent, (iii) it is semi-confluent.[11]

Corollary.[12] In a confluent ARS if x ↔* y then


    If both x and y are normal forms, then x = y.

If y is a normal form, then x →* y.

Because of these equivalences, a fair bit of variation in definitions is encountered in the literature. For instance, in Terese the Church–Rosser property and confluence are defined to be synonymous and identical to the definition of confluence presented here; Church–Rosser as defined here remains unnamed, but is given as an equivalent property; this departure from other texts is deliberate.[13] Because of the above corollary, one may define a normal form y of x as an irreducible y with the property that x ↔* y. This definition, found in Book and Otto, is equivalent to the common one given here in a confluent system, but it is more inclusive in a non-confluent ARS.

Local confluence on the other hand is not equivalent to the other notions of confluence given in this section, but it is strictly weaker than confluence. The typical counterexample is {a → b, b → a, a → x, b → y}, which is locally confluent but not confluent.
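This counterexample can be checked mechanically; the sketch below (helper names are illustrative) encodes the four rules and tests both notions by brute force:

```python
# The counterexample from the text: {a -> b, b -> a, a -> x, b -> y}.
ARS = {"a": {"b", "x"}, "b": {"a", "y"}, "x": set(), "y": set()}

def reachable(s):
    """All t with s ->* t."""
    seen, stack = {s}, [s]
    while stack:
        for t in ARS[stack.pop()]:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def joinable(u, v):
    return bool(reachable(u) & reachable(v))

def locally_confluent():
    # Every one-step peak x <- w -> y must be joinable.
    return all(joinable(x, y)
               for succs in ARS.values()
               for x in succs for y in succs)

def confluent():
    # Every many-step peak x *<- w ->* y must be joinable.
    return all(joinable(x, y)
               for w in ARS
               for x in reachable(w) for y in reachable(w))
```

Here `locally_confluent()` returns True while `confluent()` returns False: the normal forms x and y are both reachable from a, but are not joinable.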

    3.7 Termination and convergence

An abstract rewriting system is said to be terminating or noetherian if there is no infinite chain x0 → x1 → x2 → ⋯. (This is just saying that the rewriting relation is a Noetherian relation.) In a terminating ARS, every object has at least one normal form, thus it is normalizing. The converse is not true. In example 1 for instance, there is an infinite rewriting chain, namely a → b → a → b → ⋯, even though the system is normalizing. A confluent and terminating ARS is called canonical,[14] or convergent. In a convergent ARS, every object has a unique normal form. But it is sufficient for the system to be confluent and normalizing for a unique normal form to exist for every element, as seen in example 1.

Theorem (Newman's Lemma): A terminating ARS is confluent if and only if it is locally confluent.

The original 1942 proof of this result by Newman was rather complicated. It wasn't until 1980 that Huet published a much simpler proof exploiting the fact that when → is terminating we can apply well-founded induction.[15]
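In a terminating ARS, a normal form can be found by simply rewriting until no rule applies; in a convergent system the result does not depend on the order of rule application. A sketch, using an invented toy rule (deleting one of two adjacent equal letters) that is terminating and confluent:

```python
def normal_form(x, step):
    """Apply `step` until it returns None (no rule applies).
    Termination guarantees the loop halts; confluence guarantees the
    answer is independent of which redex `step` picks."""
    while True:
        y = step(x)
        if y is None:
            return x
        x = y

# Toy rule (illustrative): delete one of two adjacent equal letters.
# Terminating, since the string shrinks at every step.
def dedup_step(s):
    for i in range(len(s) - 1):
        if s[i] == s[i + 1]:
            return s[:i] + s[i + 1:]
    return None  # s is a normal form
```

For example, `normal_form("aabba", dedup_step)` rewrites "aabba" → "abba" → "aba", and "aba" is irreducible.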

3.8 Notes

[1] Book and Otto, p. 9

[2] Terese, p. 7

    [3] Book and Otto, p. 10

    [4] Terese, p. 7

    [5] Book and Otto, p. 10

[6] Terese, pp. 7–8

[7] Baader and Nipkow, pp. 8–9

[8] Alonzo Church and J. Barkley Rosser. Some properties of conversion. Trans. AMS, 39:472–482, 1936

    [9] Baader and Nipkow, p. 9

    [10] S.B. Cooper, Computability theory, p. 184

    [11] Baader and Nipkow, p. 11

    [12] Baader and Nipkow, p. 12

[13] Terese, p. 11

[14] David A. Duffy (1991). Principles of Automated Theorem Proving. Wiley. Here: sect. 7.2.1, p. 153

    [15] Harrison, p. 260


3.9 Further reading

Baader, Franz; Nipkow, Tobias (1998). Term Rewriting and All That. Cambridge University Press. A textbook suitable for undergraduates.

Nachum Dershowitz and Jean-Pierre Jouannaud, Rewrite Systems, Chapter 6 in Jan van Leeuwen (Ed.), Handbook of Theoretical Computer Science, Volume B: Formal Models and Semantics, Elsevier and MIT Press, 1990, ISBN 0-444-88074-7, pp. 243–320. The preprint of this chapter is freely available from the authors, but it misses the figures.

Ronald V. Book and Friedrich Otto, String-Rewriting Systems, Springer (1993). Chapter 1, "Abstract reduction systems"

Marc Bezem, Jan Willem Klop, Roel de Vrijer ("Terese"), Term Rewriting Systems, Cambridge University Press, 2003, ISBN 0-521-39115-6, Chapter 1. This is a comprehensive monograph. It uses, however, a fair deal of notations and definitions not commonly encountered elsewhere. For instance, the Church–Rosser property is defined to be identical with confluence.

John Harrison, Handbook of Practical Logic and Automated Reasoning, Cambridge University Press, 2009, ISBN 978-0-521-89957-4, chapter 4 "Equality". Abstract rewriting from the practical perspective of solving problems in equational logic.

Gérard Huet, Confluent Reductions: Abstract Properties and Applications to Term Rewriting Systems, Journal of the ACM (JACM), October 1980, Volume 27, Issue 4, pp. 797–821. Huet's paper established many of the modern concepts, results and notations.

Sinyor, J.; The 3x+1 Problem as a String Rewriting System, International Journal of Mathematics and Mathematical Sciences, Volume 2010 (2010), Article ID 458563, 6 pages.

3.10 External links

Abstract Rewrite Tool. Java applet to analyse abstract rewrite systems.

  • Chapter 4

    Abstract semantic graph

In computer science, an abstract semantic graph (ASG) or term graph is a form of abstract syntax in which an expression of a formal or programming language is represented by a graph whose vertices are the expression's subterms. An ASG is at a higher level of abstraction than an abstract syntax tree (or AST), which is used to express the syntactic structure of an expression or program.

ASGs are more complex and concise than ASTs because they may contain shared subterms (also known as common subexpressions).[1] Abstract semantic graphs are often used as an intermediate representation by compilers to store the results of performing common subexpression elimination upon abstract syntax trees. ASTs are trees and are thus incapable of representing shared terms. ASGs are usually directed acyclic graphs. However, they may be cyclic, particularly in the field of graph rewriting. Cyclic graphs may represent recursive expressions, which are commonly used to express iteration in functional programming languages without looping constructs.

The nomenclature "term graph" is associated with the field of term graph rewriting,[2] which involves the transformation and processing of expressions by the specification of rewriting rules,[3] whereas "abstract semantic graph" is used when discussing linguistics, programming languages, type systems and compilation.

Abstract syntax trees are not capable of representing shared subexpressions due to their simplistic structure; this simplicity comes at a cost of efficiency due to redundant duplicate computations of identical terms. For this reason ASGs are often used as an intermediate language at a subsequent compilation stage to abstract syntax tree construction via parsing.

An abstract semantic graph is typically constructed from an abstract syntax tree by a process of enrichment and abstraction. The enrichment can for example be the addition of back-pointers, edges from an identifier node (where a variable is being used) to a node representing the declaration of that variable. The abstraction can entail the removal of details which are relevant only in parsing, not for semantics.
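The sharing of common subexpressions can be sketched with hash-consing (an assumed construction for illustration, not a quote from any particular compiler): identical subterms are built once and reused, so the tree becomes a graph.

```python
# Hash-consing table: each distinct (operator, children) combination is
# represented by exactly one node object.
_table = {}

def node(op, *children):
    key = (op, tuple(id(c) for c in children))
    if key not in _table:
        _table[key] = (op, children)
    return _table[key]

# Build (a * b) + (a * b): the two occurrences of a * b collapse
# into a single shared vertex, turning the tree into a graph.
a, b = node("a"), node("b")
left = node("*", a, b)
right = node("*", a, b)
expr = node("+", left, right)
assert left is right  # one shared subterm, not a duplicate
```

In an AST, the two multiplications would be distinct subtrees and a compiler would evaluate them twice; in the shared representation they are the same vertex.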

    4.1 See also

    Ontology (computer science)

    Semantic Web

    Semantic Grid

    4.2 References

[1] Garner, Richard (2011). "An abstract view on syntax with sharing". Oxford University Press. doi:10.1093/logcom/exr021. "The notion of term graph encodes a refinement of inductively generated syntax in which regard is paid to the sharing and discard of subterms."

[2] Plump, D. (1999). Ehrig, Hartmut; Engels, G.; Rozenberg, Grzegorz, eds. Handbook of Graph Grammars and Computing by Graph Transformation: applications, languages and tools 2. World Scientific. pp. 9–13. ISBN 9789810228842.



[3] Barendregt, H. P.; van Eekelen, M. C. J. D.; Glauert, J. R. W.; Kennaway, J. R.; Plasmeijer, M. J.; Sleep, M. R. (1987). "Term graph rewriting". PARLE Parallel Architectures and Languages Europe (Lecture Notes in Computer Science) 259: 141–158. doi:10.1007/3-540-17945-3_8.

4.3 External links

Dean, Tom. CPPX - C/C++ Fact Extractor.

Devanbu, Premkumar T.; Rosenblum, David S.; Wolf, Alexander L. Generating Testing and Analysis Tools with Aria.

Mamas, Evan; Kontogiannis, Kostas. Towards Portable Source Code Representations Using XML. CiteSeerX: 10.1.1.88.6173.

Raghavan, Shruti; Rohana, Rosanne; Leon, David; Podgurski, Andy; Augustine, Vinay (2004). Dex: a semantic-graph differencing tool for studying changes in large code bases. IEEE International Conference on Software Maintenance. pp. 188–197. doi:10.1109/icsm.2004.1357803.

  • Chapter 5

    Abstract syntax tree

For the trees used in linguistics, see Concrete syntax tree.

In computer science, an abstract syntax tree (AST), or just syntax tree, is a tree representation of the abstract syntactic structure of source code written in a programming language. Each node of the tree denotes a construct occurring in the source code. The syntax is "abstract" in not representing every detail appearing in the real syntax. For instance, grouping parentheses are implicit in the tree structure, and a syntactic construct like an if-condition-then expression may be denoted by means of a single node with three branches.

This distinguishes abstract syntax trees from concrete syntax trees, traditionally designated parse trees, which are often built by a parser during the source code translation and compiling process. Once built, additional information is added to the AST by means of subsequent processing, e.g., contextual analysis.

Abstract syntax trees are also used in program analysis and program transformation systems.

    5.1 Application in compilers

Abstract syntax trees are data structures widely used in compilers, due to their property of representing the structure of program code. An AST is usually the result of the syntax analysis phase of a compiler. It often serves as an intermediate representation of the program through several stages that the compiler requires, and has a strong impact on the final output of the compiler.

    5.1.1 Motivation

Being the product of the syntax analysis phase of a compiler, the AST has several properties that are invaluable to the further steps of the compilation process.

Compared to the source code, an AST does not include certain elements, such as inessential punctuation and delimiters (braces, semicolons, parentheses, etc.).

A more important difference is that the AST can be edited and enhanced with properties and annotations for every element it contains. Such editing and annotation is impossible with the source code of a program, since it would imply changing it.

At the same time, an AST usually contains extra information about the program, due to the consecutive stages of analysis by the compiler, an example being the position of an element in the source code. This information may be used to notify the user of the location of an error in the code.

ASTs are needed because of the inherent nature of programming languages and their documentation. Languages are often ambiguous by nature. In order to avoid this ambiguity, programming languages are often specified as a context-free grammar (CFG). However, there are often aspects of programming languages that a CFG can't express, but are part of the language and are documented in its specification. These are details that require a context to determine their validity and behaviour. For example, if a language allows new types to be declared, a CFG cannot predict the



[Figure: an abstract syntax tree. A statement sequence contains a while node (condition: a compare node, op ≠, over variable b and constant 0; body: a branch whose condition is a compare, op >, over variables a and b, with an if-body assign a := a − b and an else-body assign b := b − a built from bin op nodes, op −) followed by a return of variable a.]

An abstract syntax tree for the following code for the Euclidean algorithm:

while b ≠ 0
    if a > b
        a := a − b
    else
        b := b − a
return a

names of such types nor the way in which they should be used. Even if a language has a predefined set of types, enforcing proper usage usually requires some context. Another example is duck typing, where the type of an element can change depending on context. Operator overloading is yet another case where correct usage and final function are determined based on the context. Java provides an excellent example, where the '+' operator is both numerical addition and concatenation of strings.
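As a concrete illustration, Python's standard ast module builds exactly this kind of tree; parsing a Python rendering of the Euclidean-algorithm loop shown earlier yields While, Compare and If nodes:

```python
import ast

# Parse a Python rendering of the Euclidean-algorithm loop.
source = """
while b != 0:
    if a > b:
        a = a - b
    else:
        b = b - a
"""
tree = ast.parse(source)
loop = tree.body[0]

# The node kinds mirror the figure: a While node whose test is a
# Compare, and whose body holds an If with then- and else-branches.
assert isinstance(loop, ast.While)
assert isinstance(loop.test, ast.Compare)
assert isinstance(loop.body[0], ast.If)
```

Note how the tree keeps no parentheses, colons or indentation: only the abstract structure of the loop, the comparison and the branch survives.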


Although there are other data structures involved in the inner workings of a compiler, the AST performs a unique function. During the first stage, the syntax analysis stage, a compiler produces a parse tree. This parse tree can be used to perform almost all functions of a compiler by means of syntax-directed translation. Although this method can lead to a more efficient compiler, it goes against the software engineering principles of writing and maintaining programs. Another advantage that the AST has over a parse tree is the size, particularly the smaller height of the AST and the smaller number of elements.

    5.1.2 Design

The design of an AST is often closely linked with the design of a compiler and its expected features. Core requirements include the following:

    Variable types must be preserved, as well as the location of each declaration in source code.

    The order of executable statements must be explicitly represented and well defined.

    Left and right components of binary operations must be stored and correctly identified.

    Identifiers and their assigned values must be stored for assignment statements.

These requirements can be used to design the data structure for the AST.

Some operations will always require two elements, such as the two terms for addition. However, some language constructs require an arbitrarily large number of children, such as argument lists passed to programs from the command shell. As a result, an AST has to also be flexible enough to allow for quick addition of an unknown quantity of children.

Another major design requirement for an AST is that it should be possible to unparse an AST into source code form. The source code produced should be sufficiently similar to the original in appearance and identical in execution, upon recompilation.
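These requirements might be sketched as follows (the node names and the := rendering are illustrative, not a standard design): typed nodes, explicitly ordered statement sequences, labelled binary operands, and an unparse method for round-tripping back to source:

```python
from dataclasses import dataclass

@dataclass
class Var:
    name: str
    def unparse(self):
        return self.name

@dataclass
class BinOp:
    op: str           # operator label
    left: object      # left operand, explicitly identified
    right: object     # right operand, explicitly identified
    def unparse(self):
        return f"({self.left.unparse()} {self.op} {self.right.unparse()})"

@dataclass
class Assign:
    target: Var       # identifier being assigned
    value: object     # assigned expression
    def unparse(self):
        return f"{self.target.unparse()} := {self.value.unparse()}"

@dataclass
class Seq:
    stmts: list       # statement order is explicit in the list
    def unparse(self):
        return "\n".join(s.unparse() for s in self.stmts)

stmt = Assign(Var("a"), BinOp("-", Var("a"), Var("b")))
assert stmt.unparse() == "a := (a - b)"
```

The unparsed output adds parentheses the original may not have had, but denotes the same computation, which is the usual round-tripping requirement.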

    5.1.3 Design patterns

Due to the complexity of the requirements for an AST and the overall complexity of a compiler, it is beneficial to apply sound software development principles. One of these is to use proven design patterns to enhance modularity and ease of development.

Different operations don't necessarily have different types, so it is important to have a sound node class hierarchy. This is crucial in the creation and the modification of the AST as the compiler progresses.

Because the compiler traverses the tree several times to determine syntactic correctness, it is important to make traversing the tree a simple operation. The compiler executes a specific set of operations, depending on the type of each node, upon reaching it, so it often makes sense to use the Visitor pattern.
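A minimal sketch of the Visitor pattern over such nodes (all class names here are invented for illustration): traversal dispatches on the node's type, so new operations can be added without touching the node classes:

```python
class Node:
    def __init__(self, *children):
        self.children = children

class Num(Node):
    def __init__(self, value):
        super().__init__()
        self.value = value

class Add(Node):
    pass

class Visitor:
    def visit(self, node):
        # Dispatch on the node's class name; fall back to a generic walk.
        method = getattr(self, "visit_" + type(node).__name__,
                         self.generic_visit)
        return method(node)

    def generic_visit(self, node):
        for child in node.children:
            self.visit(child)

class Evaluator(Visitor):
    def visit_Num(self, node):
        return node.value

    def visit_Add(self, node):
        return sum(self.visit(c) for c in node.children)

# (1 + 2) + 3
assert Evaluator().visit(Add(Add(Num(1), Num(2)), Num(3))) == 6
```

A type checker or pretty-printer would be another Visitor subclass over the same node hierarchy, which is the modularity the pattern buys.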

    5.1.4 Usage

The AST is used intensively during semantic analysis, where the compiler checks for correct usage of the elements of the program and the language. The compiler also generates symbol tables based on the AST during semantic analysis. A complete traversal of the tree allows verification of the correctness of the program.

After verifying correctness, the AST serves as the base for code generation. The AST is often used to generate an intermediate representation (IR), sometimes called an intermediate language, for the code generation.
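As a sketch of the symbol-table step (a real compiler records far more attributes than this), a single traversal of a Python AST can map each assigned name to the line of its first assignment:

```python
import ast

def build_symbol_table(source):
    """Map each assigned name to the line of its first assignment."""
    table = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name):
                    table.setdefault(target.id, node.lineno)
    return table

assert build_symbol_table("x = 1\ny = x + 2") == {"x": 1, "y": 2}
```

A later semantic-analysis pass could consult this table to flag uses of names that were never assigned.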

5.2 See also

Abstract semantic graph (ASG)

    Composite pattern


    Document Object Model (DOM)

Extended Backus–Naur Form

    Lisp, a family of languages written in trees, with macros to manipulate code trees at compile time

    Semantic resolution tree (RST)

    Shunting yard algorithm

    Symbol table

    TreeDL

    Term graph

5.3 References

This article is based on material taken from the Free On-line Dictionary of Computing prior to 1 November 2008 and incorporated under the "relicensing" terms of the GFDL, version 1.3 or later.

5.4 Further reading

Jones, Joel. Abstract Syntax Tree Implementation Idioms (PDF). (overview of AST implementation in various language families)

Neamtiu, Iulian; Foster, Jeffrey S.; Hicks, Michael (May 17, 2005). Understanding Source Code Evolution Using Abstract Syntax Tree Matching. MSR'05. Saint Louis, Missouri: ACM. CiteSeerX: 10.1.1.88.5815.

Baxter, Ira D.; Yahin, Andrew; Moura, Leonardo; Sant' Anna, Marcelo; Bier, Lorraine (November 16–19, 1998). Clone Detection Using Abstract Syntax Trees (PDF). Proceedings of ICSM'98 (Bethesda, Maryland: IEEE).

Fluri, Beat; Würsch, Michael; Pinzger, Martin; Gall, Harald C. Change Distilling: Tree Differencing for Fine-Grained Source Code Change Extraction (PDF).

Würsch, Michael. Improving Abstract Syntax Tree based Source Code Change Detection (Diploma thesis).

Lucas, Jason. "Thoughts on the Visual C++ Abstract Syntax Tree (AST)".

5.5 External links

AST View: an E