6-Linguistics-Grammar-Kinds-Quantitative-Formal

formal grammar

Grammars {formal grammar} can be quantitative.

Chomsky hierarchy

Language processing and computer coding rely on grammar, which specifies sentence-structure rules. Parsing finds syntactic structure. Generating uses rules to create valid sentences. Syntax has typical sentence structures {regular syntax}, in which words can substitute for one another, and can be recursive {productive syntax}.

grammars

Noam Chomsky defined four generative-grammar classes, with different complexities {Chomsky hierarchy}. Type 0, with highest complexity, is General Grammars, equivalent to Turing Machines. Type 0 grammars are unrestricted: rules can rewrite any symbol series, can contract, and can depend on context. Turing machines read input and write output anywhere on an unbounded tape. Type 1 is Context-Sensitive Grammars, equivalent to Linear Bounded Automata. Type 1 grammars are context-sensitive and non-contracting. Linear bounded automata are Turing machines whose tape is bounded by the input length; they go backward and forward on tape until they find input-string context that tells them what to do next. Type 2 is Context-Free Grammars, equivalent to Pushdown Automata. Type 2 grammars are context-free: rules rewrite one variable at a time, regardless of nearby symbols. Pushdown machines read input and store symbols on a stack, acting on current state, input symbol, and stack top, without going backward or forward on tape. Type 3, with lowest complexity, is Regular Grammars, equivalent to Finite State Automata. Type 3 grammars are regular: linear, non-contracting, with at most one variable per rule. Finite-state machines read input tape, with no storage, changing state on each input symbol.
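The four rule shapes above can be sketched in code. This is a minimal illustration with hypothetical rule sets (uppercase letters as variables, lowercase as constants), checking two of the properties that separate the levels:

```python
# Hypothetical example rule sets for each Chomsky-hierarchy level.
# Rules are (left-hand side, right-hand side) pairs.
examples = {
    0: [("S", "aSb"), ("aXb", "ab"), ("X", "")],  # unrestricted: may contract
    1: [("S", "aSb"), ("aXb", "aab")],            # context-sensitive: non-contracting
    2: [("S", "aSb"), ("S", "ab")],               # context-free: one variable on left
    3: [("S", "aS"), ("S", "a")],                 # regular: right-linear
}

def non_contracting(rules):
    """True if every right side is at least as long as its left side (type 1)."""
    return all(len(rhs) >= len(lhs) for lhs, rhs in rules)

def context_free(rules):
    """True if every left side is a single variable (types 2 and 3)."""
    return all(len(lhs) == 1 and lhs.isupper() for lhs, rhs in rules)
```

For instance, the type 0 set fails `non_contracting` because of the rule X → (empty), while the type 1 set passes; the type 1 set fails `context_free` because a rule's left side mentions the context around X.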

computer language

Computer languages must be deterministic, so that parsing needs only finite look-ahead. Parsing non-deterministic languages requires trying all rules and/or guessing. Most recursive transition networks, whether for ambiguous or unambiguous grammars, are non-deterministic and cannot be mapped to deterministic recursive transition networks. Non-deterministic finite state automata, by contrast, can always be mapped to deterministic finite state automata.
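The last claim is the subset construction: each deterministic state is a set of non-deterministic states. A sketch, using a small hypothetical NFA over {a, b} that accepts strings ending in "ab":

```python
# NFA transitions: (state, symbol) -> set of next states.
nfa_delta = {
    (0, "a"): {0, 1}, (0, "b"): {0},
    (1, "b"): {2},
}
nfa_start, nfa_accept = 0, {2}

def determinize(delta, start, accept):
    """Subset construction: build DFA states as frozensets of NFA states."""
    symbols = {sym for (_, sym) in delta}
    start_set = frozenset({start})
    dfa, seen, todo = {}, {start_set}, [start_set]
    while todo:
        states = todo.pop()
        for sym in symbols:
            target = frozenset(q2 for q in states
                               for q2 in delta.get((q, sym), set()))
            dfa[(states, sym)] = target
            if target not in seen:
                seen.add(target)
                todo.append(target)
    accepting = {s for s in seen if s & accept}
    return dfa, start_set, accepting

def run(dfa, start, accepting, text):
    """Run the DFA: exactly one next state per symbol, no guessing."""
    state = start
    for ch in text:
        state = dfa.get((state, ch), frozenset())
    return state in accepting
```

After determinization, `run` needs no backtracking: each input symbol forces a single transition.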

generative grammar

Grammars can have variables, which can be rewritten and which include the start symbol. Grammars can have constants, which cannot be rewritten. Grammatical symbols are either variables or constants. Grammars have rules that rewrite an existing symbol series as a new variable-and-constant series. Generative grammars use finite sets of variables, constants, and rules.
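A minimal sketch of these pieces, using the hypothetical grammar S → aSb | ab (which generates aⁿbⁿ) and a derivation that rewrites the leftmost variable:

```python
# Finite sets of variables, constants, and rules, plus a start symbol.
variables = {"S"}
constants = {"a", "b"}
rules = {"S": ["aSb", "ab"]}  # variable -> possible replacement strings
start = "S"

def generate(n):
    """Derive a^n b^n: apply S -> aSb (n - 1 times), then S -> ab."""
    form = start
    for _ in range(n - 1):
        i = form.index("S")
        form = form[:i] + "aSb" + form[i + 1:]
    i = form.index("S")
    return form[:i] + "ab" + form[i + 1:]
```

Each step replaces one variable by a rule's right side; the derivation ends when the sentential form contains only constants.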

relation

Grammar relations involve persons or things {direct object} and time. Relations can be states or events. States include knowing, believing, and having, and have experiencing subjects. Events involve agents {subject, relation}, instruments "with", beneficiaries "for", and places "on, in, at, above, below", and include moving and communicating. States and events determine subject phrases. Events determine verb phrases. To communicate, write, or speak involves a recipient "with or to", a language "in", and/or a topic "on or about". To move or walk involves a source "from" and a destination "to".
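These role patterns can be written down as a small lexicon. The inventory below is a hypothetical sketch of the verbs and roles named above, with the prepositions that mark each optional role:

```python
# Hypothetical verb lexicon: states take an experiencing subject;
# events take an agent plus optional roles marked by prepositions.
roles = {
    "know":  {"kind": "state", "subject": "experiencer"},
    "give":  {"kind": "event", "subject": "agent",
              "optional": {"instrument": "with", "beneficiary": "for"}},
    "write": {"kind": "event", "subject": "agent",
              "optional": {"recipient": "to", "language": "in",
                           "topic": "about"}},
    "walk":  {"kind": "event", "subject": "agent",
              "optional": {"source": "from", "destination": "to"}},
}

def preposition(verb, role):
    """Return the preposition marking a role for a verb, or None."""
    return roles[verb].get("optional", {}).get(role)
```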

General Grammar

In type 0 {General Grammar}, rules start with symbol series containing at least one variable, and productions can be unbounded, contracting, and context-sensitive. General Grammars generate the recursively enumerable languages. General Grammars are equivalent to Turing Machines.
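A Turing machine can read and write anywhere on an unbounded tape. A minimal simulator sketch (the machine itself is hypothetical: it flips every bit of its input, then halts at the first blank):

```python
def run_tm(tape):
    """Simulate a tiny Turing machine on an unbounded tape (sparse dict)."""
    tape = dict(enumerate(tape))
    head, state = 0, "scan"
    while state != "halt":
        sym = tape.get(head, "_")   # "_" is the blank symbol
        if sym == "0":
            tape[head] = "1"; head += 1   # write, move right
        elif sym == "1":
            tape[head] = "0"; head += 1
        else:
            state = "halt"          # blank: stop
    return "".join(tape[i] for i in sorted(tape))
```

This particular machine only moves right, but the dict-backed tape lets a machine write on any cell, which is the extra power of type 0 over the bounded and stack-limited machines below.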

Context Sensitive Grammar

In type 1 {Context Sensitive Grammar}, rules start with variables in context, and productions are the same length or longer. Rules depend on nearby symbols. Context-sensitive grammars are equivalent to Linear Bounded Automata {non-deterministic Turing Machine}, which have left and right end markers that have no replacements and so bound strings to the input length. Context-sensitive languages are recursive (decidable). Context-sensitive grammar recognition is PSPACE-complete, so it is decidable but intractable in general. Context-free grammars plus symbol tables can model context-sensitive constraints.
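The canonical context-sensitive language is aⁿbⁿcⁿ, which no context-free grammar can generate. A recognizer sketch standing in for a linear bounded automaton (it needs only the input itself, no extra tape):

```python
def in_anbncn(s):
    """True if s is a^n b^n c^n for some n >= 1."""
    n = len(s) // 3
    return len(s) == 3 * n and n > 0 and s == "a" * n + "b" * n + "c" * n
```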

Context-free Grammar of Chomsky

In type 2 {Context-free Grammar}, rules start with a single variable and produce variable-and-constant series. Variables act as start symbols for grammar subsets. Context-free grammars can generate nested structures such as balanced parentheses. Rules do not depend on nearby symbols. Context-free grammars are equivalent to Recursive Transition Networks, which can refer to other transition networks, including themselves.
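A sketch of the balanced-parentheses case, using the grammar S → ( S ) S | (empty): each call of the recognizer below plays the role of one recursive transition network referring to itself.

```python
def balanced(text):
    """Recursive-descent recognizer for S -> ( S ) S | empty."""
    def parse_s(i):
        # Consume one S starting at position i; return next position, or -1.
        if i < len(text) and text[i] == "(":
            j = parse_s(i + 1)                     # inner S
            if 0 <= j < len(text) and text[j] == ")":
                return parse_s(j + 1)              # trailing S
            return -1                              # unmatched "("
        return i                                   # empty production
    return parse_s(0) == len(text)
```

The recursion depth mirrors the nesting depth, which is exactly the stack a pushdown automaton would use.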

parsing

Top-down parsers start from the start symbol and expand variables by rules to match the input. Bottom-up parsers start with the constants (the input) and reduce them to variables by rules. Tree structures {parse tree, grammar} show how rules apply. Diagrams {sentence diagram, grammar} show sentence structure. Sentences can have more than one parse tree.
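A sketch of the last point: under the hypothetical ambiguous grammar E → E + E | a, the sentence a+a+a has two parse trees, (a+a)+a and a+(a+a). Counting trees over every split point shows this directly:

```python
from functools import lru_cache

def count_parses(tokens):
    """Count parse trees for E -> E + E | a over a token string."""
    tokens = tuple(tokens)

    @lru_cache(maxsize=None)
    def count(i, j):
        # Number of parse trees deriving tokens[i:j] from E.
        if j - i == 1:
            return 1 if tokens[i] == "a" else 0
        total = 0
        for k in range(i + 1, j - 1):     # each "+" splitting E + E
            if tokens[k] == "+":
                total += count(i, k) * count(k + 1, j)
        return total

    return count(0, len(tokens))
```

The counts grow as the Catalan numbers: one tree for a+a, two for a+a+a, five for a+a+a+a.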

ambiguity

No universal algorithm can decide whether a context-free grammar is ambiguous, nor transform an ambiguous context-free grammar into an unambiguous one.

number

Languages can have more than one context-free grammar.

normal form

Context-free grammars can have special forms {normal form, grammar}. In one normal form {Chomsky normal form}, every rule makes one variable into either two variables or one constant, with no empty strings. In another {Greibach normal form}, every rule makes one variable into one constant followed by zero or more variables.
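Chomsky normal form is what makes the CYK algorithm work: because every rule yields two variables or one constant, any parse of a span splits into two smaller spans. A sketch with a hypothetical CNF grammar for aⁿbⁿ (S → A T | A B, T → S B, A → a, B → b):

```python
# CNF grammar tables: binary rules (X, Y) -> variables, unary rules
# constant -> variables.
binary = {("A", "T"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"T"}}
unary = {"a": {"A"}, "b": {"B"}}

def cyk(word, start="S"):
    """CYK recognition: table[i][j] = variables deriving word[i:i+j+1]."""
    n = len(word)
    if n == 0:
        return False
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][0] = set(unary.get(ch, set()))
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            for split in range(1, length):
                for x in table[i][split - 1]:
                    for y in table[i + split][length - split - 1]:
                        table[i][length - 1] |= binary.get((x, y), set())
    return start in table[0][n - 1]
```

The triple loop gives cubic time in the input length, for any grammar already in Chomsky normal form.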

Regular Grammar of Chomsky

In type 3 {Regular Grammar}, rules start with a single variable and produce constant-then-variable series {right linear grammar} or variable-then-constant series {left linear grammar}. Each right side has at most one variable, always at the right or left end; all other symbols are constants. Simple transition networks are equivalent to regular grammars. Finite state automata (FSA) model regular grammars, because they have a start state, a finite number of states, rules that move from one state to another on reading a constant, and a finite set of accepting states. Regular grammars correspond to regular expressions, built from the empty string, single constants, concatenation, alternation, and repetition.
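A sketch of the grammar-to-FSA correspondence, using the hypothetical right-linear grammar S → aS | bA | b, A → bA | b (the language a*b⁺): states correspond to the grammar's variables.

```python
# FSA transitions: (state, constant) -> next state. States "S" and "A"
# mirror the grammar's variables; "A" is accepting.
transitions = {
    ("S", "a"): "S",
    ("S", "b"): "A",
    ("A", "b"): "A",
}
start_state, accept_states = "S", {"A"}

def accepts(text):
    """Run the FSA: no storage beyond the current state."""
    state = start_state
    for ch in text:
        state = transitions.get((state, ch))
        if state is None:
            return False          # no rule for this constant: reject
    return state in accept_states
```

Each grammar rule X → cY becomes a transition from state X to state Y on constant c, which is why regular grammars need no stack or tape.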

Related Topics in Table of Contents

6-Linguistics-Grammar-Kinds-Quantitative


Technical Information

Date Modified: 2022.0225