Language processing and computer coding rely on grammar, which specifies sentence-structure rules. Parsing recovers a sentence's syntactic structure. Generating applies the rules to create valid sentences. Syntax can follow typical sentence patterns {regular syntax}, in which words can substitute for one another, and rules can be recursive {productive syntax}, so structures can nest without limit.
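Parsing and recursion can be sketched together. Below is a minimal recursive-descent parser for a hypothetical toy grammar (NP → Det N (PP)?, PP → P NP); because PP calls back into NP, the syntax is productive: prepositional phrases can nest indefinitely.

```python
# Recursive-descent parser for a toy grammar (hypothetical example).
# NP -> Det N (PP)?   PP -> P NP   -- the mutual recursion makes the
# syntax productive: phrases nest without limit.

LEXICON = {
    "the": "Det", "a": "Det",
    "cat": "N", "mat": "N", "hat": "N",
    "on": "P", "in": "P",
}

def parse_np(words, i):
    """NP -> Det N (PP)?  Returns (tree, next index) or None."""
    if i + 1 < len(words) and LEXICON.get(words[i]) == "Det" \
            and LEXICON.get(words[i + 1]) == "N":
        tree = ["NP", words[i], words[i + 1]]
        j = i + 2
        pp = parse_pp(words, j)
        if pp:
            tree.append(pp[0])
            j = pp[1]
        return tree, j
    return None

def parse_pp(words, i):
    """PP -> P NP (recursive: the inner NP may contain another PP)."""
    if i < len(words) and LEXICON.get(words[i]) == "P":
        np = parse_np(words, i + 1)
        if np:
            return ["PP", words[i], np[0]], np[1]
    return None

tree, _ = parse_np("the cat on the mat".split(), 0)
print(tree)  # ['NP', 'the', 'cat', ['PP', 'on', ['NP', 'the', 'mat']]]
```

Generation would run the same rules in reverse, expanding symbols instead of consuming words.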
grammars
Noam Chomsky defined four generative-grammar classes, with different complexities {Chomsky hierarchy}. Type 0, with highest complexity, is Unrestricted (General) Grammars, recognized by Turing Machines. Type 0 rules are unrestricted: they can be context-sensitive, contracting, and freely substituting. Turing machines read input and can write output anywhere on an unbounded tape. Type 1 is Context-Sensitive Grammars, recognized by Linear Bounded Automata. Type 1 rules are context-sensitive and non-contracting: the right side is never shorter than the left. Linear bounded automata are Turing machines whose tape is bounded by the input length; they move backward and forward over the tape until surrounding context tells them what to do next. Type 2 is Context-Free Grammars, recognized by Pushdown Automata. Type 2 rules are context-free and non-contracting: each rule rewrites a single variable regardless of its surroundings. Pushdown machines read input left to right and act based on current state and the top of a stack, without moving backward on the input. Type 3, with lowest complexity, is Regular Grammars, recognized by Finite State Automata. Type 3 rules are regular, linear, and non-contracting: each rule produces at most one variable, at a fixed end. Finite-state machines read the input tape with no storage beyond the current state.
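The gap between Type 3 and Type 2 can be made concrete with two hypothetical recognizers: a finite-state machine handles the regular language a*b* using only its current state, while the context-free language aⁿbⁿ needs the storage a pushdown automaton's stack provides (modeled here as a counter).

```python
# Contrast of Type 3 and Type 2 recognition (hypothetical examples).

def accepts_regular(s):
    """Finite-state automaton for the regular language a*b*:
    two states, no storage beyond the current state."""
    state = "A"
    for ch in s:
        if state == "A" and ch == "a":
            state = "A"            # still reading a's
        elif ch == "b":
            state = "B"            # b's may follow a's
        else:
            return False           # an 'a' after a 'b', or unknown symbol
    return True

def accepts_context_free(s):
    """Pushdown-style recognizer for a^n b^n: the stack (reduced here
    to a counter) is exactly what a finite-state machine lacks."""
    stack = 0
    seen_b = False
    for ch in s:
        if ch == "a" and not seen_b:
            stack += 1             # push for each 'a'
        elif ch == "b" and stack > 0:
            seen_b = True
            stack -= 1             # pop for each 'b'
        else:
            return False
    return stack == 0              # accept only when counts match

print(accepts_regular("aabbb"), accepts_context_free("aabbb"))  # True False
```

No finite-state machine can recognize aⁿbⁿ, because matching counts requires unbounded memory.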
computer language
Computer languages must be deterministic, so parsing needs only finite look-ahead. Parsing non-deterministic languages requires trying all rules and/or guessing. Most recursive transition networks, whether unambiguous or ambiguous, are non-deterministic and cannot map to deterministic recursive transition networks. Non-deterministic finite state automata, however, can always map to deterministic finite state automata.
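The standard mapping from non-deterministic to deterministic finite state automata is the subset construction; a sketch follows, using a hypothetical three-state NFA for strings over {a, b} that end in "ab". Each deterministic state is a frozenset of NFA states, which is why the mapping always exists.

```python
# Subset construction: map an NFA to an equivalent DFA (hypothetical example).
# NFA accepting strings over {a, b} that end in "ab":
# delta maps (state, symbol) -> set of possible next states.
NFA = {
    (0, "a"): {0, 1},   # on 'a', stay in 0 or guess the final "ab" starts
    (0, "b"): {0},
    (1, "b"): {2},      # state 2 = just read "ab"
}
START, ACCEPT = {0}, {2}

def determinize(nfa, start, alphabet="ab"):
    """Each DFA state is the frozenset of NFA states reachable so far."""
    start = frozenset(start)
    dfa, seen, todo = {}, {start}, [start]
    while todo:
        states = todo.pop()
        for sym in alphabet:
            nxt = frozenset(q for s in states for q in nfa.get((s, sym), set()))
            dfa[(states, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return dfa

def run(dfa, start, s):
    """Follow the (now deterministic) transitions; return the final state set."""
    state = frozenset(start)
    for ch in s:
        state = dfa[(state, ch)]
    return state

dfa = determinize(NFA, START)
print(bool(run(dfa, START, "ab") & ACCEPT))   # ends in "ab" -> True
```

The DFA can be exponentially larger than the NFA in the worst case, but acceptance is decided in a single deterministic pass with no guessing.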
generative grammar
Grammars have variables (non-terminal symbols), which can be rewritten and which include the start symbol. Grammars have constants (terminal symbols), which cannot be rewritten. Grammatical symbols are either variables or constants. Grammar rules rewrite an existing symbol series containing variables as a new series of variables and/or constants. Generative grammars use finite sets of variables, constants, and rules.
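These pieces can be sketched directly: a hypothetical generative grammar with a finite rule set, where derivation starts at the start symbol S and rewrites variables until only constants remain.

```python
# Minimal generative grammar (hypothetical rules): keys are variables,
# anything without a rule is a constant; derivation starts from S.
import random

RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V"], ["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["sees"], ["chases"]],
}

def generate(symbol="S"):
    """Rewrite variables recursively until only constants remain."""
    if symbol not in RULES:                    # constant: emit as-is
        return [symbol]
    expansion = random.choice(RULES[symbol])   # pick one rule for this variable
    return [word for sym in expansion for word in generate(sym)]

random.seed(0)
print(" ".join(generate()))                    # e.g. "the dog chases a cat"
```

Every sentence this produces is valid by construction, because each step applies a rule from the finite set.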
relation
Grammar relations involve persons or things {direct object} and time. Relations are either states or events. States, such as to know, believe, or have, take experiencing subjects. Events take agents {subject}, and can take instruments "with", beneficiaries "for", and places "on, in, at, above, below", as in moving and communicating. States and events determine subject phrases. Events determine verb phrases. To communicate, write, or speak involves a recipient "with" or "to", a language "in", and/or a topic "on" or "about". To move or walk involves a source "from" and a destination "to".
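One way to encode these case roles as data is a hypothetical schema where each relation carries its kind (state or event) plus optional role slots keyed by the prepositions above; the field names here are illustrative, not standard.

```python
# Hypothetical encoding of case roles: each relation is a dict whose
# optional slots correspond to the prepositions listed in the text.

event = {
    "relation": "write",
    "kind": "event",            # events take agents
    "agent": "Ann",             # subject of the event
    "recipient": "Bo",          # "to"
    "instrument": "pen",        # "with"
    "topic": "grammar",         # "on" or "about"
}

state = {
    "relation": "know",
    "kind": "state",            # states: know, believe, have
    "experiencer": "Bo",        # states have experiencing subjects
    "direct_object": "the rule",
}

def describe(rel):
    """Render a relation and its filled role slots as one line."""
    roles = ", ".join(f"{k}={v}" for k, v in rel.items()
                      if k not in ("relation", "kind"))
    return f"{rel['relation']} ({rel['kind']}): {roles}"

print(describe(event))
print(describe(state))
```

The state/event split drives which slots are sensible: an instrument fits "write" but not "know".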
Date Modified: 2022.0224