CNF and GNF
Chomsky Normal Form
In formal language theory, a context-free grammar is said to be in Chomsky normal form if all of its production rules are of the form:

A → BC, or
A → a, or
S → ε,

where A, B, and C are nonterminal symbols, a is a terminal symbol (a symbol that represents a constant value), S is the start symbol, and ε is the empty string. Also, neither B nor C may be the start symbol, and the third production rule can only appear if ε is in L(G), namely, the language produced by the context-free grammar G.
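To make the definition concrete, here is a minimal sketch in Python of a check that every production has one of the three allowed forms. The dict-of-tuples grammar encoding, the function name is_cnf and the sample grammar are illustrative assumptions, not something taken from the text.

# Hypothetical encoding: a grammar maps each nonterminal to a list of right-hand
# sides, each RHS being a tuple of symbols; nonterminals are upper-case strings,
# terminals are lower-case strings, and ("",) stands for an ε rule.
def is_cnf(grammar, start):
    nonterminals = set(grammar)
    for head, bodies in grammar.items():
        for rhs in bodies:
            if rhs == ("",):            # S -> ε: allowed only for the start symbol
                if head != start:
                    return False
            elif len(rhs) == 1:         # A -> a: a single terminal
                if rhs[0] in nonterminals:
                    return False
            elif len(rhs) == 2:         # A -> BC: two nonterminals, neither the start symbol
                if any(sym not in nonterminals or sym == start for sym in rhs):
                    return False
            else:                       # anything longer is not in CNF
                return False
    return True

# S -> AB | ε, A -> a, B -> b is in Chomsky normal form.
g = {"S": [("A", "B"), ("",)], "A": [("a",)], "B": [("b",)]}
print(is_cnf(g, "S"))   # True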
Every grammar in Chomsky normal form is context-free, and conversely, every context-free grammar can be transformed into an equivalent one which is in Chomsky normal form. Several algorithms for performing such a transformation are known. Transformations are described in most textbooks on automata theory, such as Hopcroft and Ullman, 1979.[1] As pointed out by Lange and Leiß,[2] the drawback of these transformations is that they can lead to an undesirable bloat in grammar size. The size of a grammar is the sum of the sizes of its production rules, where the size of a rule is one plus the length of its right-hand side. Using |G| to denote the size of the original grammar G, the size blow-up in the worst case may range from |G|^2 to 2^(2|G|), depending on the transformation algorithm used.
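As a quick illustration of this size measure (reusing the hypothetical dict-of-tuples encoding from above; the helper name grammar_size is mine, not the text's), the grammar S → AB | ε, A → a, B → b has rules of size 3, 1, 2 and 2, so its total size is 8:

def grammar_size(grammar):
    # size of a rule = 1 + length of its right-hand side; an ε rule has RHS length 0
    return sum(1 + (0 if rhs == ("",) else len(rhs))
               for bodies in grammar.values() for rhs in bodies)

g = {"S": [("A", "B"), ("",)], "A": [("a",)], "B": [("b",)]}
print(grammar_size(g))   # 3 + 1 + 2 + 2 = 8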
Converting a Grammar to Chomsky Normal Form

1. Introduce S₀
Introduce a new start variable S₀ and a new rule S₀ → S, where S is the previous start variable.

2. Eliminate all ε rules
ε rules are rules of the form A → ε, where A ≠ S₀ and A ∈ V, where V is the CFG's variable alphabet.
Remove every rule with ε on its right-hand side (RHS). For each rule with A in its RHS, add a set of new rules consisting of the different possible combinations of A replaced or not replaced with ε. If a rule has A as a singleton on its RHS, add a new rule R → ε, unless R → ε has already been removed through this process (a short code sketch of this step follows the example). For example, examine the following grammar G:

S → AbA | b
A → a | ε

G has one ε rule, A → ε. When the A → ε is removed, we get the following:

S → AbA | Ab | bA | b
A → a

Notice that we have to account for all possibilities of A being replaced or not replaced with ε, and so the single rule S → AbA expands into the four rules S → AbA, S → Ab, S → bA and S → b.
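The sketch below shows one way this ε-elimination step could be coded, again in Python with the hypothetical dict-of-tuples grammar encoding used above. The function names nullable_variables and eliminate_epsilon and the sample grammar are illustrative assumptions, not part of the original text.

from itertools import product

def nullable_variables(grammar):
    # Fixpoint: a variable is nullable if some RHS is ε or consists entirely of nullable variables.
    nullable = set()
    changed = True
    while changed:
        changed = False
        for head, bodies in grammar.items():
            if head in nullable:
                continue
            for rhs in bodies:
                if rhs == ("",) or all(sym in nullable for sym in rhs):
                    nullable.add(head)
                    changed = True
                    break
    return nullable

def eliminate_epsilon(grammar, start):
    nullable = nullable_variables(grammar)
    new_grammar = {}
    for head, bodies in grammar.items():
        new_bodies = set()
        for rhs in bodies:
            if rhs == ("",):
                if head == start:          # only the start symbol may keep an ε rule
                    new_bodies.add(rhs)
                continue
            # Generate every combination of keeping or dropping each nullable symbol in the RHS.
            choices = [(sym, "") if sym in nullable else (sym,) for sym in rhs]
            for combo in product(*choices):
                kept = tuple(sym for sym in combo if sym != "")
                if kept:
                    new_bodies.add(kept)
                elif head == start:        # an all-dropped RHS becomes an ε rule for the start symbol
                    new_bodies.add(("",))
        new_grammar[head] = sorted(new_bodies)
    return new_grammar

g = {"S": [("A", "b", "A")], "A": [("a",), ("",)]}
print(eliminate_epsilon(g, "S"))
# {'S': [('A', 'b'), ('A', 'b', 'A'), ('b',), ('b', 'A')], 'A': [('a',)]}

On the example grammar above, the output keeps S → AbA, S → Ab, S → bA and S → b and drops A → ε, matching the hand calculation.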