Verification of Logic Gates
CHAPTER 2
BINARY CODE
2.1 History of Binary Code:
Binary numbers were first described in the Chandahsutra, written by Pingala around 300 B.C. Binary code was later introduced by the German mathematician and philosopher Gottfried Wilhelm Leibniz in the 17th century. Leibniz was trying to find a system that would convert verbal statements of logic into purely mathematical ones. After his ideas were ignored, he came across a classic Chinese text called the I Ching, or Book of Changes, which used a type of binary code. The book confirmed his theory that life could be simplified, or reduced, to a series of straightforward propositions. He created a system consisting of rows of zeros and ones, although at the time he had not yet found a practical use for it.
Another mathematician and philosopher, George Boole, published a paper in 1847 called 'The Mathematical Analysis of Logic', which describes an algebraic system of logic now known as Boolean algebra. Boole's system was based on a binary, yes-no, on-off approach built from the three most basic operations: AND, OR, and NOT. The system was not put to practical use until Claude Shannon, a graduate student at the Massachusetts Institute of Technology, noticed that the Boolean algebra he had learned behaved like an electric circuit. Shannon wrote his thesis in 1937, showing how Boolean algebra could describe switching circuits, and it became the starting point for the use of binary code in practical applications such as computers and electric circuits.
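To make these three operations concrete, here is a minimal Python sketch (the code and its function names are illustrative, not from the original text) that prints the truth table a switching circuit built from such gates would realize:

    # Boole's three basic operations on the binary values 0 and 1.
    def AND(a, b):   # 1 only when both inputs are 1
        return a & b

    def OR(a, b):    # 1 when at least one input is 1
        return a | b

    def NOT(a):      # inverts a single input
        return 1 - a

    # Print the truth table these operations define.
    print("a b | AND OR | NOT a")
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} |  {AND(a, b)}   {OR(a, b)} |   {NOT(a)}")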
2.2 Binary Code:
Binary code represents text or computer processor instructions using the two binary digits of the binary number system, 0 and 1. A binary code assigns a bit string to each symbol or instruction. For example, a binary string of eight binary digits (bits) can represent any of 256 possible values and can therefore correspond to a wide variety of symbols, letters, or instructions.
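As an illustrative sketch (not part of the original text), the following Python snippet shows how an eight-bit string singles out one of 256 values, using the character 'A' and its ASCII code point as an assumed example:

    # Eight bits can distinguish 2**8 = 256 different values.
    print(2 ** 8)             # 256

    # Map a symbol to a fixed-width bit string via its ASCII code point.
    bits = format(ord("A"), "08b")
    print(bits)               # 01000001

    # The same eight-bit string decodes back to the original symbol.
    print(chr(int(bits, 2)))  # A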
In computing and telecommunication, binary codes are used in a variety of methods for encoding data, such as character strings, into bit strings.
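One such method can be sketched as follows; this Python example is a simplified illustration that assumes one byte per character (ASCII), and the helper names encode and decode are hypothetical:

    # Encode a character string into a bit string and decode it again.
    def encode(text):
        # Concatenate the 8-bit code of each character.
        return "".join(format(ord(c), "08b") for c in text)

    def decode(bits):
        # Split into 8-bit groups and map each back to a character.
        return "".join(chr(int(bits[i:i + 8], 2))
                       for i in range(0, len(bits), 8))

    bitstring = encode("OK")
    print(bitstring)          # 0100111101001011
    print(decode(bitstring))  # OK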
