Entropy- refers to the degree of randomness, lack of organization, or disorder in a situation. Information theory measures the quantities of all kinds of information in terms of bits (binary digits). According to Shannon, the amount of information is equal to entropy.
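As a worked illustration (a minimal sketch, not from the original text), Shannon's entropy for a source that emits symbols with probabilities p_i is H = -Σ p_i log2(p_i), measured in bits per symbol:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally disordered: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```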
[Figure: Lasswell's model of communication: Who? Says what? In which channel? To whom? With what effect?]
Redundancy- is the opposite of information. Something that is redundant adds little, if any, information to a message. Redundancy is important because it helps combat noise in a communication system, e.g. by repeating the message.
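A minimal sketch of this idea (hypothetical code, assuming a binary channel that flips bits at random): a triple-repetition code adds redundancy so that a majority vote at the receiver can undo isolated bit flips.

```python
import random

def encode(bits):
    """Repeat each bit three times (adds redundancy, not information)."""
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob=0.1):
    """Flip each transmitted bit with probability flip_prob (noise)."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [1 if sum(bits[i:i+3]) >= 2 else 0 for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0]
received = decode(noisy_channel(encode(message)))
print(message, received)  # usually identical despite the noise
```

The repeated copies add no new information, which is exactly why they can be sacrificed to noise without losing the message.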
A code- is a language or other set of symbols or signs that can be used to transmit a thought through one or more channels to elicit a response in a receiver or decoder.
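As a toy illustration (an assumed example, not from the source), a code can be as small as a lookup table pairing symbols with signals, with the inverse table serving the decoder:

```python
# A toy code: a table mapping symbols (ideas) to fixed-width binary signals.
ENCODE = {"A": "00", "B": "01", "C": "10", "D": "11"}
DECODE = {signal: symbol for symbol, signal in ENCODE.items()}

def encode(message):
    """Transmitter side: turn a symbol sequence into a signal string."""
    return "".join(ENCODE[s] for s in message)

def decode(signal):
    """Receiver side: split the fixed-width signal back into symbols."""
    return "".join(DECODE[signal[i:i+2]] for i in range(0, len(signal), 2))

print(encode("BAD"))          # 010011
print(decode(encode("BAD")))  # BAD
```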
Efficiency- refers to the bits of information per second that can be sent and received.
Distortion- is the alteration of the original shape or other characteristic of an object, image, sound, waveform, or other form of information or representation. Distortion is usually unwanted, and many methods are employed to minimize it in practice.
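For instance (an illustrative sketch with assumed sample values), hard clipping alters the shape of a sine wave once its amplitude exceeds a system's limit, which is one everyday form of distortion:

```python
import math

def clip(sample, limit=0.7):
    """Hard clipping: amplitudes beyond +/-limit are flattened,
    altering the original waveform's shape (distortion)."""
    return max(-limit, min(limit, sample))

original = [math.sin(2 * math.pi * t / 16) for t in range(16)]
distorted = [clip(s) for s in original]
print([round(s, 2) for s in original])
print([round(s, 2) for s in distorted])
```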
The original model consisted of five elements (a toy end-to-end simulation in code follows the list):
● Information source, which produces the message that expresses the source's purpose. The message has to be conveyed in the form of a code; the communication encoder is responsible for taking the ideas of the source and putting them into that code.
● Transmitter/encoder, the electronic device that converts (encodes) the message into signals.
● Channel, through which the message is transferred from encoder to decoder. If the message gets distorted in the channel during this process, the communication flow is affected and the receiver may not receive the correct message.
● Receiver/decoder, which decodes (reconstructs) the message from the signal. For communication to occur there must be somebody at the other end of the channel.
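Here is a toy end-to-end sketch of the model (hypothetical code; the text-to-bits encoding and the noise level are assumptions): a source's message is encoded into a bit signal, carried through a noisy channel, and reconstructed by the decoder for the destination.

```python
import random

def encoder(message):
    """Transmitter: convert the message (text) into a signal (bits)."""
    return [int(bit) for ch in message for bit in format(ord(ch), "08b")]

def channel(signal, noise=0.01):
    """Channel: carry the signal, flipping each bit with probability `noise`."""
    return [b ^ 1 if random.random() < noise else b for b in signal]

def decoder(signal):
    """Receiver: reconstruct the message from the (possibly distorted) signal."""
    chars = []
    for i in range(0, len(signal), 8):
        byte = signal[i:i+8]
        chars.append(chr(int("".join(map(str, byte)), 2)))
    return "".join(chars)

message = "HELLO"                  # information source
received = decoder(channel(encoder(message)))
print(message, "->", received)     # noise may corrupt some characters
```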