Tuesday, August 27, 2013



Where Does Knowledge Come From?

Sometimes we just KNOW things.
Did you ever wonder where the information came from?


Maybe you've wondered what it's made of. 



According to some researchers, there's something to it.

(http://www.utwente.nl/cw/theorieenoverzicht/)


"One of the first designs of the information theory is the model of communication by Shannon and Weaver. Claude Shannon, an engineer at Bell Telephone Laboratories, worked with Warren Weaver on the classic book ‘The mathematical theory of communication’. In this work Shannon and Weaver sought to identify the quickest and most efficient way to get a message from one point to another. Their goal was to discover how communication messages could be converted into electronic signals most efficiently, and how those signals could be transmitted with a minimum of error. In studying this, Shannon and Weaver developed a mechanical and mathematical model of communication, known as the “Shannon and Weaver model of communication”.



According to the theory, transmission of the message involved sending information through electronic signals. "Information" in the information-theory sense of the word should not be confused with 'information' as we commonly understand it. According to Shannon and Weaver, information is defined as "a measure of one's freedom of choice when one selects a message". In information theory, information and uncertainty are closely related. Information refers to the degree of uncertainty present in a situation. The larger the uncertainty removed by a message, the stronger the correlation between the input and output of a communication channel, and the more detailed particular instructions are, the more information is transmitted. Uncertainty also relates to the concept of predictability. When something is completely predictable, it is completely certain. Therefore, it contains very little, if any, information.

A related term, entropy, is also important in information theory. Entropy refers to the degree of randomness, lack of organization, or disorder in a situation. Information theory measures the quantities of all kinds of information in terms of bits (binary digits).

Redundancy is another concept that information theory has contributed to communication. Redundancy is the opposite of information. Something that is redundant adds little, if any, information to a message. Redundancy is important because it helps combat noise in a communication system (e.g., by repeating the message). Noise is any factor in the process that works against the predictability of the outcome of the communication process. Information theory has contributed to the clarification of certain concepts such as noise, redundancy and entropy. These concepts are inherently part of the communication process.

Shannon and Weaver broadly defined communication as “all of the procedures by which one mind may affect another”. Their communication model consisted of an information source: the source’s message, a transmitter, a signal, and a receiver: the receiver’s message, and a destination. Eventually, the standard communication model featured the source or encoder, who encodes a message by translating an idea into a code in terms of bits. A code is a language or other set of symbols or signs that can be used to transmit a thought through one or more channels to elicit a response in a receiver or decoder. Shannon and Weaver also included the factor noise into the model. The study conducted by Shannon and Weaver was motivated by the desire to increase the efficiency and accuracy or fidelity of transmission and reception. Efficiency refers to the bits of information per second that can be sent and received. Accuracy is the extent to which signals of information can be understood. In this sense, accuracy refers more to clear reception than to the meaning of message. This engineering model asks quite different questions than do other approaches to human communication research."
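The quoted passage's key quantities can be made concrete with a short sketch of my own (not taken from Shannon and Weaver's text): entropy measured in bits per symbol, and redundancy, in the form of repetition, combating noise.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A completely predictable message is completely certain: zero bits.
assert shannon_entropy("aaaaaaaa") == 0.0
# Four equally likely symbols require 2 bits each.
assert shannon_entropy("abcdabcd") == 2.0

# Redundancy combats noise: send each bit three times, decode by majority vote.
def send_with_repetition(bits):
    return [b for b in bits for _ in range(3)]

def decode_majority(signal):
    return [int(sum(signal[i:i + 3]) >= 2) for i in range(0, len(signal), 3)]

message = [1, 0, 1, 1]
signal = send_with_repetition(message)
signal[4] ^= 1  # noise flips one transmitted bit
assert decode_majority(signal) == message  # the repetition code corrects it
```

The tripled signal carries no new information relative to the message, which is exactly why a single flipped bit can be outvoted on reception.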


What about hard science?  What do we KNOW about the brain?








Shannon, C.E., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.

Hawes, L.C. (1975). Pragmatics of analoguing: Theory and model construction in communication. Reading, MA: Addison-Wesley.




http://www.utwente.nl/cw/theorieenoverzicht/Theory%20clusters/Communication%20and%20Information%20Technology/Information_Theory.doc























