Human beings have understood information and its relevance to their existence since the beginning of civilization. Our instincts and intellect have evolved to take advantage of information. However, we remained largely unconscious of the role it played in human lives until the early twentieth century.
Only in the later part of the twentieth century, specifically in the seventies and eighties, did the role of information come into focus. With the success of US-based firms like Bloomberg, information began to be taken seriously in business organizations. Today, information is considered the most important resource in a business organization. Information is required in business to serve a host of purposes, ranging from the most critical, such as decision-making, strategizing, and planning, to the mundane, such as operational control and automation. In some forward-looking business organizations, it has assumed the role of a differentiator, giving the organization a competitive edge. Information management has therefore become a very important task in modern business. Business organizations invest heavily to install modern information management systems. These systems are housed within a computerized environment, more popularly called an information technology platform. The requirement for technology in information management stems from the dual need for ‘timely’ and ‘accurate’ information in today’s business.
We’ll be covering the following topics in this tutorial:
Meaning of Information
Information, and an understanding of its value, lie at the core of a management information system. Information is quite distinct from data, even though the two concepts are interrelated. The concept of information as an important resource began to take root in the late eighties. Management practitioners began to deliberate seriously on the issue after the huge success of Bloomberg in the eighties. Bloomberg started with the modest business of analyzing the financial results of different US companies and selling the analyzed data, or information, about the financial performance of these firms to stock market brokers, thus becoming a pioneer of the information trading business. Different forms of such information-based business have evolved since then, and today information is not only regarded as a resource but is considered the most valuable one: a resource which, if used properly, may give the firm a competitive advantage.
We are instinctively alert for ‘valuable’ information. The human brain has evolved to recognize that information carries different degrees of ‘value’, and it prioritizes information according to this perceived value. Most often this unconscious valuation is correct, especially for instinct-based information. For example, suppose a driver notices a child suddenly crossing the road and judges that he will hit the child unless he stops, and at the same moment he feels an itch on his forehead. The driver’s brain prioritizes the two pieces of information received from different sensory inputs: it first sends a signal to the driver’s right foot to press the brake pedal and stop the car, and only after the car stops does it react to the itch. We do this unconsciously every day. Evolution has taught us that information has a context and hence different degrees of value. We exhibit the same prioritization of valuable information when taking business decisions. Sometimes the value of information is evident, but in most cases it has to be mined out. The value that information delivers for a business can be of several types. For example, it may be the launch date of a competitor’s product (kept secret by the competitor), or it may be the organization’s total sales in a particular market segment over a specific time frame. In the former case the information itself is very straightforward (just a few pieces of data); the difficult part is finding a source for it (the basic data is very hard to obtain and may well be available only through industrial espionage).
In the latter case the data is available in abundance, and information is created by processing it; the difficult part is the processing, not the availability of data (which is, of course, also important, but not as critical as in the former case). The subject of information as used in business organizations deals mostly with the latter case, in which the organization’s own data is captured, stored, and analyzed to create information for decision-making. Information of the former type is mostly obtained through industrial espionage.
Mathematically, information may be defined as a function of the probability of occurrence of an event. The information theory literature was developed by C. E. Shannon, Norbert Wiener, and others. To understand information mathematically, consider an event E with probability of occurrence p. Information is of greater value if this probability is closer to 0 than to 1. For example, if someone were to convey that the Sun will rise in the East tomorrow, would we consider this message ‘valuable’ information? No, because the event is certain and hence carries almost no information value. If, on the other hand, someone were to convey that in the next five minutes an earthquake would hit with its epicenter at the very spot on which the reader is sitting, that message would have far greater information value, because the probability of its occurrence is much smaller. Thus, the greater the probability of occurrence of an event, the lower its information value. Shannon defined this mathematically as,
h(p) = −log(p), where 0 < p ≤ 1
h(p) is the measure of the amount of information (also called the information function), measured in bits;
p is the probability of occurrence of the event.
We can see that h(p) is a decreasing function whose value varies from ∞ (as p approaches 0) to 0 (when p is 1).
The value of information falls steeply as p increases from values near zero. This means that as events start happening on expected lines, information about such events diminishes in value. When an event is certain, with probability of occurrence equal to one, its information value becomes zero.
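The behavior of the information function described above can be illustrated with a short Python sketch (the function name `information_content` is our own choice for illustration, not a standard API; base-2 logarithm gives the measure in bits):

```python
import math

def information_content(p: float) -> float:
    """Shannon's information function h(p) = -log2(p), in bits, for 0 < p <= 1."""
    if not 0 < p <= 1:
        raise ValueError("p must lie in the interval (0, 1]")
    return -math.log2(p)

# A certain event carries no information.
print(information_content(1.0))    # 0.0
# The rarer the event, the more information its occurrence conveys.
print(information_content(0.5))    # 1.0 (one bit)
print(information_content(0.125))  # 3.0 (three bits)
```

Note how halving the probability adds one bit of information, which matches h(p) being a decreasing function that grows without bound as p approaches 0.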
Entropy is a concept from physics, where it is a measure of the degree of randomness in a system. Shannon (1949) introduced the concept of entropy into information theory, where it is the mathematical expectation of the information of the occurrence of one event out of a set of events.
Let there be n events E1, E2, …, En with probabilities of occurrence p1, p2, …, pn, where every pi is greater than 0 and the sum of all the pi is one.
Now, the entropy function is defined as,
H(p1, p2, …, pn) = −Σ pi log pi, where pi ≥ 0 and Σ pi = 1
It is clear that entropy is a measure of expected information. Entropy is also sometimes considered a measure of uncertainty.
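The entropy formula can likewise be sketched in a few lines of Python (the function name and the tolerance check are our own; we use the usual convention that a term with pi = 0 contributes zero to the sum):

```python
import math

def entropy(probs) -> float:
    """H(p1, ..., pn) = -sum(pi * log2(pi)), in bits.

    Terms with pi == 0 contribute zero, by convention.
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has maximum uncertainty for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is nearly certain, so its entropy is low.
print(entropy([0.99, 0.01]))  # about 0.08
```

The fair coin yields the largest entropy possible for two events, while the biased coin's entropy is close to zero, consistent with entropy as a measure of uncertainty.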