Re: Information Theory

From: Harnad, Stevan (harnad@cogsci.soton.ac.uk)
Date: Sun May 25 1997 - 20:51:00 BST


> From: Hawkins, Sean <swh196@soton.ac.uk>
>
> > if someone tells me that the number is odd today (and that is
> > true) then my probability of lunch has gone up to 1/3."
>
> Surely this is, at best, just probability. You can draw the
> same analogy to a horse race. The probability of horse X
> winning is 2/3, i.e. you would expect the horse to win twice in
> three races. That is just pure probability - not
> information. If you had bet your life savings on this
> horse based on the probability of it winning, then was
> having the odds before betting information?

Yes. And Information Theory is in part probability theory,
in part computational theory, and in part signal analytic
theory: Have a look at the WWW links I put into prior replies.
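To make the "probability of lunch" point concrete, here is a minimal sketch in
Python of how Shannon's measure counts the uncertainty a message removes. The
figures are only illustrative assumptions (six equally likely numbered lunches,
a 2-in-3 favourite in the horse race), and the helper function is mine, not part
of any standard library:

    from math import log2

    # Uncertainty over N equally likely alternatives, in bits (Shannon entropy).
    def uncertainty(n_alternatives):
        return log2(n_alternatives)

    before = uncertainty(6)   # six equally likely numbered lunches: ~2.585 bits
    after = uncertainty(3)    # told "the number is odd": three remain, ~1.585 bits
    print(f"message carried {before - after:.3f} bit")   # exactly 1 bit

    # The horse-race analogy: learning that a 2-in-3 favourite actually won
    # removes -log2(2/3), about 0.585 bits; an unlikely outcome carries more.
    print(f"{-log2(2/3):.3f} bits")

The point of the sketch is only that the theory measures how much the set of
alternatives shrinks, not what the outcome means to the bettor or the diner.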

> I would suggest that throughout our existence, every
> single issue is based on probability

No; mathematical proofs are not based on probability. In mathematics
you have certainty. All the REST is probability, hence uncertainty
(more or less), even scientific laws like F = ma or E = mc^2 --
though the probability of those laws is overwhelmingly high.

> - Water boiled at 100C
> yesterday so there is a good chance that water will boil at
> the same temperature tomorrow - nobody can be sure until it
> actually happens (actually, water at higher altitudes boils
> at a lower temperature). An alternative piece of
> 'information' may suggest that water will boil at lower
> temperatures depending on a number of factors. This, in the
> purest sense, is information. However, it does not decrease
> your uncertainty because it does not reduce the number of
> alternatives. Suppose your occupation is based on the
> temperature at which water boils - you are a designer of
> heating elements - this is information which will
> increase your uncertainty, but it MATTERS.

But if your work and earnings depended on being able to predict the
temperature at which water boils, it certainly WOULD lower your
uncertainty if you had data about the boiling temperature at different
altitudes, plus the current altitude: You've given a perfect example
of uncertainty reduction.
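Here is a small sketch of that reduction for the heating-element designer. The
altitude bands and boiling points are rough, illustrative figures of my own; the
point is only that learning the altitude singles out one alternative among
several:

    from math import log2

    # Approximate boiling points of water (degrees C) at a few altitudes --
    # illustrative figures only; knowing the altitude picks one alternative.
    boiling_point = {"sea level": 100, "2,000 m": 93, "4,000 m": 87, "6,000 m": 80}

    before = log2(len(boiling_point))  # four equally likely alternatives: 2 bits
    after = log2(1)                    # told the altitude: one alternative left, 0 bits

    print(f"uncertainty reduced by {before - after:.0f} bits;"
          f" design for {boiling_point['4,000 m']} C at 4,000 m")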

> So, are you saying that information is anything which
> decreases your uncertainty and which matters? If so, it is
> an extremely narrow view that leaves a considerable amount
> of other 'data' undefinable - what is that
> called - misinformation?

You can call it that; but as I said, formal information theory is
neither about beliefs nor about meaning: It's about the probability of
lunch.

> > That's like saying that telling someone that "lunch is number 5 today"
> > is uninformative to an anorexic: Fine, so let's turn instead to
> > something that DOES matter to an anorexic...
>
> Food or lack of food is information which does matter to an
> anorexic! If an anorexic is told that they will die without
> eating, does that reduce their uncertainty, and can it thus be
> classified as information? If an anorexic wants to live
> and is told that without food they will die, they are
> in exactly the same situation as prior to being told. They
> have to act upon the 'information' in order to
> influence the degree of certainty or uncertainty of their
> continued existence. So if MATTERS is a component part of
> the definition of information, does it have to matter to
> that individual in probabilistic terms, or does it have
> to matter to the individual concerned?

I don't understand your question, but I think we have gone around
on this one enough times now so that it would be better to return to
revision for the Exam. I would have been happy with the questions
when the topic was first (or second, or third) discussed, but now it
is just keeping people (including yourself) from revising.

> It would appear to me that the definition of information is
> subjectively tied to probability (which humans constructed)
> and subjectively linked to MATTERS, but the question remains
> as to whom it should matter - should it be individuals who
> decide that this 'information' does matter, or should the
> individual(s) be identified through some type of
> classification system which determines the
> characteristics of individuals who will be affected by this
> 'information'? Importantly, is communication to these
> individuals a component part of information? If not, then
> I would suggest that Whitehall (and the Pentagon) are
> jam-packed full of 'information' which matters to
> all people across at least two nations and which reduces
> their uncertainty. Perhaps BSE is a good example.
>
> It would seem to me therefore that the definition offered is
> at best idiosyncratic.

The definition's fine, and the data you speak of, if it exists, is
information, and does matter.

For a contrast case, remember the example I gave about counting
all the grains of sand in one square metre of the Sahara Desert at a
particular time and location: That number would not reduce uncertainty,
because no one cares. (If, on the other hand, Saddam Hussein decreed
that his subjects could only get food if they correctly reported the
number of grains of sand in such a square metre, then the outcome would
matter, and would be informative. The only thing that is even the
slightest bit HUMAN about the Shannon/Weaver mathematical theory of
information/communication is this bit about mattering; but even that is
not part of the theory, but just a natural constraint on how you APPLY
the theory, namely, to situations in which the alternatives matter.)
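As a final sketch of that last point (the numbers of alternatives here are
arbitrary, chosen only for illustration): the Shannon measure itself is
indifferent to mattering, assigning the same kind of figure to the lunch menu
and to the sand-grain count; mattering only enters when you apply it:

    from math import log2

    # Shannon's measure knows nothing about mattering: any message that picks one
    # outcome from N equally likely ones carries log2(N) bits, whoever cares.
    def bits(n_alternatives):
        return log2(n_alternatives)

    print(f"lunch number (1-6): {bits(6):.3f} bits")                 # matters to the hungry
    print(f"sand-grain count (10 hypothetical values): {bits(10):.3f} bits")
    # Whether the second figure is informative depends on whether anything hangs
    # on it (e.g. the decree above), not on the formula.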


