First, I'd like to acknowledge **andrewwyld** for providing the above title and the inspiration for this wibble.

If you would like to do useful things in one or more of the following areas:

**Cognitive Science**: computational linguistics, natural language learning and understanding, cognitive modeling

**Computer Science**: theoretical CS (complexity and computability), intelligent systems (machine learning, probabilistic reasoning), search

**Electrical Engineering**: coding theory, data compression

**Mathematics**: applied probability, theory of communication

**Physics**: quantum computation, quantum teleportation

**Social Sciences**: theory of communication, social development; see also *Statistics*

**Statistics**: inference, multivariate regression

then it really behooves you to learn a little about information theory (IT) if you haven't studied it already. As Bobby McFerrin harmonized about math on the old PBS TV show Square One TV, it's a "got to know" topic.

My personal pet IT topic, aside from a little notation on *common information* [1] I developed in my dissertation research on time series learning, is *computing as compression*.

Finally, if you know of topics that are missing from my partial list above, please reply or write to me and I will put them in an edit or a follow-up post.

[1] The *common information* among random variables *X*, *Y*, and *Z* is

*I(X; Y; Z)* =_def *I(X; Y)* – *I(X; Y | Z)* = *I(X; Z)* – *I(X; Z | Y)* = *I(Y; Z)* – *I(Y; Z | X)*.

This is the analogue of *n*-way intersection in set theory. The *modular mutual information* is

*I_i* =_def *I(X_i; Y | X_1, …, X_{i−1}, X_{i+1}, …, X_k)* = *H(X_i | X_1, …, X_{i−1}, X_{i+1}, …, X_k)* – *H(X_i | Y, X_1, …, X_{i−1}, X_{i+1}, …, X_k)*.

I used these to define a score that I called *modular common information*,

*I_common* =_def *I(X_1; X_2; …; X_k; Y)* =_def *I(X; Y)* – Σ *I_i*.

Boy, it's hard to write math in HTML.

--

Banazîr
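P.S. For the curious, the three-way common information in footnote [1] is easy to check numerically. The sketch below (my own illustration, not from the dissertation; the function names and the XOR example are mine) computes *I(X; Y; Z)* from a toy joint distribution via the equivalent entropy form *H(X) + H(Y) + H(Z) – H(X,Y) – H(X,Z) – H(Y,Z) + H(X,Y,Z)*, which follows from expanding *I(X; Y) – I(X; Y | Z)*:

```python
# Numerical check of the three-way common information
#   I(X; Y; Z) = I(X; Y) - I(X; Y | Z)
# computed in its inclusion-exclusion entropy form. The XOR
# distribution below is the standard example where common
# information is negative: pairwise independent, jointly dependent.
from itertools import product
from math import log2

def H(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: prob}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idxs):
    """Marginal pmf over the coordinates listed in idxs."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def common_information(joint):
    """I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)."""
    Hx, Hy, Hz = (H(marginal(joint, (i,))) for i in range(3))
    Hxy = H(marginal(joint, (0, 1)))
    Hxz = H(marginal(joint, (0, 2)))
    Hyz = H(marginal(joint, (1, 2)))
    return Hx + Hy + Hz - Hxy - Hxz - Hyz + H(joint)

# Z = X XOR Y with X, Y independent fair bits: I(X;Y) = 0 but
# I(X;Y|Z) = 1, so the common information is -1 bit.
xor_joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
print(common_information(xor_joint))  # -1.0

# Fully redundant case X = Y = Z (one shared fair bit): +1 bit.
copy_joint = {(b, b, b): 0.5 for b in (0, 1)}
print(common_information(copy_joint))  # 1.0
```

The XOR case is the reason common information, unlike two-way mutual information, can go negative: the "intersection" picture from set theory is only an analogy.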
