First, I'd like to acknowledge andrewwyld for providing the above title and the inspiration for this wibble.
If you would like to do useful things in one or more of the following areas:
- Cognitive Science: computational linguistics, natural language learning and understanding, cognitive modeling
- Computer Science: theoretical CS (complexity and computability), intelligent systems (machine learning, probabilistic reasoning), search
- Electrical Engineering: coding theory, data compression
- Mathematics: applied probability, theory of communication
- Physics: quantum computation, quantum teleportation
- Social Sciences: theory of communication, social development; see also Statistics
- Statistics: inference, multivariate regression
then it really behooves you to learn a little about information theory (IT) if you haven't studied it already. As Bobby McFerrin harmonized about math on the old PBS TV show Square One TV, it's a "got to know" topic.
My personal pet IT topic, aside from a little notation on common information I developed in my dissertation research on time series learning, is computing as compression.
Finally, if you know of topics that are missing from my partial list above, please reply or write to me and I will put them in an edit or a follow-up post.
The common information among random variables X, Y, and Z is I(X; Y; Z) =def I(X; Y) – I(X; Y | Z) = I(X; Z) – I(X; Z | Y) = I(Y; Z) – I(Y; Z | X). This is the analogue of n-way intersection in set theory. The modular mutual information is Ii =def I(Xi; Y | X1, …, Xi-1, Xi+1, …, Xk) = H(Xi | X1, …, Xi-1, Xi+1, …, Xk) – H(Xi | Y, X1, …, Xi-1, Xi+1, …, Xk), i.e., the mutual information between Xi and Y given all the other inputs. I used these to define a score that I called modular common information, Icommon =def I(X1; X2; …; Xk; Y) =def I(X; Y) – Σ Ii. Boy, it's hard to write math in HTML.
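If you prefer code to notation, here is a minimal sketch (my own illustration, not from my dissertation) that computes the three-way common information I(X; Y; Z) by brute force from a small joint distribution. Expanding I(X; Y) – I(X; Y | Z) into entropies gives the inclusion-exclusion form Hx + Hy + Hz – Hxy – Hxz – Hyz + Hxyz, which is what the code evaluates; all names are my own.

```python
# Sketch: three-way common information from a joint pmf p(x, y, z),
# represented as a dict mapping outcome tuples to probabilities.
from collections import defaultdict
from math import log2

def marginal(p, axes):
    """Marginalize a joint pmf onto the given coordinate axes."""
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[a] for a in axes)] += prob
    return m

def entropy(p):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def common_information(p):
    """I(X; Y; Z) = I(X; Y) - I(X; Y | Z), via inclusion-exclusion on entropies."""
    H = entropy
    hx, hy, hz = H(marginal(p, (0,))), H(marginal(p, (1,))), H(marginal(p, (2,)))
    hxy, hxz = H(marginal(p, (0, 1))), H(marginal(p, (0, 2)))
    hyz, hxyz = H(marginal(p, (1, 2))), H(p)
    return hx + hy + hz - hxy - hxz - hyz + hxyz

# X, Y independent fair bits, Z = X XOR Y: common information is -1 bit,
# a classic example showing this quantity can be negative.
p_xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(common_information(p_xor))
```

The XOR example is why, unlike two-variable mutual information, I(X; Y; Z) has no interpretation as a nonnegative "amount of shared stuff": the set-intersection analogy is formal, not literal.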