Yeah, that's probably it!
I've been doing probability thingies lately, because I think I need a real job....
That's something I've been wanting to rant about for a while now: how frustrating these things can be, especially for a mathematician!
First I borrowed
Probability with Martingales by David Williams. I had used it ~10 years ago to quickly learn martingales for Banach space studies. Then I noticed I didn't like his conversational style, for which it is often lauded, by the way. Also, I don't remember anything about generating functions and all those things, so I go to borrow the undergraduate book for our local university.
Well, that in turn is unreadable, because the author doesn't explain anything: he uses words like "distribution" or "random variable" without definition, of course because they don't want measure theory as a prerequisite.
Then I rant about it to a mathematician friend, and she advises me to read another book, also used at the local university, but this one for graduate students. Again, I can't stand it, because it doesn't have enough simple exercises, and it still contains undefined or poorly defined things. Take "distribution": it's supposed to be "the set of probabilities for a random variable". So if p(1)=p(2)=1/4 and p(3)=...=p(6)=1/8, does this random variable have the same distribution as the one for which p(1)=p(2)=1/8 and p(3)=p(4)=p(5)=1/4, namely {1/4, 1/8} for both of them?
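To make the complaint concrete, here's a minimal sketch (the dicts stand in for the two probability mass functions from the example above): the two random variables are different as distributions, i.e. as functions from values to probabilities, yet identical if you collapse them to a mere "set of probabilities" the way that book's definition suggests.

```python
# Two different probability mass functions from the rant's example:
# p is over faces 1..6, q is over values 1..5. Both sum to 1.
p = {1: 1/4, 2: 1/4, 3: 1/8, 4: 1/8, 5: 1/8, 6: 1/8}
q = {1: 1/8, 2: 1/8, 3: 1/4, 4: 1/4, 5: 1/4}

# As functions (value -> probability) they clearly differ:
print(p == q)  # False

# But as bare sets of probability values they are indistinguishable:
print(set(p.values()) == set(q.values()))  # True: both are {1/4, 1/8}
```

Which is exactly why "the set of probabilities" is a broken definition: the map from values to probabilities is the distribution, and throwing away the values destroys it.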
Ok, now I borrow Shiryaev's
Probability, which is more rigorous, but its exercises are very polarized, either very hard or trivial, and it also uses unorthodox symbols. And this, by the way, is a general issue with these probability guys: they always have to use symbols the wrong way. They can't write X^{-1}(A) like normal people, that is, if normal people ever called their functions X, but instead it must be {X ∈ A} or something else equally stupid.
Now I've finally found a good book, Feller's
Introduction to Probability, which is from the 40s or 50s, and has lots of interesting examples, good exercises, very good justifications for things, and also lovely nostalgia from the time when the Harvard computer lab printed heavy books like "binomial probabilities for n < 100".
I've learned that the bombs dropped in the London Blitz conformed to a Poisson distribution, and there were actually people who cast 12 dice some 12000 times to find out whether the results conform to a multinomial distribution. I've also calculated how many raisins there should be in cookie dough so that the probability of a cookie with no raisins is less than 1/100.
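For what it's worth, the raisin calculation can be sketched like this (my reconstruction, not necessarily the book's exact numbers): if the dough holds N raisins and is cut into n cookies, the raisin count in one cookie is approximately Poisson with mean lam = N/n, so the chance of an empty cookie is exp(-lam), and we need that below 1/100.

```python
import math

# Solve exp(-lam) = 1/100 for the mean number of raisins per cookie.
lam = math.log(100)
print(lam)  # about 4.6, so roughly 5 raisins per cookie on average suffice

# Sanity check: with a mean of 5 raisins per cookie, an empty cookie
# has probability exp(-5), which is indeed below 1/100.
print(math.exp(-5) < 1 / 100)  # True
```

So a baker mixing in at least five raisins per cookie's worth of dough meets the 1/100 requirement, under the Poisson approximation.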
It's an interesting subject, but it's very difficult to find proper literature.
[/rant]