Well... you see, we have grown used to decimal. And not only have we grown used to decimal, no one has really demonstrated that other systems are superior or inferior to decimal to an extent where it must either be replaced or given further appraisal. I myself can actually convert between binary, octal, hex, and decimal freely, but I'm also better at math than the vast majority of people I know. There is nothing too terribly wrong with the decimal system, so I will invoke the paradigm of "If It Ain't Broke, Don't Fix It".
On a side note, though, I'll present a simplistic analysis of number base systems. Theoretically, Euler's number e should be the optimal base: representing a number N in base b takes about log_b(N) digits, each of which can hold b different values, so the total "cost" is roughly b · log_b(N) = (b / ln b) · ln N, which is minimized at b = e. We don't use e because we like to think in whole numbers. In theoretical computer science, 3 might arguably be the optimal base, since it is both absolutely and logarithmically the closest integer to e. In practice, base 2 seems simpler for logical computation, since logical statements (where allowed) have only two states (true or false).
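To make that cost comparison concrete, here is a minimal sketch in Python (my choice of language; the function name radix_economy is just an illustrative label I'm using here, not any standard API) that evaluates the per-digit cost factor b / ln b for a few bases:

```python
import math

def radix_economy(b: float) -> float:
    """Cost factor b / ln(b): representing N in base b takes about
    log_b(N) = ln(N) / ln(b) digits of b states each, so the total
    cost is (b / ln b) * ln N, and ln N is the same for every base."""
    return b / math.log(b)

for b in [2, math.e, 3, 4, 8, 10, 16]:
    print(f"base {b:6.3f}: cost factor {radix_economy(b):.4f}")
```

Running this shows the minimum at b = e (cost factor about 2.718); among integers, base 3 (about 2.731) narrowly beats bases 2 and 4, which tie at about 2.885 (exactly, in fact, since 4 / ln 4 = 2 / ln 2).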
However, both binary and ternary are far too tedious for us humans. In binary in particular, it is very easy to lose track of where you are when reading a number. For example, it is hard to tell the 8th digit of this number from the 10th:
101010101010101
Secondly, smaller bases make longer numbers, which compounds the problem mentioned above; a quick comparison is sketched below. That being said, I do not think that we will be using a different number system anytime soon. And even if we do, the most likely candidate is hexadecimal, not binary.
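To illustrate how quickly the digit count grows as the base shrinks, here is a small Python sketch (the to_base helper is something I'm defining purely for illustration; Python's built-in formatting only covers bases 2, 8, 10, and 16) that prints the binary number above in several bases:

```python
def to_base(n: int, base: int) -> str:
    """Render a non-negative integer as a digit string in the given base (2-16)."""
    if n == 0:
        return "0"
    digits = "0123456789abcdef"
    out = []
    while n > 0:
        out.append(digits[n % base])  # least significant digit first
        n //= base
    return "".join(reversed(out))

n = 0b101010101010101  # the example number above, 21845 in decimal
for base in (2, 3, 10, 16):
    s = to_base(n, base)
    print(f"base {base:2}: {s:>15}  ({len(s)} digits)")
```

The same value needs 15 digits in binary, 10 in ternary, 5 in decimal, and only 4 in hex; in general the length scales as 1 / log(base), which is exactly why hexadecimal works as a comfortable human-facing shorthand for binary.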