My issue with Nate's model this time around is that it remains agnostic about how undecided and third-party voters will break toward the two major-party candidates in the final vote count, which puts an equal tail of uncertainty on both sides of the electorate. Earlier in the process that's an understandable approach, because there isn't much data to indicate which way the uncertainty should be weighted. But that's no longer the case: we now have both early-voting and polling data that indicate which way these various buckets of voters are going. Not attempting to model that asymmetry feels like a poor choice, particularly since his probabilities can make big moves on relatively small changes in polling margins. He simply cuts the undecideds right down the middle, which makes little sense especially in deep blue or deep red states, where the undecideds will almost certainly break with the state's political leanings.
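To make the contrast concrete, here's a minimal sketch of the two allocation rules. This is not Silver's actual model; the function names and the 0.8 break rate for a deep blue state are my own illustrative assumptions:

```python
def split_even(dem, rep, undecided):
    """Symmetric split: undecideds break 50/50 regardless of the state."""
    return dem + undecided / 2, rep + undecided / 2

def split_by_lean(dem, rep, undecided, dem_lean):
    """Lean-weighted split: undecideds break with the state's partisanship.

    dem_lean is an assumed share of undecideds breaking Democratic,
    e.g. ~0.8 in a deep blue state, ~0.5 in a true toss-up.
    """
    return dem + undecided * dem_lean, rep + undecided * (1 - dem_lean)

# A hypothetical deep blue state polling 55-35 with 10 points undecided:
print(split_even(55, 35, 10))          # (60.0, 40.0) -> 20-point margin
print(split_by_lean(55, 35, 10, 0.8))  # (63.0, 37.0) -> 26-point margin
```

The point of the toy numbers is just that the two rules produce materially different margins in lopsided states, which is exactly where the 50/50 assumption is least plausible.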
As he has noted innumerable times since 2012, polling is getting ever harder to carry out and ever less reliable, and there are simply fewer polls now that newspapers have smaller budgets to spend on them. He has basically baked all of this into his model as uncertainty that breaks either way, but that feels like an unsatisfactory solution to me.