I totally agree that everyone's feelings are valid. That said, I do believe earning a university degree is a sign of work ethic, and it also shows you've spent formal time practicing how to learn and understand complicated concepts. I certainly wouldn't say it's universal, but I wouldn't be surprised if, in general, people with university degrees think more critically about political issues and are more likely to grasp advanced ideas that can be harder to understand without that kind of education.