How much do you know about polling? Which survey doesn't, to some extent, suffer from self-selection or non-response bias? Even if you go door-to-door, respondents must still be willing to answer the survey, and standard industry practice is to disclose the topic of the survey upfront as a matter of good practice and research ethics. If a respondent is interested enough to spend time answering your questions, that is itself self-selection.

Non-verified respondents are a problem not just for online surveys, but for pretty much any anonymous survey, and anonymous surveys are sometimes necessary. Those are distinct from surveys in which the answers are kept confidential but the identities of the respondents are known to the researchers, which allows the respondents to be verified.

Now, if you have an anonymous online survey that is checked against demographic quotas and weighted to match the general population, is that survey unreliable? Sure, it may not be as reliable as, say, stratified or truly random door-to-door sampling with verifiable respondents (in which respondents can still outright lie if they want to). But it's also not a worthless random poll. Simply dismissing any kind of online survey is like dismissing Wikipedia articles because they are Wikipedia articles - it's the opposite of demonstrating genuine knowledge of what you're talking about.

Another favourite of mine is the often-made claim that a general population survey of 1,000 people cannot possibly be representative. As I posted earlier in the thread, a lot of people in the UK apparently believe the same thing (and YouGov isn't exactly a random crappy research outfit).
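To make the weighting point concrete, here is a minimal sketch of post-stratification weighting, the basic technique behind "weighted to match the general population". All the numbers (age brackets, population shares, respondent counts) are made up for illustration; real pollsters weight on several variables at once, often via raking.

```python
# Post-stratification weighting sketch: over-represented demographic cells
# get weights below 1, under-represented cells get weights above 1, so the
# weighted sample matches known population shares. All figures are invented.

# Population shares by age group (e.g. from census data) - hypothetical
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Raw respondent counts from a hypothetical online panel of 1,000,
# skewed towards younger people, as online panels often are
sample_count = {"18-34": 450, "35-54": 350, "55+": 200}
n = sum(sample_count.values())

# Weight for each cell = population share / sample share
weights = {g: population_share[g] / (sample_count[g] / n)
           for g in population_share}

for g, w in weights.items():
    print(f"{g}: weight {w:.2f}")
```

With these invented numbers, young respondents are weighted down (0.67) and older ones up (1.75), so a tilted panel still produces population-proportionate estimates - provided the respondents within each cell aren't systematically unusual.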
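And on the 1,000-person claim: under simple random sampling, the margin of error for a proportion depends on the sample size, not the population size, which is why 1,000 respondents can represent 67 million people about as well as they represent 5 million. A quick sketch using the standard normal approximation (z = 1.96 for 95% confidence):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.

    p=0.5 is the worst case (largest margin); note that n is the only
    sample-dependent input - the population size doesn't appear at all.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=1,000:  +/-{margin_of_error(1000):.1%}")   # roughly +/-3.1%
print(f"n=10,000: +/-{margin_of_error(10000):.1%}")  # roughly +/-1.0%
```

So a well-drawn sample of 1,000 gives roughly a +/-3 point margin at 95% confidence, and going to 10,000 respondents only shrinks that to about +/-1 point - diminishing returns, which is exactly why 1,000 is the industry's usual sweet spot.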