By the same logic, no medicine would ever be sold, because humans are immeasurably more complex and more variable than computers are. I guess we can tick off that pseudo-argument, can we?
No, no we can't. It's not an argument. It's a simple fact.
Once again you've done a great job of defeating your own argument with your flawed analogy. Medicine does suffer the same headaches as computer software: allergies, interactions, long- and short-term side effects, etc.
Many of these side effects and negative drug interactions aren't uncovered until after a drug has been FDA-approved. Look at the rash of recent appetite-control medicines that have caused heart disease, or the speculation that antidepressants are involved in the rise of violent outbursts among teenagers taking them.
Why do drug companies make these mistakes? Because no quality-control process can possibly test the near-infinite possibilities of the human body.
Granted, computers are an order of magnitude less complex than the human body, but now we're getting into specifics. Drug companies also generally have far more money and time to bring a drug to market than game companies have to ship software.
Nice try, though. (Sort of).
And, by the way, you don't know what you're talking about - we have problems even with CD burners, and more so with DVD burners, in spite of years of standardization, so would you PLEASE stuff it. Have a look at the incompatibility lists published in good PC magazines instead. Learn something.
Great. So if you have problems with standardized hardware, imagine the headaches faced by people working with not just one but hundreds of technologies, each with its own (sometimes conflicting) standards. Welcome to large-scale software deployment, Captain Know-It-All.
I really do not know who told you that you knew anything about computers - that guy must have been a jester indeed. I won't go into depth here, but I recommend a bit of reading about IT standards, tolerances, etc. - ISA, EISA, SCSI, ATA and the like, just to name a few of the well-known ones.
My degree and years of experience in the industry count for something. You can lecture me all you want about IT standards, EISA, SCSI, ATA, etc.
Then we will talk about how well they're implemented in the real world on actual hardware.
Then you can go back and read some more textbooks and journals and pretend that's how the world works.
Excellent. Allow me to remind you that you have just judged your very own behaviour (in QUITE precise words, at that), since you, too, lack any "hard data" concerning how many people run CIV4 exactly the way it was promised and intended. Before you demand - deliver. I know you can't. So why pose as if you were sitting on hard facts facing a laughable hypothesis?
Do you know how arguments work?
I'm not trying to prove anything. Therefore, I'm not required to provide data. I'm not arguing that Civilization IV is not a buggy release. It may well be buggier than most. It may also be much less buggy than average. Until I (or you) see numbers, neither of us can argue in either direction.
My argument is more of a general condemnation of the notion that if a software developer must patch software after release, they have somehow failed or are "incompetent."
Until you can show me data indicating that the release of Civilization IV is any more of a technological headache than any other modern game release, the argument that Firaxis or Take2 have somehow done their customers a disservice is moot.
Prove some kind of negligence beyond the expected and inevitable bugs that developers must deal with both before and after release, and we will have a conversation about that. Until then, you're doing nothing but whining about the fact that you are a PC gamer, because all PC gamers (and PC game developers) must go through this unfortunate release/patch process.