Don't talk to me about "the good old days" of gaming. When I was younger, you bought a game and it worked, and it was free of bugs. Times change, software has to be rushed, and a few bugs get in... I can understand that. I can even appreciate a company making some sacrifices when it's more cost-effective to wait and see whether problems pop up than to test and retest every possible scenario. But if a program is going to do anything, it had better be able to execute.

We've already accepted lower standards: we accept that software comes chock-full of bugs and that it's partly the paying customer's responsibility to fix them by hunting down the patch. But since when do we accept software that doesn't necessarily work at all? You've been around the computer world a long time, and this doesn't offend you? If we're paying money for a program that won't necessarily operate, then how long before we're paying money for a program that doesn't necessarily exist or carry any promise of ever being written? However many technical hurdles there may be in creating a video game, I guarantee that technologies like DirectX and XML are making developers' jobs easier, not harder. They should not be pointed to as the causes of fatal bugs.