Why do mods crash on Windows Vista but not on Windows XP?

Why do mods crash on Windows Vista but not on Windows XP?

Does anybody know?
 
I know how you feel. Mods have a nasty habit of crashing to the desktop when I try to start a game on a mod (I have Vista). I have a theory that it might have to do with custom DLL files allowing for more than 18 civs. Also check their .ini files; I've come across many problems there (such as files being incorrectly named, entries misspelt, or edits made in the wrong place). Sorry I can't be any more help.
 
Because Vista is nothing more than a buggy version of XP that's supposed to look prettier. Sorry, that's the honest truth; mods crashing is only one of the complaints I've seen about Vista. I've seen people complain about a whole host of software that Vista bugs out with but that operates fine in XP. And there is absolutely nothing Vista can do that XP can't, and XP literally does everything better. Unless an upgraded UI supersedes functionality and efficiency, there is simply no reason to use Vista. You're better off demanding XP when you purchase a new computer.
 
To be honest, Windows 2000 is a better gaming OS than Vista. Vista uses up a ton more system memory and, like phungus said, it's basically just a buggy version of XP with prettier graphics.

I have XP Media Center Edition and I don't have the slightest desire to ever get Vista.
 
Apparently, though, XP is soon to be moved into an extended-support category by Microsoft, so it will receive less support.

Windows 7 will apparently be less resource-hungry than Vista (though still heavy compared to XP).
 
Why do mods crash on Windows Vista but not on Windows XP?

Does anybody know?

Windows Vista crashes whenever an application performs an illegal activity. Windows XP tends to go along doing its own happy business.

For example, I had some code in FfH that might delete improvements, and some other code that checked improvements to see if certain things should be done. The pseudo-code looks like this:

1. Does this plot have an improvement?
2. If yes then:
2a. check to see if this improvement should be deleted.
2b. check to see if this improvement should spread to surrounding tiles.

If the improvement was deleted in step 2a, then the check in step 2b to see if the improvement should spread would be run against an object that didn't exist (i.e. it would try to check the spread chance on an improvement that was already gone).

Windows XP didn't care and just accepted the failure to check the spread chance. Windows Vista crashed to the desktop.
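
In plain C++ terms, the pattern looks roughly like the sketch below (hypothetical names, not the actual FfH or SDK code): a pointer is grabbed in step 1, the object behind it is deleted in step 2a, and step 2b then reads through the stale pointer, which is undefined behaviour that one OS may tolerate and another may not.

// Hypothetical sketch of the pattern described above (not the real FfH/SDK code).
#include <cstdlib>
#include <iostream>

struct Improvement {
    int spreadChance;
};

struct Plot {
    Improvement* improvement;              // null when the plot has no improvement

    void destroyImprovement() {
        delete improvement;
        improvement = nullptr;
    }
};

void doImprovementTurn(Plot& plot) {
    // 1. Does this plot have an improvement?
    if (plot.improvement == nullptr) {
        return;
    }
    Improvement* imp = plot.improvement;   // keep a local pointer to it

    // 2a. Check whether this improvement should be deleted.
    if (std::rand() % 100 < 10) {
        plot.destroyImprovement();         // 'imp' is now dangling
    }

    // 2b. Check whether this improvement should spread to surrounding tiles.
    // If 2a deleted it, this reads through a dangling pointer: undefined
    // behaviour that XP's heap often shrugged off and that Vista is much
    // more likely to turn into a crash to desktop.
    if (imp->spreadChance > 50) {
        std::cout << "spread to neighbouring tiles\n";
    }
}

int main() {
    Plot plot{new Improvement{75}};
    while (plot.improvement != nullptr) {
        doImprovementTurn(plot);           // may crash, may appear to work
    }
}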

You can call it whatever you want. You can say that Vista is more picky than XP, or that XP is more tolerant of bad code. Either way, the cause of the crash is really the modders; it's just that Vista doesn't let us get away with much.
 
Windows XP didn't care and just accepted the failure to check the spread chance. Windows Vista crashed to the desktop.

You can call it whatever you want. You can say that Vista is more picky than XP, or that XP is more tolerant of bad code. Either way, the cause of the crash is really the modders; it's just that Vista doesn't let us get away with much.
Part of writing good software is taking human error into account. Humans are not, and never will be, perfect (and neither are computers, for that matter). Vista's problems with imperfect code go way beyond recreational modders; I've seen many a complaint from IT guys and graphics designers. It's simply poor design on Microsoft's part to demand perfection in the real world. Empty variables and whatnot slip through, and part of a good operating system is to just ignore them if they cause no harm to its function. So no, I'm not going to accept your defense of poor-quality software that is supposed to be an upgrade of a superior system.
 
With "software" being as layered as it is, I always blame the first layer below (or above, depends on which direction you view it). In this case the layers are roughly Windows (OS) - Civilization IV (Application) - Mod (Application extension).

So I have to agree with Kael: it's the modder's fault in the first instance, but the modder will blame Firaxis for not providing easier tools to debug their mod (although there are enough in CIV to make an error-free mod), and Firaxis might curse Windows for being closed source, which hinders them in writing better code.

One thing I hate about CIV crashes-to-desktop is the lack of feedback. I'm not sure who's to blame there, but one thing I noticed is that error feedback changed from Warlords to BTS. With Warlords I seemed to get more information on why CIV was crashing...

I'm no fan of Vista, but generally I like strictness in software. In the long term it will lead to better programming. As a consumer-market OS, I consider Vista to be a failure, and Microsoft probably thinks so as well. If you want to upgrade, upgrading from XP to Windows 7 and skipping Vista is likely the sanest thing to do. Personally, I'll stick to XP for the years to come.
 
Will I find possible errors by using the debug DLL with XP? Or do I need to use Vista to find errors?

edit: I need help. I cannot start my mod with the new debug DLL; it says msvcp71d.dll is missing.

edit2: downloaded the file and put it in system32. still not working.

edit3: regsvr32 MSVCP71D.dll is not working either ("module not found")

edit4: copying the dll to D:\Spiele\Sid Meier's Civilization 4\Beyond the Sword did not help either :(
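
A side note on the msvcp71d.dll error, in case it helps: that file is the debug build of the Visual C++ .NET 2003 runtime library (as far as I know, the compiler the BtS SDK is built with), and regsvr32 only registers COM servers, so it won't do anything useful for a plain runtime DLL. As a rough check, the sketch below (build it with any Windows C++ compiler and run it from the Beyond the Sword folder) simply asks the Windows loader whether it can find the DLL at all, and where it found it:

// Sketch: ask the Windows loader whether msvcp71d.dll can be found, and from where.
#include <windows.h>
#include <cstdio>

int main() {
    HMODULE h = LoadLibraryA("msvcp71d.dll");
    if (h == NULL) {
        std::printf("Not found; error code %lu\n", GetLastError());
        return 1;
    }
    char path[MAX_PATH];
    GetModuleFileNameA(h, path, MAX_PATH);   // full path the loader resolved
    std::printf("Loaded from: %s\n", path);
    FreeLibrary(h);
    return 0;
}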
 
And there is absolutely nothing Vista can do that XP can't, and XP literally does everything better.

XP can't use DirectX 10. Pretty soon all the new games will require it, including Civ 5. If it ever comes out, of course.
 
XP can't use DirectX 10. Pretty soon all the new games will require it, including Civ 5. If it ever comes out, of course.


Ah, yes, DirectX 10: twice the system requirements, half the frame rate, and no discernible difference in visual quality :crazyeye:
 
Because Vista is nothing more than a buggy version of XP that's supposed to look prettier. Sorry, that's the honest truth; mods crashing is only one of the complaints I've seen about Vista. I've seen people complain about a whole host of software that Vista bugs out with but that operates fine in XP. And there is absolutely nothing Vista can do that XP can't, and XP literally does everything better. Unless an upgraded UI supersedes functionality and efficiency, there is simply no reason to use Vista. You're better off demanding XP when you purchase a new computer.

Vista has a TON of security improvements over XP (such as ASLR, which loads system files into memory at randomized addresses), plus backup, UAC (people complain about it, but every other OS has had something like it for ages now), the new interface (which actually uses less memory because it is rendered by the graphics card and not in system RAM), greatly improved search, the already mentioned DX10, etc. It is really nothing like XP; though they look similar and the only immediately obvious change is the interface, Vista has been completely rewritten from XP under the hood in some areas (particularly the sound and driver systems). Microsoft is starting to crack down on poorly written software; in fact, you can't even install a driver that hasn't been signed on a 64-bit version of Vista. You may think sloppy code is OK, but it's actually slower and less reliable than properly written code.
 
Right. You've obviously never seen any screen shots of Crysis. :rolleyes:

Here is a list of games with Direct X 10 support that I have played on Vista with DX10 and on XP with DX 9:

Call of Juarez
Assassin's Creed
Gears of War
Universe at War
Hellgate: London
Age of Conan

Not one of them provides any meaningful improvement in visual quality, let alone enough to justify the steep system requirements or the drop in frame rate.
 
One big issue that I struggled with when I first got Vista was that since Civ4 installs in the 'Program Files' directory, and the mods are expected to go there as well, you have to have full admin rights to the directory. Otherwise you end up with multiple copies of the same file if you make any edits. Vista tries to 'pretend' that you can write to the Program Files directory when you can't, and of course this fails miserably. Any time you have two copies of anything, they are guaranteed to diverge eventually, right? Microsoft didn't know that, I guess.

If you don't have full rights on the directory inside Program Files, you can open an XML file in Notepad, change it, save it, and it will work in Notepad. Open the same file in a different text editor and you will get the original file from before you made the changes in Notepad! Once you change a file you can never be sure which version of it you are using unless you hunt down the place where these 'fake' copies are stored and delete them. What a nightmare.
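
The mechanism behind this is Vista's UAC file virtualization: writes to Program Files from a program running without admin rights are silently redirected to a per-user shadow copy under %LOCALAPPDATA%\VirtualStore, so different programs can end up seeing different copies of the "same" file. As a rough way to check a particular mod file, here is a sketch (the install path and file name below are just placeholders; adjust them to your own setup):

// Sketch: detect whether Vista's file virtualization has made a shadow copy
// of a mod file under %LOCALAPPDATA%\VirtualStore, and which copy is newer.
#include <cstdlib>
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main() {
    // Placeholder path: adjust to your own install and mod file.
    fs::path real = "C:/Program Files/Firaxis Games/Sid Meier's Civilization 4/"
                    "Beyond the Sword/Mods/MyMod/Assets/XML/SomeFile.xml";

    const char* localAppData = std::getenv("LOCALAPPDATA");
    if (localAppData == nullptr) {
        std::cout << "LOCALAPPDATA is not set (pre-Vista system?)\n";
        return 1;
    }

    // VirtualStore mirrors the original path minus the drive letter.
    fs::path shadow = fs::path(localAppData) / "VirtualStore" / real.relative_path();

    if (!fs::exists(shadow)) {
        std::cout << "No virtualized copy; edits are going to the real file.\n";
        return 0;
    }

    std::cout << "Virtualized copy found at " << shadow << "\n";
    if (fs::exists(real) && fs::last_write_time(shadow) > fs::last_write_time(real)) {
        std::cout << "The VirtualStore copy is newer; virtualized programs see that one.\n";
    }
    return 0;
}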
 
That's to prevent malware from modifying programs. More security = more annoyance.

And EVERY version of Windows that has been released so far has needed higher specs than its predecessor to perform well. It's not specific to Vista. Windows 7 is the EXCEPTION.

People don't remember it, but XP got a far worse reception when it first came out than Vista did. Then Vista got delayed and everyone forgot what it's like to use anything else.
 
Wow, that scenario you described is exactly why Vista is a complete and utter failure, cephalo.

That would be soooo annoying it would not even be tolerable, at least for me. Honestly, if you just cough up about $50 a year for good antivirus and antispyware software, you won't have problems unless you're some kind of porn and spam-mail freak. I'd recommend PC Tools; I've had it for 2 years and haven't had a single problem with malware or spyware since. And once the minuscule security problems are resolved, tell me why else I would want Vista?

I don't ever remember a time when Vista was considered superior to XP. That's because it's not. I think most people agree that Vista is in fact a downgrade from XP: it uses up considerably more system resources for minimal graphics improvement and decreased system performance. Oh, and roughly half of your old games and plug-and-play devices won't work. You'll probably even have to go buy a new printer, fax machine, and scanner, if applicable.

Go ask some of those people in the Windows Mojave commercial. Every single one of them looked like it was the first computer they'd ever seen in their life.

Or ask the CEOs who released Windows 7 in record time to try and fix the giant pile of crap that is Vista.
 
People don't remember it, but XP got a far worse reception when it first came out than Vista did. Then Vista got delayed and everyone forgot what it's like to use anything else.

I think you're thinking of Windows 2000: now that OS was a nightmare, with massive compatibility problems (it was the first commercial MS system to use an NT base). I recall Windows XP being a breath of fresh air when it came out.
 
What XP had that Vista never had (and never will) is that it was the flagship OS for a very, very long time. It really began to be considered the wonderful thing it is today around SP2 (I think). It had worse compatibility, security, and bug issues when it first came out than Vista did. One thing it did not have was a widespread predisposition to hate it, born of delays in development and comfort with the prior OS. Windows expert Paul Thurrott was initially going to give Vista a very bad review until he decided to approach it from the perspective of a user who had never even heard of it before. Windows 7 is not coming out in record time: the gap between Windows 2000 (released 02/2000) and Windows XP (released 10/2001) was shorter, while Vista came out in 01/2007 and 7 is currently projected for late this year.
 
Vista takes elements from three different versions of windows.

Windows CE
Windows ME
Windows NT

Combined they become Windows CEMENT
 