Dynamic FPKs and encapsulated dependencies.

A batch file for starting the game, rather than an EXE, would be a helpful alternative solution to that... but did you mention compiling the dll as part of that step?
Yeah that is what I suggested in a later post, it could also compile the dll. Basically a one stop batch file to do everything and then run the game.
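A minimal sketch of what such a one-stop launcher could look like, written as a Python wrapper rather than a raw batch file (the paths and the `build_dll.py` script name are hypothetical placeholders, not the repo's actual tooling):

```python
import subprocess
from pathlib import Path

# Hypothetical paths -- adjust to your checkout and Civ4 install.
REPO = Path(".")
GAME_EXE = Path("C:/Games/Civ4/Beyond the Sword/Civ4BeyondSword.exe")

def launch_steps(build_dll=True):
    """Return the commands a one-stop launcher would run, in order."""
    steps = []
    if build_dll:
        # Assumed helper script name; the real build entry point may differ.
        steps.append(["python", str(REPO / "Tools" / "build_dll.py")])
    steps.append([str(GAME_EXE), "mod=Caveman2Cosmos"])
    return steps

def launch(build_dll=True):
    """Optionally build, then start the game, stopping on any failure."""
    for cmd in launch_steps(build_dll):
        subprocess.run(cmd, check=True)
```

With `check=True` the game simply never launches if the DLL build fails, which is the behaviour you'd want from a "do everything" script.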

Ok, yeah the guide wasn't specific about that key difference. That's ok... I don't feel critically lost yet, just unsure, like a cat in an abandoned building it might make a home in.
Yeah I have split out a setup guide specifically for collaborators now, it is much simpler: https://github.com/billw2012/Caveman2Cosmos/wiki/Collaborator-Setup
I didn't finish the simplified "Introduction to git (for collaborators)" yet, will do it when I get home, and then make a post about it all.
 
Yeah that is what I suggested in a later post, it could also compile the dll. Basically a one stop batch file to do everything and then run the game
I suspect this would not work for compiling the dll because it would disrupt our ability to match the minidumps to the current dll build. I mean, it would work but it would mess up our ability to get any information from a minidump.

What's the reason you feel this would be needed anyhow? You've already made the 'work' of getting the build environment set up as easy as simply downloading VS and running your unpacker. Is the DLL too large for something here?

Yeah I have split out a setup guide specifically for collaborators now, it is much simpler: https://github.com/billw2012/Caveman2Cosmos/wiki/Collaborator-Setup
I didn't finish the simplified "Introduction to git (for collaborators)" yet, will do it when I get home, and then make a post about it all.
OK, cool.
 
I'll time the difference between FPK's and loose files, just so I know the difference myself; it's been a couple of years since I last tried only having loose art files.
Yup it loads a lot slower. A lot.
An increase from 15 secs to several minutes. (The loose files that replaced the FPK's were 22290 texture files and 6035 model files.)

I tried adding 246 .dds files as loose files and the launch time only increased to 17 secs.
So about 1 second per 100 loose files. (results may vary, no reliable way to approximate such a "time / loose art files")
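As a rough consistency check, a linear fit through those two measured data points (a simplistic model; as noted, load time won't really scale this cleanly) does roughly reproduce the "several minutes" figure for the fully unpacked case:

```python
# Two measured data points from the posts above, assuming linear scaling.
BASE_S = 15.0         # launch time with everything packed into FPKs
EXTRA_FILES = 246     # loose .dds files added in the second test
EXTRA_S = 17.0 - BASE_S

def estimated_launch_s(loose_files):
    """Crude linear estimate of launch time with N loose art files."""
    return BASE_S + loose_files * (EXTRA_S / EXTRA_FILES)
```

Plugging in the fully unpacked count (22290 + 6035 = 28325 files) gives roughly 245 seconds, i.e. about four minutes, which is consistent with the "several minutes" observation above.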
It definitely makes sense it would be a lot slower, but I definitely wouldn't suggest it is used like that regardless.
Currently the SVN guide recommends strongly that people DON'T use the Mods directory directly for their working copy, instead recommending export to that directory from the working copy that is stored somewhere else.
I could make a system that mimics that closely: you do an initial setup where you select your Mods folder, it records it in your working copy directory, and then a script syncs from one to the other, including both building the DLL and packing the FPKs *if they have changed*. What about that? Of course the DLL build could still push directly to the mod directory if you enable it, but it can use the same saved directory you set up before. I can also auto-detect the Civ directories in most cases, as 95% of the time they are in Steam or Program Files.
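A sketch of what that sync script could look like, assuming a small JSON config file recorded by the initial setup (the file name and layout are invented for illustration; the DLL/FPK rebuild hooks would slot in around the copy loop):

```python
import filecmp
import json
import shutil
from pathlib import Path

CONFIG = Path(".c2c_sync.json")  # hypothetical marker file in the working copy

def set_mods_dir(mods_dir):
    """One-time setup: remember where the Civ4 Mods folder lives."""
    CONFIG.write_text(json.dumps({"mods_dir": mods_dir}))

def sync(working_copy):
    """Copy files that are new or changed into the Mods folder; return them."""
    mods_dir = Path(json.loads(CONFIG.read_text())["mods_dir"])
    copied = []
    for src in Path(working_copy).rglob("*"):
        if src.is_dir():
            continue
        dst = mods_dir / src.relative_to(working_copy)
        if not dst.exists() or not filecmp.cmp(src, dst, shallow=False):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(str(src.relative_to(working_copy)))
    return copied
```

Because unchanged files are skipped, repeated syncs are cheap; only the first export pays the full copy cost.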
Nah, I don't see much of a need for this, having less than 100 loose files that overwrite files from the FPK's does not significantly increase game launch time. We rarely get more than 50 new loose art files between two major version releases.
If we do get many new loose files we could pack them to FPK's with your pack script even though it may still be some time before a new major version release.
No need to complicate the installation procedure for a small game launch speed improvement that involves repacking FPK's every time one does the SVN export mimic you talked about. It would increase the export time a bit, wouldn't it?

My point is that your script would be nice to have for streamlining FPK packing/unpacking, but I see very little need to actually pack FPK's more than twice a year at the most.
 
I suspect this would not work for compiling the dll because it would disrupt our ability to match the minidumps to the current dll build. I mean, it would work but it would mess up our ability to get any information from a minidump.
Yeah very true! Although it certainly isn't an insurmountable problem:

I would probably make a "submit crash report" script or something that would zip the minidump, dll and pdb together along with a file with context info in it (what revision they are on, if they have local changes, whatever other debug info is usually helpful etc.)
In fact it might even be possible to automate that from within the DLL. I have implemented such things before on a couple of games. e.g. When a crash happens you can get a nice dialog popup with a "do you want to send a crash report to the C2C team?" button, and it can do whatever we want. e.g. open up the forums or discord or something, and give them a zip file they can post.
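The script half of that idea could be very small. A sketch, assuming the reporter passes in their SVN revision by hand (the real context gathering would be more involved):

```python
import json
import zipfile
from pathlib import Path

def make_crash_report(minidump, dll, pdb, revision, out_zip="crash_report.zip"):
    """Bundle the minidump, dll, and pdb plus context info into one zip."""
    context = {"revision": revision}  # add local-changes info, OS, etc. here
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in (minidump, dll, pdb):
            # Store each file flat under its base name inside the archive.
            zf.write(path, Path(path).name)
        zf.writestr("context.json", json.dumps(context))
    return out_zip
```

Zipping the dll and pdb alongside the minidump is what guarantees the symbols always match the dump, regardless of what revision the reporter was on.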

In addition to that, something which would also be of benefit, even when keeping the compiled DLL in source control as it currently is, would be source indexing. If you don't know it: it embeds scripts into the pdb file (hopefully that is compatible with pdbs generated by the toolset we are using) that can fetch the exact versions of the source files the dll was compiled from directly from source control when you load a minidump (if you have the correct pdb to go with it). This stops the annoying "source code doesn't match" errors in Visual Studio, and also avoids you having to revert back to an older revision in source control.
As it currently is if you want to debug a minidump (I guess) you have to revert your working copy back to whatever version of the DLL they were using right? Or deal with the mismatching source code I guess?

What's the reason you feel this would be needed anyhow? You've already made the 'work' of getting the build environment setup as easy as simply downloading VS and running your unpacker. Is the DLL too large for something here?
I don't think it is needed; it is just normal practice to keep build deployment separate from the source repository in most cases. However, I put the DLL into git as it is without problems, and didn't plan to change it.

Yup it loads a lot slower. A lot.
An increase from 20 secs to several minutes.
Yeah, sounds about right! Don't want to do that then.
 
I would probably make a "submit crash report" script or something that would zip the minidump, dll and pdb together along with a file with context info in it (what revision they are on, if they have local changes, whatever other debug info is usually helpful etc.)
In fact it might even be possible to automate that from within the DLL. I have implemented such things before on a couple of games. e.g. When a crash happens you can get a nice dialog popup with a "do you want to send a crash report to the C2C team?" button, and it can do whatever we want. e.g. open up the forums or discord or something, and give them a zip file they can post.
Now that would be awesome. ^^
Yeah, sounds about right! Don't want to do that then.
FPK's are indeed necessary, I edited the post you quoted quite a bit, so I hope you'll look at it again.
 
No need to complicate the installation procedure for a small game launch speed improvement that involves repacking FPK's every time one does the SVN export mimic you talked about. It would increase the export time a bit, wouldn't it?

My point is that your script would be nice to have for streamlining FPK packing/unpacking, but I see very little need to actually pack FPK's more than twice a year at the most.

If that is what makes most sense for you then the scripts can be used like that. It can work on git as well, as long as the files are <=100MB; if that is done as standard then it will be 9 FPKs, I think. In that case I would focus on making the scripts as you originally described: they unpack the existing FPKs to a separate directory, merge the current art directory (minus the .bik files) into that newly created directory, repack it to FPKs, and delete the art directory. This can just be done periodically, when the amount of stuff in the art folder gets too large, or before a release.
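The workflow described above, sketched as a Python driver. The actual FPK pack/unpack steps are passed in as callables, because the real entry points are whatever the repo's scripts provide:

```python
import shutil
from pathlib import Path

def repack_art(art_dir, staging_dir, unpack, pack):
    """Fold the loose art overlay back into the FPKs.

    `unpack` and `pack` stand in for the repo's FPK scripts: unpack(dir)
    extracts the existing FPKs into dir, pack(dir) builds new FPKs from it.
    """
    staging = Path(staging_dir)
    unpack(staging)                    # 1. existing FPKs -> staging dir
    art = Path(art_dir)
    for src in art.rglob("*"):         # 2. overlay the loose art on top,
        if src.is_dir() or src.suffix == ".bik":   # skipping the .bik movies
            continue
        dst = staging / src.relative_to(art)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
    pack(staging)                      # 3. staging dir -> new FPKs
    shutil.rmtree(art)                 # 4. delete the now-redundant loose art
    art.mkdir()                        #    (keep an empty overlay dir)
```

Loose files overwrite their packed counterparts in the staging directory before repacking, matching the overlay semantics the game itself uses.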
 
Ok, so work to make it easier to submit bug reports and get the PDB to set itself to an older compatible state would be awesome.

What I don't really understand is... if you don't have VS on your system, or some other compiling software, how would a batch file that starts the game and compiles the dll on game load be useful? Is it actually able to compile without such software?

As it currently is if you want to debug a minidump (I guess) you have to revert your working copy back to whatever version of the DLL they were using right? Or deal with the mismatching source code I guess?
Yep... it sometimes doesn't work all that well with just the dll and pdb files either. You can often get better information if you run the minidump from the final_release folder in full, but that folder HAS to be unmodified since the final release build was generated and played for the minidump to be that effective. Still not as good as finding a repeatable crash point with a VS-attached debug dll run, but it's the best you've got when you can't get the crash to repeat.

Also, I don't know how this factors in, but for every dll we post updates on, Alberts2 and I have been doing completely fresh builds (delete the old final release and debug folders entirely and recompile from scratch) because we found there are some bugs that can take place when we don't. Maybe Alberts2 can explain the need for this more. It's been a while, and it's also possible that the strange compiler bug we were hitting has been resolved by more advanced compilers since then.
 
if you don't have VS on your system, or some other compiling software, how would a batch file that starts the game and compiles the dll on game load be useful? Is it actually able to compile without such software?
You don't need Visual Studio to build the DLL. All dependencies, including the toolset (the compiler, linker, nmake, etc.), are ALL in deps.exe; Visual Studio is just a nice interface for running command-line tools. You can in fact install the latest versions of Visual Studio with no compiler at all, although without the C++ component it won't load .vcxproj files, so it isn't much use!
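In other words, building is just a matter of putting the bundled toolset on the PATH and invoking nmake. A sketch (the `deps/toolset` location is an assumption about where deps.exe unpacks things, not a documented path):

```python
import os
import subprocess

TOOLSET_DIR = "deps/toolset"  # assumed unpack location; check your checkout

def nmake_command(makefile="Makefile", target="Release"):
    """The command line a build script would run; nmake ships in the toolset."""
    return ["nmake", "/f", makefile, target]

def build_dll(makefile="Makefile", target="Release"):
    """Run nmake with the bundled toolset first on PATH -- no VS required."""
    env = dict(os.environ)
    env["PATH"] = TOOLSET_DIR + os.pathsep + env.get("PATH", "")
    subprocess.run(nmake_command(makefile, target), env=env, check=True)
```

Visual Studio would ultimately run something very similar under the hood; the IDE itself is optional.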
Also, I don't know how this factors in, but for every dll we post updates on, Alberts2 and I have been doing completely fresh builds (delete the old final release and debug folders entirely and recompile from scratch) because we found there are some bugs that can take place when we don't.
Yeah I expect the old compiler isn't helping here :/
 
Yeah I expect the old compiler isn't helping here :/
So that's going to mean then that every time you go to play, if it's not on the latest code assets, the dll has to be completely rebuilt on the player side, right?

Seems between that and the FPKs, this could be a very long game start procedure...


it is just normal practice to keep build deployment separate from the source repository in most cases
Can you explain more about this? I mean, this is why I didn't include the source files in the release right?
 
So that's going to mean then that every time you go to play, if it's not on the latest code assets, the dll has to be completely rebuilt on the player side, right?
No: I got rid of the FPK packing being required, and the DLL build isn't required either; it is in the repo.
I set it up the way I would want it to be (i.e. stuff is always built locally, from scratch), but if other people don't like that it isn't a problem to keep it the same way it currently is. I already rewrote the FPK script to update the FPKs instead of creating them from scratch, and I never got rid of the DLL to begin with, I just also added scripts to build it as well.

Can you explain more about this? I mean, this is why I didn't include the source files in the release right?
There are a few reasons why you normally wouldn't put files that are built from the repository also in the repository. Some reasons don't apply to this project (like other platforms for instance), but some still could.

One reason is that binary files are not mergeable. So when you merge a branch you need to make sure you also rebuild any affected binary files to take into account the merged code. The same goes for the art assets of course, but as the art folder acts as an overlay on top of the FPKs it is less of an issue at least (FPKs shouldn't change outside of the main branch).
Another reason is that it can just lead to errors. If someone forgets to build, or forgets to submit the build, or submits one that is out of date, etc., you end up with a build that doesn't match the code. This leads to people chasing bugs that are meant to be fixed while the person reporting is swearing up and down they have the latest revision. And of course it's an extra thing you have to remember to do when you submit code changes.

However if these aren't significant problems on this project then they don't need a solution if it introduces overhead for everyone.
 
However if these aren't significant problems on this project then they don't need a solution if it introduces overhead for everyone.
Yeah, that's the way I see it.
However, the script you wrote for automatic FPK packing and unpacking would help a lot when we actually do mess with the FPK's, as it is a hassle to use PAKBuild manually to unpack/pack all the FPK's.
 
No: I got rid of the FPK packing being required, and the DLL build isn't required either; it is in the repo.
I set it up the way I would want it to be (i.e. stuff is always built locally, from scratch), but if other people don't like that it isn't a problem to keep it the same way it currently is. I already rewrote the FPK script to update the FPKs instead of creating them from scratch, and I never got rid of the DLL to begin with, I just also added scripts to build it as well.


There are a few reasons why you normally wouldn't put files that are built from the repository also in the repository. Some reasons don't apply to this project (like other platforms for instance), but some still could.

One reason is that binary files are not mergeable. So when you merge a branch you need to make sure you also rebuild any affected binary files to take into account the merged code. The same goes for the art assets of course, but as the art folder acts as an overlay on top of the FPKs it is less of an issue at least (FPKs shouldn't change outside of the main branch).
Another reason is that it can just lead to errors. If someone forgets to build, or forgets to submit the build, or submits one that is out of date, etc., you end up with a build that doesn't match the code. This leads to people chasing bugs that are meant to be fixed while the person reporting is swearing up and down they have the latest revision. And of course it's an extra thing you have to remember to do when you submit code changes.

However if these aren't significant problems on this project then they don't need a solution if it introduces overhead for everyone.
Great answers. I wouldn't take my feedback to be criticism immediately... I'm really just questioning at the moment so as to understand things.
 