Paging System v2

billw2015

I'm experimenting with this at the moment after noticing that we have granular control over what parts of a plot are rendered or not.
I modified the paging system so it can load/unload all these parts separately.
Here you can see the results from loading only one of the component types for the whole map:
Code:
Component   Mem use (MB)   Diff (MB)   What's rendered
NONE        1403           0           (nothing)
SYMBOLS     1422           19          Yield symbols
FEATURE     1612           209         Improvements, terrain features, resources, resource icons
RIVER       1453           50          Rivers
ROUTE       1430           27          Routes
UNIT        1680           277         Units
CITY        1560           157         Cities
ALL         1964           561         All of the above
It is also worth noting that the speed with which these can be turned on and off varies wildly, with the FEATURE component being by far the slowest (it uses the plot builder, which I guess is slow).

I have now modified the paging system so that these components can be toggled at different distances, so you can have yield symbols, roads, and rivers always on, and units, cities, and features paged in only when you are looking at them.

I also modified it so that it pages in progressively instead of trying to do it all in a single frame; this smooths the experience when panning the camera (it pages in more when you move slowly, less when you move fast).
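Something like this, as a minimal sketch of the budgeting idea (all names here are illustrative, not the actual C2C code):
Code:
// Minimal sketch of a per-frame paging budget; all names are
// illustrative, not the real C2C paging code.
#include <algorithm>

int PlotsToPageThisFrame(float cameraSpeed)
{
    const int maxPerFrame = 32;     // budget when the camera is still
    const int minPerFrame = 2;      // budget at full panning speed
    const float fullSpeed = 100.0f; // speed treated as "panning fast"

    // Scale the budget down linearly as the camera speeds up, so fast
    // pans defer most of the paging work to later frames.
    float t = std::min(cameraSpeed / fullSpeed, 1.0f);
    return maxPerFrame - (int)(t * (maxPerFrame - minPerFrame));
}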

//edit: I updated the unit numbers after making all tiles visible; they are now the biggest cost. Of course you usually can't see ALL tiles, so you can take a fraction of this number as the "real" cost.
 
One way to reduce those numbers for features would be to considerably reduce the amount of detail their NIF models have. Though I doubt people would want blurrier terrain or improvements...

Speaking of which, is this from a mature game? I wonder whether the memory use is mostly due to terrain or improvements.
 
This is an impressive degree of understanding and control you're taking over the rendering here. It gives us hope of seeing the future we've always planned for C2C. Thank you for this! Very cutting edge.
 
Is this Nightmare? On that handicap the AI will be most advanced by the 1900th turn.
What era are they in, by the way?
1900 AD, not 1900 turns; it's 3000+ turns so far I think. It's below Noble difficulty cos I don't want my AI to get wiped out in my autoplay! He is still losing badly though...
 
1900 AD, not 1900 turns; it's 3000+ turns so far I think. It's below Noble difficulty cos I don't want my AI to get wiped out in my autoplay! He is still losing badly though...
You can always plop a hyperspace tile in a corner of the map in the middle of the ocean, place a hyperseedship on it, and delete the rest of your AI.
No one will ever find you.

It can be any space tile and any unit that can't go on an Earth tile ;)
 
1900 AD, not 1900 turns; it's 3000+ turns so far I think. It's below Noble difficulty cos I don't want my AI to get wiped out in my autoplay! He is still losing badly though...
I wonder if we can figure out the bug that makes the AI not do anything the first round of an autoplay engagement. See if you can confirm this, but last I knew, if you autoplay one round at a time, literally nothing gets decided because the AI doesn't actually kick in to make decisions until the next turn. This basically makes every round you stop during an autoplay a lost round.
 
Does this take into consideration the fact that some work has been done to allow multiple features on a plot? At one stage there was talk about multiple resources on a plot, but I don't think any work was done on that.

Yield display on a plot is not always on. It comes on "automatically" when you have a settler unit selected, and it is shown in the City screen. I rarely use the button that turns it on/off and would prefer it off by default.
 
Does this take into consideration the fact that some work has been done to allow multiple features on a plot? At one stage there was talk about multiple resources on a plot, but I don't think any work was done on that.

Yield display on a plot is not always on. It comes on "automatically" when you have a settler unit selected, and it is shown in the City screen. I rarely use the button that turns it on/off and would prefer it off by default.
Multiple features was on and enabled, but it did not give the plot any graphics past the primary feature slot. There were some unresolved problems, so it's all commented out at the moment.
 
I didn't change the behaviour of any of the individual components of the plot; I just separated out their creation and deletion.
 
So some further investigation has given me an idea about why we crash above 2GB of memory usage, even though we should be able to allocate up to 3GB with /LARGEADDRESSAWARE, and why the crashes happen in DirectX code. See here for details: https://stackoverflow.com/a/22745579/6402065.
TL/DR: a normal 32-bit signed int overflows above 2GB (the value goes negative), so if any function in the exe or any DLL (e.g. d3d9.dll!) uses int instead of unsigned int for addresses, failure will happen as soon as we start allocating above 2GB and passing addresses to those functions. It doesn't matter how much memory is free; we can't pass any of it to unsafe functions like these.
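To illustrate the failure mode (a standalone toy program, not anything from the game):
Code:
// Standalone illustration of the overflow, assuming a 32-bit build:
// any code that stores an address in a signed 32-bit int corrupts
// pointers above the 2GB line.
#include <cstdio>

int main()
{
    unsigned int highAddr = 0x90000000u; // an address above the 2GB line
    int asSignedInt = (int)highAddr;     // what an API taking 'int' sees

    // Prints: 2415919104 -> -1879048192. Any "negative means error"
    // check now misfires, and using the value as a pointer again
    // refers to the wrong address.
    printf("%u -> %d\n", highAddr, asSignedInt);
    return 0;
}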
This would explain why the crash happens in d3d9.dll when we are at around 2GB of allocated memory but haven't hit an out-of-memory crash.
I tried to test this hypothesis using the flag described in that Stack Overflow link to force allocation to start from the highest available memory address instead of the lowest (the usual default), and I couldn't even get into the game; the shader loading itself failed (another d3d function, I would guess).
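If I have the right flag in mind, that one is a global setting, but the same top-down behaviour can also be reproduced per-allocation with VirtualAlloc's MEM_TOP_DOWN flag, which makes for a quick probe (a sketch, assuming a 32-bit /LARGEADDRESSAWARE build on Windows):
Code:
// Quick probe: does a top-down allocation land above the 2GB line?
#include <windows.h>
#include <cstdio>

int main()
{
    void* p = VirtualAlloc(NULL, 64 * 1024,
                           MEM_RESERVE | MEM_COMMIT | MEM_TOP_DOWN,
                           PAGE_READWRITE);
    if (p)
    {
        printf("top-down allocation at %p (%s 2GB)\n", p,
               (UINT_PTR)p >= 0x80000000u ? "above" : "below");
        VirtualFree(p, 0, MEM_RELEASE);
    }
    return 0;
}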
I am not certain that what I describe here is the reason for the crashes, but it certainly seems to explain them. I'm not sure there is any real fix for this problem either, sadly, other than perhaps hacks to the d3d DLLs themselves (this is fairly common; shader replacers tend to do it). And paging things in and out sadly makes this problem worse, as allocation of the d3d handles happens after everything else is loaded, and they keep getting re-created, giving a higher chance they will be pushed above the 2GB boundary.
I hit these problems again when I was testing an approach to detecting low memory by performing test allocations, hoping I could get above the more conservative limits the max memory cap sets. Sadly it crashed well before it got to the point of actually being unable to perform allocations.
 
Multiple features was on and enabled, but it did not give the plot any graphics past the primary feature slot. There were some unresolved problems, so it's all commented out at the moment.
Didn't someone remove that completely from the code?
 
Just make a note of the revision where it was removed. The multi-feature mod and the pipeline will be around forever on GitHub.
 
So some further investigation has given me an idea about why we crash above 2GB of memory usage, even though we should be able to allocate up to 3GB with /LARGEADDRESSAWARE, and why the crashes happen in DirectX code. [...]
One thing you could do, although it is a bit of low-level effort, is to bypass the runtime and allocate all the pages above 2GB from the OS at the start of the DLL. Then add a custom allocator for those pages and overload new to try that custom allocator first, falling back to the normal runtime allocator only if it fails (because the extra 1GB is used up). For delete, you can check whether the pointer is above or below 2GB to choose the right allocator to pass the deletion to. That way any allocations the exe does should go into the lower 2GB, including the problematic graphics allocations.
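Something along these lines, as a very rough sketch of the suggestion (a trivial bump allocator for illustration only; none of these names are from the mod, and a real version would need alignment handling, thread safety, and a proper free list):
Code:
// Rough sketch: reserve a pool above 2GB up front, serve the DLL's
// 'new' from it first, and fall back to the runtime heap when it
// runs out, leaving the lower 2GB free for the exe and d3d.
#include <windows.h>
#include <cstdlib>
#include <new>

static char*  g_highPool = NULL;
static size_t g_highUsed = 0;
static const size_t HIGH_POOL_SIZE = 512 * 1024 * 1024;
static const UINT_PTR TWO_GB = 0x80000000u;

void InitHighPool()
{
    // MEM_TOP_DOWN asks for the highest free range, which in a
    // /LARGEADDRESSAWARE process should sit above the 2GB line.
    g_highPool = (char*)VirtualAlloc(NULL, HIGH_POOL_SIZE,
        MEM_RESERVE | MEM_COMMIT | MEM_TOP_DOWN, PAGE_READWRITE);
    if (g_highPool && (UINT_PTR)g_highPool < TWO_GB)
        g_highPool = NULL; // didn't get a high range; skip the pool
}

void* operator new(size_t size)
{
    if (g_highPool && g_highUsed + size <= HIGH_POOL_SIZE)
    {
        void* p = g_highPool + g_highUsed; // bump-allocate from the pool
        g_highUsed += size;
        return p;
    }
    void* p = malloc(size); // pool exhausted: normal runtime heap
    if (!p) throw std::bad_alloc();
    return p;
}

void operator delete(void* p)
{
    // The trick from the post: the address itself tells us which
    // allocator owns the pointer.
    if ((UINT_PTR)p >= TWO_GB)
        return; // pool memory; a real pool would recycle it here
    free(p);
}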
 
One thing you could do, although it is a bit of low-level effort, is to bypass the runtime and allocate all the pages above 2GB from the OS at the start of the DLL. Then add a custom allocator for those pages and overload new to try that custom allocator first, falling back to the normal runtime allocator only if it fails (because the extra 1GB is used up). For delete, you can check whether the pointer is above or below 2GB to choose the right allocator to pass the deletion to. That way any allocations the exe does should go into the lower 2GB, including the problematic graphics allocations.
I truly wish I had a clue how to do that... but it sounds like as long as we didn't get too crazy, this approach could enable us to break ALL boundaries and would certainly pave the way for Multimaps to become a reality. Jeez... this is truly an amazing proposal.

What are the chances we could get you back on the team to help us with this directly, @AIAndy? You've seen how much progress Bill has made in making the development environment much more programmer-friendly, right? I know one of your complaints was how long one has to wait for the DLL to compile with every adjustment. I think you'd really enjoy working with this current team.
 
One thing you could do, although it is a bit of low-level effort, is to bypass the runtime and allocate all the pages above 2GB from the OS at the start of the DLL.
Yeah, overriding the allocation system in the exe was one of the two things I was considering, the other being just remapping all the objects between the exe and d3d, if the problem was there. However, upon (much) further investigation, it turns out the error is really just that we allocate too many lights! We hit the current limit of 1024 at one time when paging, but not when just showing everything at once.

I actually fixed the initial problem by intercepting the d3d9.dll light functions and rejecting the -1 light index; however, it now crashes the exe on light *deallocation*, of course! So investigations are ongoing...
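For anyone curious, "intercepting" here means something roughly like the following hypothetical vtable hook (the slot index is derived from the d3d9.h declaration order and would need verifying; none of this is the actual code):
Code:
// Hypothetical sketch of swallowing a bad light index via a d3d9
// vtable hook. IDirect3DDevice9::LightEnable is a real method; the
// hook mechanics are simplified (x86 only, no thread safety).
#include <windows.h>
#include <d3d9.h>

typedef HRESULT (__stdcall* LightEnable_t)(IDirect3DDevice9*, DWORD, BOOL);
static LightEnable_t g_origLightEnable = NULL;
static const int LIGHT_ENABLE_SLOT = 53; // verify against your SDK headers

static HRESULT __stdcall HookedLightEnable(IDirect3DDevice9* dev,
                                           DWORD index, BOOL enable)
{
    // When the engine runs out of light slots it passes -1 (0xFFFFFFFF);
    // swallow it instead of letting d3d9.dll fall over on the bad index.
    if (index == 0xFFFFFFFF)
        return D3D_OK;
    return g_origLightEnable(dev, index, enable);
}

void InstallLightHook(IDirect3DDevice9* dev)
{
    void** vtable = *(void***)dev;
    DWORD oldProtect;
    VirtualProtect(&vtable[LIGHT_ENABLE_SLOT], sizeof(void*),
                   PAGE_READWRITE, &oldProtect);
    g_origLightEnable = (LightEnable_t)vtable[LIGHT_ENABLE_SLOT];
    vtable[LIGHT_ENABLE_SLOT] = (void*)HookedLightEnable;
    VirtualProtect(&vtable[LIGHT_ENABLE_SLOT], sizeof(void*),
                   oldProtect, &oldProtect);
}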
 