Though, question: if the existing game already contains portrait/icon art whose sizes deviate from the power-of-two rule, why would modders need to follow it for the sake of users with obsolete/inadequate GPUs? Those users wouldn't be able to play the existing game without graphical errors anyway. I don't understand.
It's related to what I said earlier about civ5 splitting some textures into smaller square power-of-two textures. The problem is that the game does not always do it, and we don't know when it does. Maybe it's related to the texture format, maybe to its size, maybe to where it's used, etc. Place your bets!
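To stay on the safe side of that rule, a modder can simply pad texture dimensions up to the nearest power of two before exporting. Here's a minimal sketch of the arithmetic involved; the function names are mine, not anything from the game or its tools:

```python
def is_power_of_two(n: int) -> bool:
    """True if n is a positive power of two (64, 128, 256, ...)."""
    return n > 0 and (n & (n - 1)) == 0

def next_power_of_two(n: int) -> int:
    """Smallest power of two >= n, for a positive n."""
    p = 1
    while p < n:
        p *= 2
    return p

# e.g. a 200x57 portrait would be padded to a 256x64 canvas,
# which any power-of-two-only GPU can handle:
print(next_power_of_two(200), next_power_of_two(57))  # 256 64
```

The idea being that the extra padding is just transparent pixels, and the UI only displays the original region of the texture.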
Also, I did a bit of Googling, and it seems like general support for non-power-of-two texture sizes, with Nvidia for example, goes back at least as far as the 6 series, about 7 years ago. Within the context of computer technology, that's pretty old.
Well, first of all, you need to realize that most computers only have an integrated GPU chipset (they only became decent recently), and a game publisher cannot just deem those consumers worthless and not support them, especially for a game like civ5, which doesn't have the same user base as Crysis. Now add the fact that the average PC lasts many years, and that you must look at the whole world, especially India and China, with their 2.5 to 3 billion potential customers who often have terribly old computers where 800*600 is not uncommon (hence why Firaxis actually chose 800*600 as their minimum resolution, which is a severe constraint when it comes to UI design). And if you think that at least since Vista (2007), every new computer has had decent enough graphics capability to match the Vista label, well, it's not true: even big brands released computers after that which displayed the Vista label but were unable to run Aero.
But the real dark side of 3D lies in driver hell. Roughly speaking, about half of all computers either have no 3D drivers, have bad drivers, or have the very first versions of their drivers, the ones full of bugs. The result of this and the numerous bugs (drivers are fortunately improving nowadays, but they used to be real crap) is that many GPUs are detected as older, sometimes way older, than they really are, or as not supporting features they actually could support. Yep, some people have modern GPUs that are detected as year-2000 hardware. Let's talk about Intel too: sure, they released DX9-compatible hardware as early as 2006... but the pixel shaders were run in software! A bit later they released OpenGL 2.x-compatible hardware (I don't remember the exact version), but it then took them three years to release a driver that could actually use those capabilities!
Nowadays, in 2012, more than half of all computers simply cannot run Firefox with hardware acceleration, and Chrome disables it by default. But since games cannot just run without 3D, many of them choose lower requirements than Firefox and keep a pipeline compliant with old GPUs, sometimes as old as early DX9 (2003), while providing other pipelines for more recent computers. Unfortunately this is common in the computer industry: much of the software and many of the webpages we use are still designed for 800*600, while I've been running at 1920*1080 for six or seven years now. Games are no exception, and the poor quality of drivers makes things even worse; this is the real pain in 3D programming.