Yeah, I didn't mention those.
But bottom line: on Noble, the player has no advantages over the AI, and the AI likewise has no advantages over the player, correct? The human player and the AIs are equal.
Nope. The AI has an advantage on noble: unit upgrade costs.
And does the AI also play smarter on higher difficulties?
No, but the bonuses open up situations that never arise at lower difficulties, so occasionally it will do things that surprise you because those bonuses let it meet certain conditions much earlier, such as a 1500 BC Deity DoW. The AI won't do that on Noble even though the code is the same, because it won't have the necessary unit types, massing, and war roles in time.
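To illustrate the point, here's a minimal sketch (hypothetical names and thresholds, not the actual Civ 4 DLL code): the war-readiness check is identical at every difficulty, but production and upgrade discounts let a Deity AI satisfy it far sooner than a Noble AI.

```python
# Hypothetical sketch, not the real Civ 4 AI code: same check at every
# difficulty; only the game state the bonuses produce differs.
from dataclasses import dataclass

@dataclass
class AIState:
    attackers: int     # massed attack-role units
    has_axemen: bool   # the right unit types exist
    prep_turns: int    # turns spent preparing the war plan

def ready_for_early_war(ai: AIState) -> bool:
    """Identical condition regardless of difficulty level."""
    return ai.attackers >= 8 and ai.has_axemen and ai.prep_turns >= 10

# With Deity discounts these numbers are reachable around 1500 BC;
# on Noble's even footing the same check stays False until much later.
deity = AIState(attackers=9, has_axemen=True, prep_turns=12)
noble = AIState(attackers=3, has_axemen=False, prep_turns=4)
print(ready_for_early_war(deity))  # True  -> early DoW
print(ready_for_early_war(noble))  # False -> no early DoW
```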
I mean, the best way would be to not have any advantages at any difficulty and just make it play smarter, like in a chess game. When you raise the difficulty in a chess game, the CPU doesn't get any "bonus", it just plays better.
Writing an AI like this is incredibly difficult, and I say this w/o even being a programmer. I have made XML tweaks to the AI that make it a lot harder to deal with, albeit a little unfair (hiked unitprob, far fewer diplo limiters on trade or its willingness to declare war, and far more prep before it's willing to declare).
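For anyone curious how a tweak like that might be scripted rather than edited by hand, here's a hedged sketch that bumps a leader's unit-training weight in CIV4LeaderHeadInfos.xml. I'm assuming "unitprob" refers to the iBuildUnitProb tag; treat the exact tag name and file layout as assumptions to verify against your own XML.

```python
# Hedged sketch: raise every leader's iBuildUnitProb by 15 (capped at 100).
# The tag name and element structure are assumptions based on the vanilla
# CIV4LeaderHeadInfos.xml layout; check your copy before running.
import xml.etree.ElementTree as ET

tree = ET.parse("CIV4LeaderHeadInfos.xml")
for leader in tree.getroot().iter("LeaderHeadInfo"):
    prob = leader.find("iBuildUnitProb")
    if prob is not None:
        prob.text = str(min(100, int(prob.text) + 15))  # hike unit-training weight
tree.write("CIV4LeaderHeadInfos_modded.xml")
```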
Within constraints, I'm very good at setting up screw-jobs for human players in the AI. I also made a custom script in Warcraft III that hit people hard with defended footmen. If I abused the insane resource bonuses and made it go triple rax, almost nobody on bnet could beat it unless they found it early and harassed well, and even then that generally required them to be Orc.
But making an AI with no holes for humans to exploit is borderline impossible. You're pitting one or several people with incomplete information against masses of gamers that adapt dynamically. It's a ridiculous recipe. A very good gamer who also programs well might be able to make a decision tree for the AI so that it micros very well and generally plays solidly, but in order for it to be hard for the human to counter, it would also have to pick between meaningful alternatives arbitrarily, such that the human couldn't rely on behavior patterns even if he learned the AI's decision trees (or couldn't easily identify which patterns the AI is using).
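A toy sketch of that last idea (hypothetical names, not anyone's actual implementation): score several genuinely viable plans, then pick at random among the ones that are close to the best, so a human can't key off a fixed pattern even after learning the decision tree.

```python
# Hedged sketch: arbitrary choice between meaningful alternatives.
import random

def choose_plan(plans, score, tolerance=0.9):
    """plans: candidate actions; score: plan -> float (higher is better)."""
    scored = sorted(plans, key=score, reverse=True)
    best = score(scored[0])
    # Every plan within `tolerance` of the best counts as a meaningful alternative.
    viable = [p for p in scored if score(p) >= best * tolerance]
    return random.choice(viable)

# Toy usage: three openings the AI rates as roughly equal.
values = {"rush": 9.1, "expand": 8.8, "tech": 6.0}
print(choose_plan(list(values), values.get))  # randomly "rush" or "expand", never "tech"
```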
But making such a tree without experience playing the game and seeing how everyone else handles situations isn't feasible. It makes plenty of sense that the BTS AI was far better - it was adapted based on human play. The Better AI project continues the work to this day, and AFAIK it's gotten better, but you're not going to get an AI that's 100% free of abuse without monumental investments of time and skill. It is much easier for programmers to hand the AI some bonuses if the goal is to add challenge. Good games find a reasonable balance between a great AI and the bonuses, or just trivialize single player and design around multiplayer (not a good model for Civ, but it worked great in StarCraft, where the AI was woeful but MP very strong).