innonimatu
the resident Cassandra
Did the western European countries with colonies actually benefit in any tangible way from the countries they colonized?
Did it help them gain more power? Income? Resources? Or, for all practical purposes, did they spend more on the colonies/lands they invaded than they took back from them?
Did the average person in those countries (England, Spain, France, Portugal, Netherlands) benefit from these colonies in any tangible way?
Also: how much difference did the colonies make in wars between the powers (such as WW1 and WW2)?
As countries, I think they all did. Some might have disappeared but for the usefulness of their colonies as bargaining chips; at the least, Portugal and the Netherlands were endangered by far larger, expansionist neighbors. Especially early on, the colonial trade was very profitable, and those profits were critical to the state's capacity to finance defensive wars in Europe.
In terms of resources and economic development, it can be argued about interminably. In Portugal the argument started back in the 15th century, over the questionable usefulness of attempting to conquer territory in North Africa (there was even a civil war over the direction the country should take), and continued down to the 20th century. And how can you even put a value on having half a continent speak your language, share some elements of your culture, and therefore be somewhat "closer"?
The least useful phase of European colonialism was the 19th-century colonial occupation of Africa. That was probably more damaging than profitable: Africa was an extremely harsh environment, and the colonial occupation did not last long enough for the benefits to be reaped. As with America, the opening of the continent to the world led to catastrophic local changes (the cattle plague, for one example), which even now are still being repaired or adapted to.