That only delays the problem. After training them, you will have to increase their pay or risk losing them.
If you have devs already trained who do good work, it's quite often the best decision to keep them, even at a slightly higher cost. It's better to keep someone who has already proven their worth than to spend time and resources training someone new who may or may not do as good a job as the one you have.
You should always have new trainees though, to cover possible future vacancies, discover hidden gems, and bring in fresh ideas.
My opinion is that it's a cultural phenomenon that, if not unique to the last couple of generations, at least began with them.
I've had so many managers who:
- Think in terms of short-term success.
- Focus heavily on how they appear to higher ups, where managing optics and blame takes up more energy than making things work.
- Have an "I got mine, screw you" attitude.
- Have a boys' club or frat-boy attitude: if they hold the car keys/building keys/keys to the kingdom, they're entitled to use them as a personal playground for people they like.
- Show common narcissistic behavioral patterns, building loyal inner cliques and identifying outliers to persecute and turn the community against.
- Report to higher-up leadership that really does only care about appearances, not causes, and is firmly quarterly-minded.
There are some economic reasons for this. It's the result of "Modern Monetary Theory," which solves the problem of a poorly structured economy by essentially leveraging the entire economy against massive high-risk, high-return investments that, in theory, will finally fund the structurally wasteful parts of the economy. It also creates more waste. This makes the base structure of the economy unaffordable, so you can't really afford to invest in a studio, a company, people, etc. Ironically, there's a flood of money available for expensive high-return investments, but those are indeed measured on a quarterly basis. Lowly capital needs can't compete with quarterly demands, even though there's a ton of money available to spend and waste on failed investments, so long as there's a chance they could lead to high returns.
There's no feedback loop to reward doing things the "right" way, yet the right way is still the minimum expected, so the entire burden of performance falls on a shrinking class of experienced experts to make up for the institutional shortfalls in the places where they work. If you work somewhere caught in the money-hype loop, like AI, as a top expert, your compensation will be obscene. Outside of that, you're grinding longer hours with more expected of you. Got a skills gap? It's entirely your problem to solve, on your own time and at your own cost, always. And if you factor inflation and cost of living into your anticipated salaries, adjusted for the time value of money, you'll find that your anticipated lifetime wealth is closer to the poverty line than you'd think.
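Just to make the time-value-of-money point concrete, here's a minimal sketch; every number in it (the salary, raise rate, and discount rate) is a made-up illustration, not a figure from the post:

```python
# A hedged sketch: discount a hypothetical 30-year salary stream
# back to present value. All parameters are illustrative assumptions.
def lifetime_present_value(salary=90_000, years=30,
                           raise_rate=0.02, discount_rate=0.05):
    """Sum each year's salary, discounted back to today's dollars."""
    total = 0.0
    for t in range(years):
        nominal = salary * (1 + raise_rate) ** t      # pay after t years of raises
        total += nominal / (1 + discount_rate) ** t   # discount to present value
    return total

# 30 flat years of $90k naively "feels like" $2.7M, but with 2% raises
# against a 5% discount rate it's only about $1.8M in today's dollars.
print(f"${lifetime_present_value():,.0f}")
```

If raises trail inflation, which is the situation described above, the gap between the naive sum and the discounted value only gets wider.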
I think the only exceptions are niche foundational fields like industrial control systems, which are actually required for the real economy to function, so they do tend to pay their staff to go off and get training to keep up with trends.
Anyway, this seems to be an overarching reality: resource- and economy-driven, so 4X brain... And it's a systemic feature that could explain a lot of the economic and cultural constraints behind why the game turned out this way.
You know, there was a bill recently, I think in North Carolina, to grant immigration visas to medical professionals because of a shortage there. This is ironic because medical schools and the residency system in the US explicitly limit the number of doctors produced every year. The left hand isn't talking to the right.
In my case, while I did end up getting CS as a second degree, when I was first in college with a 5 on the AP CS AB exam I sought out the CS major. This was my state school, the only tuition I could afford, and I had been given a scholarship. I was told that CS was limited-enrollment, that you had to apply in your senior year of high school, that it was highly competitive with limited space, and that there was nothing I could do except transfer schools (where I wouldn't receive in-state tuition). What's up with that? Why are we restricting the number of CS professionals being produced, then complaining there aren't enough of them?
I think I literally dove into a rabbit hole of Civ 3 addiction after being mildly depressed about the above, so frankly I'm not very happy with what happened with Civ 7. Had I been working there, I think I would have gotten myself fired being a spark plug complaining about the game, which is the energy this industry once had, and according to rumors, that is literally what happened with the UI team.
So frustrating.