Thought so. This is why I answered as I did.
There are certain things that shouldn't be done, and that in a way legitimize the use of force to end them. The problem is just that wars don't work for this. We're dealing with a hypothetical, of course, but a moral question still has to take reality into account.
Invasive wars may occasionally make things slightly better, but they usually either make things worse or change nothing. We've seen this in practice in basically every war ever. The premise depends on the idea that you actually make things better, and installing a liberal government doesn't work even when it's the stated goal. You might say that's because such wars are really opportunistic and geopolitical in nature, about securing resources, and I'd agree, but that has to be part of the premise too, since that's what war is in practice, even today.
It's worked maybe twice off the top of my head: post-WW2 Germany and Japan.