It sounds like you might be able to get away with a good dusting and perhaps reapplying thermal paste between the CPU and its cooler. 100°C is the point where most CPUs start throttling (slowing themselves down) to avoid damage, so it's likely your CPU is slowing down for exactly that reason. The longevity of thermal paste varies considerably by brand, but if you have a Z170-compatible Skylake (Intel 6000-series) processor, it's old enough that the paste may well have dried out and stopped conducting heat away from the CPU effectively.
Fixing whatever is making the CPU cooling ineffective may obviate the need for an upgrade entirely: the CPU could then run at full speed again.
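Before buying anything, it's worth confirming the throttling theory by watching temperatures under load (HWiNFO on Windows, or lm-sensors on Linux). As a minimal sketch, this reads the standard Linux sysfs thermal interface; zone names vary by platform (`x86_pkg_temp` is a common one for the CPU package), and the function simply returns an empty result on systems that don't expose this interface:

```python
from pathlib import Path

def read_temps(base="/sys/class/thermal"):
    """Return {zone_name: temperature_in_celsius} for each thermal zone.

    Returns an empty dict on systems without the sysfs thermal
    interface (non-Linux, or some virtual machines).
    """
    temps = {}
    base_path = Path(base)
    if not base_path.is_dir():
        return temps
    for zone in sorted(base_path.glob("thermal_zone*")):
        try:
            name = (zone / "type").read_text().strip()
            # sysfs reports temperatures in millidegrees Celsius.
            millideg = int((zone / "temp").read_text().strip())
        except (OSError, ValueError):
            continue  # unreadable zone; skip it
        temps[name] = millideg / 1000.0
    return temps

if __name__ == "__main__":
    for name, celsius in read_temps().items():
        flag = "  <-- throttling territory" if celsius >= 100 else ""
        print(f"{name}: {celsius:.1f} C{flag}")
```

Run it while the CPU is under sustained load (a game, a stress test); if the package temperature sits at or near 100°C, throttling is almost certainly the cause of the slowdown.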
Aside from that, I expect Q4 2022 to be a good time to upgrade, at least for CPUs. AMD has new CPUs out, and their new platform will allow CPU upgrades through at least 2025, which considerably exceeds what Intel offers (usually only one upgrade the following year). Intel's new offerings arrive sometime in October, and new GPUs from AMD around early November, I think. NVIDIA just launched high-end GPUs but has not announced mid-range offerings; still, with the collapse of crypto mining, GPU prices are falling. Q1 2023 may be even better in terms of timing... but there hasn't been a better time than now in the past 2.5 years. SSDs are cheap. DDR4 RAM is cheap. DDR5 isn't cheap, but it's on the way down and not nearly as expensive as it was six months ago.
Personally, I'm leaning heavily towards AMD for my likely CPU/motherboard/RAM upgrade, mainly because their parts are much more energy-efficient and often also perform better in multi-threaded tasks (i.e. productivity rather than gaming). Cap their new CPUs at 65 watts, and they are far better than Intel in efficiency. Combined with being able to upgrade the CPU until at least 2025 without replacing the motherboard, I think that more than makes up for the higher cost of the required DDR5 (Intel's newest CPUs can use either DDR5 or DDR4, the latter being cheaper with only a small performance penalty).
Unless you are seeing GPU limitations, I'd be inclined to stick with the 1080 Ti; it's still a good GPU. And with the CPU hitting 100°C, some apparent GPU limitations may actually be CPU limitations imposed by thermal throttling.
GPUs are an even better case for waiting than CPUs. The second-hand market is likely to be flooded with ex-mining GPUs within weeks, driving prices down across the board, and AMD's November launches should also improve the value proposition. IMO most GPUs are still overpriced relative to pre-pandemic levels at the moment. AMD's RX 6600 isn't a bad value, but it isn't worth switching to from a 1080 Ti. The RTX 3050/3060 are overpriced relative to their performance now, but might fall quite a bit by the end of the year or early next year. It looks like NVIDIA is holding off on introducing new mid-range GPUs because they have excess inventory of existing RTX 3000-series parts; the corollary is that if those don't sell quickly, their prices will likely fall, and the glut of used mining GPUs is likely to keep the new ones from selling very quickly.
NVIDIA's new high-end offerings are only worth it if you absolutely need the best possible performance. Otherwise, IMO, they are priced very high and don't offer better performance per dollar than the previous generation, just higher performance at a higher cost. From articles I've read by people more knowledgeable in this area than I am, AMD's November offerings are likely to be more cost-effective thanks to a different hardware approach (a chiplet-based design) that lets them build chips of similar performance at a significantly lower cost. We'll see whether that pans out in consumer pricing (or whether they pocket the savings), but I would not advise buying an RTX 4080 or 4090 right now.