Watch JayzTwoCents' video (like and subscribe!); he explains it really well:
Long story short, the NVIDIA GeForce GTX 1080 starts thermal throttling at 83 °C with the original NVIDIA cooler, although I wouldn't rate that as catastrophic. Some laptops with Intel Core M processors tend to overheat, and then the CPU throttles way down to its lowest possible frequency, like 800 MHz. That's what I'd call catastrophic.
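To make the difference concrete, here's a toy sketch of how a thermal throttle typically behaves: full clock below the trip point, then the clock steps down until it hits a floor. The real firmware logic is proprietary; the function name, the 50 MHz-per-degree slope, and the clock values here are illustrative assumptions (the base clock and 83 °C limit match the GTX 1080 reference card, the 800 MHz floor echoes the Core M example).

```python
# Toy model of thermal throttling (illustrative only; real firmware
# logic is proprietary and far more nuanced).

THROTTLE_TEMP_C = 83.0   # GTX 1080 reference-cooler throttle point
BASE_CLOCK_MHZ = 1607.0  # GTX 1080 base clock
MIN_CLOCK_MHZ = 800.0    # the "catastrophic" floor, like the Core M example

def throttled_clock(temp_c):
    """Return a clock speed for the given die temperature.

    Below the throttle point the chip runs at full clock; above it,
    the clock is stepped down linearly until the minimum is reached.
    """
    if temp_c <= THROTTLE_TEMP_C:
        return BASE_CLOCK_MHZ
    # Drop ~50 MHz per degree over the limit (made-up slope).
    reduced = BASE_CLOCK_MHZ - 50.0 * (temp_c - THROTTLE_TEMP_C)
    return max(reduced, MIN_CLOCK_MHZ)

print(throttled_clock(70))   # cool: full clock, 1607.0
print(throttled_clock(85))   # slightly over the limit: mild throttle, 1507.0
print(throttled_clock(110))  # way over: pinned at the floor, 800.0
```

The slight throttle at 85 °C is the GTX 1080 case; the clock pinned at the floor is the Core M case.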
OK, so the Founders Edition GTX 1080 with the vapor chamber + heat pipe cooler throttles slightly, but other video card and cooler manufacturers managed to keep temperatures (barely) under the limit, so the video card will work perfectly, even when overclocked.
The sad truth is that neither Intel nor NVIDIA planned ahead to improve cooling, even though the necessary technologies had already been thought through.
The simple solution researchers came up with was inserting (highly) heat-conductive metal nano-rods into the architecture of the chips, so the heat can be dissipated more efficiently.
The slightly more complicated proposed solution was to build nano-tube heat pipes inside the chip, which would be even more efficient than the metal rods, but a lot harder to manufacture.
Personally, I'd go with a more barbaric solution, like spreading the chip architecture out on a bigger wafer to get a larger surface area. This is not ideal, because the whole industry is trying to make things smaller in order to bring down material costs ... so we'll see what they'll actually start using in the next generation of smaller chips.
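The intuition behind the bigger-surface idea can be sketched with a crude lumped thermal model: treat the die-to-cooler path as a thermal resistance inversely proportional to die area, so doubling the area halves the temperature rise. The function name, the coefficient, and the die areas below are all made-up illustrative values, not measurements of any real chip.

```python
# Back-of-the-envelope: why a larger die surface helps cooling.
# Crude lumped model: delta_T = P * (k / A). The coefficient k is
# invented for illustration, not measured on real hardware.

K = 60.0  # made-up thermal coefficient, (°C * mm^2) / W

def temp_rise_c(power_w, die_area_mm2, k=K):
    """Temperature rise over ambient for a given power and die area.

    Thermal resistance scales as k / A, so doubling the die area
    halves the rise over ambient.
    """
    return power_w * k / die_area_mm2

# Same 100 W chip on a small die vs. one with twice the area:
print(temp_rise_c(100, 150))  # small die: 40.0 °C over ambient
print(temp_rise_c(100, 300))  # double the area: 20.0 °C over ambient
```

It's the same reason the large, soldered FX dies mentioned below stay closer to ambient than small 14 nm dies pushing similar power.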
NOTE: It is well known that AMD's 32 nm FX processors consume a lot of electricity and produce a lot of heat, but you should also know that they are kinda the last generation of processors with very efficient heat dissipation. The FX die is quite large and the metal cap is soldered on, so if you have a good cooler, you can keep its temperature down at 10-20 °C above ambient, while comparable 14 nm/22 nm Intel Core i3/i5 processors always go above 70 °C under load.