nVidia Geforce GTX 1080 - thermal throttling?

Yes. It seems that the new 16nm FinFET manufacturing process has the same downside as Intel's 22nm and 14nm CPUs: poor heat dissipation. Even though overall power consumption and heat output are lower than with previous-generation cards, the smaller process concentrates that heat into a smaller die, leaving less surface area to dissipate it from.
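The argument above boils down to power density: what matters to the cooler isn't total watts, but watts per unit of die area. Here's a minimal sketch of that arithmetic — the TDP and die-area numbers are rough, hypothetical figures chosen only to show the trend, not official specs:

```python
# Illustrative power-density comparison. The numbers below are hypothetical,
# picked only to demonstrate why a smaller die can be harder to cool even
# at lower total power.

def power_density(tdp_watts, die_area_mm2):
    """Heat flux the cooler must remove, in W/mm^2."""
    return tdp_watts / die_area_mm2

old_chip = power_density(250, 600)  # a big die on an older process
new_chip = power_density(180, 315)  # a smaller die on a newer process

print(f"old: {old_chip:.2f} W/mm^2, new: {new_chip:.2f} W/mm^2")
# The smaller chip draws less power overall, yet packs more heat into
# each square millimetre - that concentration is the cooling problem.
```

Even though the "new" chip consumes 70W less in this toy example, its heat flux per square millimetre is higher, which is exactly the throttling scenario described above.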


Watch JayzTwoCents' video (like and subscribe!); he explains it really well:


Long story short, the nVidia Geforce GTX 1080 thermal throttles at 83C with the original nVidia cooler, although I wouldn't call it catastrophic. Some laptops with Intel Core M processors tend to overheat, and then the CPU throttles way down to its lowest possible frequency, like 800MHz. That's what I'd call catastrophic.

OK, so the Founders Edition GTX 1080 with the vapor chamber + heat pipe cooler throttles slightly, but other video card and cooler manufacturers managed to keep temperatures (barely) under the limit, so the video card will work perfectly, even when overclocked.

The sad truth is that neither Intel nor nVidia planned ahead to improve cooling, even though the necessary technologies had already been researched.

The simple solution researchers came up with was inserting highly heat-conductive metal nano-rods into the architecture of the chips, so the heat can be dissipated more efficiently.

The slightly more complicated proposed solution was to make nano-tube heat pipes inside the chip, which would be even more efficient than the metal rods, but a lot harder to do.

Personally I'd go with a more barbaric solution, like spreading out the chip architecture over a larger die, to gain surface area. This is not ideal, because the whole industry is trying to make things smaller in order to bring down material costs ... so we'll see what they actually start using in the next generation of smaller chips.


NOTE: It is well known that AMD's 32nm FX processors consume a lot of electricity and produce a lot of heat, but you should also know that they are kind of the last generation of processors with very efficient heat dissipation. The FX die is quite large and the metal heat spreader is soldered on, so if you have a good cooler, you can keep its temperature down at 10-20 degrees above ambient, while comparable 14nm/22nm Intel Core i3/i5 processors always go above 70 degrees C under load.

Personal rig update: adding an ATI Radeon 5550 ... what?!

I have never been a fan of high-end hardware, mostly because it depreciates in value much quicker than low-end to mid-range hardware, and it offers significantly less performance per dollar.
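To make the performance-per-dollar point concrete, here's a tiny sketch with entirely made-up benchmark scores and prices (they're hypothetical, just to illustrate the shape of the trade-off):

```python
# Hypothetical scores and prices - the point is the ratio, not the numbers.

def perf_per_dollar(score, price_usd):
    """Benchmark points bought per dollar spent."""
    return score / price_usd

tiers = {
    "high-end":  (10000, 700),  # (fictional score, fictional USD price)
    "mid-range": (5500, 250),
    "low-end":   (2000, 80),
}

for name, (score, price) in tiers.items():
    print(f"{name:>9}: {perf_per_dollar(score, price):.1f} points/USD")
# In this toy example the mid-range card delivers more points per dollar
# than the flagship, which is the pattern I keep shopping by.
```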

In the past 5-6 years I have only used Celeron and Pentium dual-core processors from Intel, plus Athlon X2s and an E-350 APU from AMD.

Currently my PC includes:
  • mainboard: ASUS H81M-K
  • processor: Intel Celeron G1820 22nm
  • memory: 4GB DDR3-1600MHz
  • SSD: 60GB SATAIII Kingston V300
  • video card: ATI Radeon 5550 512MB DDR3 128bit
  • case: noname ugly white ATX case
  • PSU: RPC 450W ATX, 120mm fan, 72% efficiency
  • OS: Microsoft Windows 10 Professional OEM 64bit
  • monitors: 2 x 17" LCD, 1280 x 1024, VGA ports only
  • keyboards: Logitech K220 Wireless, Serioux Radiant-KBL003
  • mouse: Logitech M150 Wireless
  • speakers: Logitech 2.1 S220
I'll do reviews on each component soon, but today I wanted to say a few things about the "new" video card I added to my PC.

When I bought my Haswell Celeron G1820 machine, I used it with two 17" 1280 x 1024 LCD monitors: one connected to the on-board DVI and the secondary screen to the on-board VGA port. This configuration worked relatively well, although the processing power of the integrated Intel HD Graphics chip wasn't enough to play an HD 720p video on one screen and do something 2D/3D intensive on the other. This was really annoying. But as luck would have it, the DVI port died on my 10-year-old Fujitsu-Siemens B17-2 LCD, so I had to find a cheap solution to connect both monitors to the PC.

My main constraint was money, of course, but keeping my Windows 10 license working was also a very important concern. I had to find a cheap video card with VGA, DVI and HDMI (for future compatibility) that is still supported in Windows 10. As it turns out, the Radeon 5000 family is the oldest one still supported by official drivers (without tweaking).

So my two options were to buy a cheap, scratched monitor (Grade B) with a working DVI port for about 20 USD or a Radeon 5450 video card for about 22 USD.

Q: So why don't I want to buy a new video card?
A: Neither AMD nor nVidia has released low-end video cards in the past few years. The nVidia Geforce GT 710/720 still use the Kepler architecture, while the AMD Radeon R5-230 is even older, with a 40nm GPU.

Luckily my sister didn't throw away her old video card, which was recently declared dead after a long series of blue screens (every 20-30 minutes of gaming). So I got a good deal on it, 12 USD, and I managed to bring it back to life by replacing the thermal paste and cleaning the heatsink (which was clogged with dust). It has 320 stream processors and 512MB of 128-bit DDR3, and consumes under 39W, while it's only about 10% faster than the integrated Intel HD Graphics (Haswell).

The ports on the card allow for many combinations, as there's one of each: VGA, DVI and HDMI. This means that I can connect either two VGA monitors or two DVI monitors, by converting the DVI to VGA or the HDMI to DVI respectively (I could also connect two HDMI screens with a DVI-to-HDMI adapter).


First I tried disabling the iGPU and connecting both monitors to the video card, in order to free up some system memory (the Intel iGPU used 256MB), but I soon realized that it's not the best option: most applications and games worked like before, but some ran even slower.

After a bit of experimenting I re-enabled the iGPU and connected it to my secondary monitor. This way apps and 3D games run well on my main screen with the Radeon 5550, without affecting the 720p video playing on the other screen through the Intel iGPU.

One thing I struggled with after using the PC for a while was the refresh rate. While I was trying out the monitors in different ports, the drivers automatically reset the refresh rate to 60Hz, which in itself isn't a problem, but in my case it made the screens blurry, mostly in the center. Nothing I tweaked in the monitors' menus helped, so I finally looked in the AMD Catalyst Control Center and realized the only possible explanation. I quickly switched both monitors back to 75Hz, pressed AUTO configure on both, and the picture became clear, FINALLY!

I don't know the exact reason behind the blurriness at 60Hz, but little things like this happen with old hardware... and these monitors and this video card are definitely old :)

AMD Radeon M400 GPUs confirmed


We already know that Polaris 10 and 11 GPUs will be shown at Computex 2016, but now we have a clear picture about the specs of all the AMD laptop video cards, which will be launched in the M400 series:
  • Radeon R9-M490X - Polaris 10, 8GB GDDR5-1250MHz 256bit
  • Radeon R9-M490 - Polaris 10, 8GB GDDR5-1250MHz 256bit
  • Radeon R9-M485X - Antigua XT, 8GB GDDR5-1250MHz 256bit
  • Radeon R9-M480X - Polaris 11, 4GB GDDR5-1500MHz 128bit
  • Radeon R9-M480 - Polaris 11, 4GB GDDR5-1500MHz 128bit
  • Radeon R9-M470X - Strato XT, 4GB GDDR5-1500MHz 128bit
  • Radeon R9-M470 - Strato PRO, 4GB GDDR5-1500MHz 128bit
  • Radeon R7-M465X - Tropo XT, 4GB GDDR5-1125MHz 128bit
  • Radeon R7-M465 - Litho XT, 4GB GDDR5-1150MHz 128bit
  • Radeon R7-M460 - Litho XT, 4GB DDR3-1000MHz 64bit
  • Radeon R7-M445 - Meso XT, 4GB GDDR5-1000MHz 64bit
  • Radeon R7-M440 - Meso XT, 4GB DDR3-1000MHz 64bit
  • Radeon R5-M435 - Meso XT, 4GB GDDR5-1000MHz 64bit
  • Radeon R5-M430 - Exo XT, 4GB DDR3-1000MHz 64bit
  • Radeon R5-M420 - Jet XT, 4GB DDR3-1000MHz 64bit
Surprising or not, only 4 video cards will be launched with the new architecture, which means that only the highest quality chips will be used. This is an absolutely normal procedure for CPUs and GPUs, but the rebranding of so many old GPUs is a bit worrying, as it could mean that the new Polaris architectures are not modular enough to cut out/disable defective parts and sell the chips as inferior R7 or R5 Radeons.

On the desktop side nVidia also had some late launches in the Geforce 700 generation, as the GT 740/730/720/710 were launched (in this order) months and years after the high-end cards with the same 28nm Kepler architecture. To make things more confusing, nVidia also made the GTX 750/750Ti/745 with the much newer Maxwell chips. Long story short, it is not uncommon to have three or more different chip generations/architectures in one product family.

The exciting part of this news is that laptop video cards will be launched first, so whatever benchmark scores we'll see at Computex, the desktop video cards will surely best them by a lot, as they'll have more power available and better cooling.

source: wccftech

AMD is launching Polaris 10 June 1st @Computex

Whatever the real development and manufacturing situation may be, AMD needs to launch its new video cards soon, as nVidia has already shown its new high-end video cards. The GTX 1080 cards are already in reviewers' hands, but the NDA is expected to be lifted only on the 17th of this month.


AMD Polaris 10 is the faster GPU, promising Radeon R9-390/390X level performance while consuming about half the power, with a 299 USD price tag.

AMD Polaris 11 is the smaller GPU, designed for laptops, but it will most likely appear on low-end to mid-range desktop video cards. On the desktop side there hasn't been a new low-end video card since the AMD Radeon R7-240 and nVidia Geforce GT 730, as both manufacturers recycled old chips for the cheaper models: AMD Radeon R5-230 and nVidia Geforce GT 710/720.

AMD Vega 10 was previously announced for a 2017 launch, but rumors are circulating about problems with high-end configurations of Polaris 10, so they may decide to launch Vega 10 early, before the end of this year.


In this slide AMD detailed the target of Polaris 10. For this (internal testing) benchmark they used a Core i7-4790K system with 4x4GB DDR4-2600MHz (this may be a typo, as officially LGA1150 Core i7 processors only support DDR3) to compare a GTX 950 card with a (mid-range?) Polaris card.


In laptops AMD hopes to achieve "console caliber" performance, which will allow some gaming, video editing and CAD apps on a relatively unimpressive laptop configuration and at a low price.

The FP4/AM4 Bristol Ridge platform will also be launched at Computex, which is really needed to get fans interested in AMD hardware again. Unfortunately this platform will not address enthusiasts, but rather mainstream users, who may be satisfied with a cheap APU or an even cheaper dual-core/quad-core processor. The Bristol Ridge family will bring DDR4 support and the AM4 mainboards, which will be used by Zen processors too (probably after a simple BIOS update).


This is the leaked screen capture, which revealed most of the new information.
This article was mostly based on this WCCFtech post.

AMD Zen, Polaris, FreeSync page, SSD Radeon R3 ...

AMD recently agreed to make (server?) processors for China, most likely based on the Zen architecture (or ARM) rather than on the outdated Bulldozer stuff. It's estimated that AMD will receive around a 300 million USD cash infusion as a result of this deal.


Finally we have a clear picture about the upcoming AMD processors. AMD Zen is on its way, as 8-core chips are being prepared for mass production. Apparently only 8-core/16-thread processors will be made on 14nm technology this year, some of which will have one (defective) module locked in order to obtain cheaper 6-core CPUs. Yes, the first Zen chips will only be CPUs, without integrated graphics.

At launch, the AM4 platform will not have dual core and quad core Zen processors - instead 28nm Bristol Ridge APUs will address the low-end to mid-range market. Some of these APUs will most likely come with disabled/defective integrated graphics, in order to keep the low-end gaming market, held today by AMD Athlon X4 FM2+ processors, which are significantly cheaper than the APUs (and Intel CPUs) with the same performance.


The web page for FreeSync has been updated. Thanks to the latest changes, users can choose and match FreeSync enabled video cards and APUs with compatible FreeSync monitors. This compatibility factor is mostly about frame rate, as not all GPUs have the horsepower for high frame rates and not all FreeSync monitors do the sync at both low and high frame rates.
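The matching logic described above can be sketched in a few lines: a FreeSync monitor only syncs inside its supported refresh window, so the GPU's typical frame rate needs to land inside it. The function name and the 40-144Hz window below are hypothetical examples, not anything from AMD's page:

```python
# Sketch of FreeSync GPU/monitor matching. The monitor range and frame
# rates are hypothetical, chosen only to illustrate the idea.

def freesync_usable(fps, range_min_hz, range_max_hz):
    """True if the frame rate falls inside the monitor's FreeSync window."""
    return range_min_hz <= fps <= range_max_hz

# A hypothetical 40-144Hz FreeSync monitor:
print(freesync_usable(60, 40, 144))  # mid-range GPU frame rate -> True
print(freesync_usable(30, 40, 144))  # too slow -> False, sync drops out
```

This is why the pairing tool matters: a weak GPU that dips below the monitor's minimum refresh rate falls out of the sync window, and the FreeSync benefit disappears exactly when you'd need it most.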

In the beginning FreeSync was only available over the DisplayPort interface, but now AMD, in collaboration with Acer, LG and Samsung, is expanding compatibility to HDMI too, because it's the most common (and practical) interface.


nVidia recently launched the Geforce GTX 1080, which will be faster than the Titan X and will cost only 699 USD for the Founders Edition (a.k.a. the reference card), or 599 USD MSRP for cards made by other manufacturers (ASUS, Gigabyte, EVGA, etc.) with different coolers.

It's unclear at this time what the intention behind this price difference is, but we'll know for sure after the NDA is lifted later this month and reviewers publish their test results.

So ... AMD is in a funny situation, because nVidia just spent around 2 billion USD on the development of the GTX 1080, which doesn't even use HBM or HBM2, just 8GB of GDDR5X. There's no information about how much AMD is spending on research and development, but the 14nm AMD Polaris 10/11 GPUs will be launched later this month, in the "efficient + VR Ready" class (R9 290/390 level performance) at a low price point. However, these won't come close in performance to what nVidia (supposedly) made with GDDR5X.

It is rumored that AMD's response to these really fast nVidia cards will be the Vega generation with HBM2, launched much earlier than expected - before the end of the year. It'll be interesting to see reinvigorated competition between these two, after a relatively boring year for PC hardware.


In other news, AMD launched a more affordable line of AMD Radeon R3 desktop/laptop SSDs. They are already in stores for about the same price as Kingston SSDs (at first glance). Write speed on the 120GB model is 360MB/s, while the 240GB and 480GB models write at up to 470MB/s. Read speed is 520MB/s for all three models, but we can expect slight differences in benchmarks.
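For some back-of-the-envelope context on those rated speeds, here's what they mean in practice: the time to sequentially fill each model at its maximum rated write speed (an ideal sustained figure; real-world results will be slower):

```python
# Rough fill times for the Radeon R3 models at their rated sequential
# write speeds (from the specs above). Ideal, sustained-speed estimates.

models = {
    # capacity in GB: rated write speed in MB/s
    120: 360,
    240: 470,
    480: 470,
}

for capacity_gb, write_mbs in models.items():
    seconds = capacity_gb * 1000 / write_mbs  # treating 1 GB as 1000 MB
    print(f"{capacity_gb}GB model: ~{seconds / 60:.1f} minutes to fill")
```

Note that the 480GB model takes twice as long as the 240GB one despite the same rated speed, simply because there's twice as much to write.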

Embedded Skylake "R" with powerful integrated graphics

The first Intel processors with powerful iGPUs and on-package eDRAM were the Intel Core i5-4570R, Core i5-4670R and Core i7-4770R, all from the Haswell generation. Needless to say, they were all made for a BGA socket (FCBGA1364), which means the CPU is soldered onto the mainboard.

After Haswell, Intel released the Broadwell desktop processors, which came very late, but were designed to fit in socket LGA1150 (also used by Haswell), although they only work with the newest LGA1150 chipset, the Intel Z97.

Congatec CS170
Today Intel launched three new "R" series Skylake desktop processors, manufactured on 14nm technology and featuring Intel Iris Pro Graphics 580 (72 execution units) with 128MB of eDRAM:

Intel Core i7-6785R - 4 cores / 8 threads, 3.3-3.9GHz, 8MB L3 cache, DDR4-2133/DDR3L-1866 - 370 USD
Intel Core i5-6685R - 4 cores / 4 threads, 3.2-3.8GHz, 6MB L3 cache, DDR4-1866/DDR3L-1333 - 288 USD
Intel Core i5-6585R - 4 cores / 4 threads, 2.8-3.6GHz, 6MB L3 cache, DDR4-1866/DDR3L-1333 - 255 USD

Unfortunately these processors are also BGA-exclusive: they'll come soldered onto motherboards. This may not be such a bad thing, as PC enthusiasts will most likely skip this category anyway; they always use fast video cards, so they won't be satisfied with any iGPU solution.

Having a max TDP of 65W, these processors are perfect for small HTPCs, NUC-type mini PCs and AIO desktops.
Prices are relatively acceptable, but only if you're taking advantage of the powerful iGPU. Otherwise you're much better off with a standard upgradeable Haswell (LGA1150) or Skylake (LGA1151) desktop system, which also have iGPUs with respectable performance.

Personally I would also build laptops with these chips, in order to compete with future Zen platforms. One 65W APU is still better than a weak Intel mobile CPU and a separate AMD/nVidia GPU...

Silverstone Nitrogon NT08-115X

Intel should have gone back to the drawing board after seeing AMD's new Wraith stock cooler and the smaller (but just as quiet) FM2+ cooler. Instead, it's up to other manufacturers to address the problem. Silverstone just launched the Nitrogon NT08-115X LGA115X cooler, which promises exceptional cooling for standard and overclocked processors alike, without exceeding the physical limits of low-profile computer cases. It measures just 48mm with the fan.


The heatsink itself is more or less the same as Intel's stock heatsink for unlocked processors, but it has a much better mounting mechanism with a backplate.


OK. So it seems that the SilverStone Nitrogon NT08-115X has it all, including a copper core for better heat transfer, but unfortunately the design fails to address a very important issue: NOISE! The relatively large 92mm PWM fan can produce up to 47CFM of airflow, but while doing so it gets as loud as 62.88dBA. That's unacceptable in most situations, as small form factor PCs are usually expected to be silent too.

Personally I consider the Silverstone Nitrogon NT08-115X cooler a failure, but if it's cheap enough, you can replace its fan with any silent model from any manufacturer, as the cooler has standard fan mounting holes. Mounting the fan on silicone feet is also something you should consider, even if the fan you've chosen already has good vibration isolation (e.g. Arctic Cooling fans).


One final thought about the Silverstone Nitrogon NT08-115X cooler ...


The Intel stock cooler draws cool air from the top and some from the side, but part of that air escapes around the sides of the fan without going through the heatsink. Warm air also escapes on all sides of the heatsink without passing through its full height, which reduces cooling efficiency.

Silverstone's Nitrogon NT08-115X cooler has a "closed fan", meaning that air can't escape on the sides. The cooler also has a plastic cover, a continuation of the fan's body, which makes sure the air passes -most of the way- through the heatsink, taking away more heat than the Intel stock cooler.