In the past 5-6 years I have only used dual-core Celeron and Pentium processors from Intel, and Athlon X2s and an E-350 APU from AMD.
Currently my PC includes:
- mainboard: ASUS H81M-K
- processor: Intel Celeron G1820 22nm
- memory: 4GB DDR3-1600MHz
- SSD: 60GB SATAIII Kingston V300
- video card: ATI Radeon HD 5550, 512MB DDR3, 128-bit
- case: noname ugly white ATX case
- PSU: RPC 450W ATX, 120mm fan, 72% efficiency
- OS: Microsoft Windows 10 Professional OEM 64bit
- monitors: 2 x 17" LCD, 1280 x 1024, VGA ports only
- keyboards: Logitech K220 Wireless, Serioux Radiant-KBL003
- mouse: Logitech M150 Wireless
- speakers: Logitech 2.1 S220
I'll do reviews on each component soon, but today I wanted to say a few things about the "new" video card I added to my PC.
When I bought my Haswell Celeron G1820 machine, I used it with two 17" 1280 x 1024 LCD monitors: one connected to the on-board DVI port and the secondary screen to the on-board VGA port. This configuration worked relatively well, although the processing power of the integrated Intel HD Graphics wasn't enough to play an HD 720p video on one screen and do something 2D/3D intensive on the other. This was really annoying. Then, as bad luck would have it, the DVI port died on my 10-year-old Fujitsu-Siemens B17-2 LCD, so I had to find a cheap way to connect both monitors to the PC.
My main constraint was money, of course, but keeping my Windows 10 license usable was also a very important concern: I had to find a cheap video card with VGA, DVI and HDMI (for future compatibility) that is still supported in Windows 10. As it turns out, the Radeon 5000 family is the oldest generation still supported by official AMD drivers (no tweaking required).
So my two options were to buy a cheap, scratched monitor (Grade B) with a working DVI port for about 20 USD or a Radeon 5450 video card for about 22 USD.
Q: So why didn't I just buy a new video card?
A: Neither AMD nor nVidia has released a genuinely new low-end video card in the past few years: the nVidia GeForce GT 710/720 still use the old Kepler architecture, while the AMD Radeon R5 230 is even older, built around a 40nm GPU.
Luckily my sister didn't throw away her old video card, which had recently been declared dead after a long series of blue screens (one every 20-30 minutes of gaming). I got a good deal on it, 12 USD, and managed to bring it back to life by replacing the thermal paste and cleaning the heatsink (which was clogged with dust). It has 320 stream processors and 512MB of 128-bit DDR3, consumes under 39W, and is only about 10% faster than the integrated Intel HD Graphics (Haswell).
The ports on the card allow for many combinations, as there's one of each: VGA, DVI and HDMI. This means I can connect either two VGA monitors or two DVI monitors, by converting DVI to VGA or HDMI to DVI respectively (I could also connect two HDMI screens with a DVI to HDMI adapter).
First I tried disabling the iGPU and connecting both monitors to the video card, in order to free up some system memory (the Intel iGPU reserved 256MB of it), but I soon realized this wasn't the best option: most applications and games worked like before, but some actually ran slower.
After a bit of experimenting I re-enabled the iGPU and connected it to my secondary monitor. This way apps and 3D games run well on my main screen with the Radeon 5550, without affecting the 720p video playing on the other screen through the Intel iGPU.
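For anyone who wants to double-check which adapters Windows actually sees in a setup like this (without opening Device Manager), a quick Python sketch along these lines should do it. It only shells out to the built-in wmic tool, so the assumptions are simply that you're on Windows and have Python 3 installed:

```python
# Minimal sketch: list the video adapters Windows currently sees,
# by shelling out to the built-in wmic tool (Windows only).
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name,AdapterRAM"],
    text=True,
)

# wmic pads its output with blank lines and extra carriage returns,
# so strip each line and skip the empty ones.
for line in output.splitlines():
    line = line.strip()
    if line:
        print(line)
```

With the iGPU re-enabled, both the Intel HD Graphics and the Radeon should show up in the list; with the iGPU disabled in the BIOS, only the Radeon will.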
One thing I struggled with after using the PC for a while was the refresh rate. While I was trying the monitors in different ports, the drivers automatically reset the refresh rate to 60 Hz, which in itself isn't a problem, but in my case it made the screens blurry, mostly in the center. No matter what I tweaked in the monitors' menus, nothing helped, so I finally looked in the AMD Catalyst Control Center and realized the refresh rate had to be the culprit. I quickly switched both monitors back to 75 Hz, pressed AUTO adjust on both, and the picture finally became clear!
I don't know the exact reason for the blurriness at 60 Hz, but little things like this happen with old hardware... and these monitors and this video card are definitely old :)
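If you'd rather check the refresh rates from a script than dig through monitor menus, a small Python sketch using the pywin32 package (pip install pywin32) should print what each active display is currently running at; treat it as a rough starting point rather than a polished tool:

```python
# Minimal sketch: print the resolution and refresh rate of every display
# attached to the desktop, using pywin32 (Windows only).
import win32api
import win32con


def active_displays():
    """Yield the display devices currently attached to the desktop."""
    i = 0
    while True:
        try:
            device = win32api.EnumDisplayDevices(None, i)
        except win32api.error:
            break  # assuming enumeration past the last device raises
        if device.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            yield device
        i += 1


for device in active_displays():
    mode = win32api.EnumDisplaySettings(device.DeviceName,
                                        win32con.ENUM_CURRENT_SETTINGS)
    print(f"{device.DeviceString}: {mode.PelsWidth}x{mode.PelsHeight} "
          f"@ {mode.DisplayFrequency} Hz")
```

In principle the same DEVMODE object can also be written back (set DisplayFrequency, set Fields to win32con.DM_DISPLAYFREQUENCY, then call win32api.ChangeDisplaySettingsEx), but the actual switch back to 75 Hz I did from the Catalyst Control Center.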