Gizmo - http://gizmo.web.id has posted a new item, 'AMD Radeon HD 6670'
You don't always have to spend hundreds of dollars to get good graphics, and
the AMD Radeon HD 6670 is an excellent example of getting more than your money's
worth.
The HD 6670 is a very capable graphics card, especially if you don't need to run
every game or program on ultra or high settings. It is low profile, taking up
only one PCIe slot, but the heat sink fan can be a little big, blocking
additional slots. The card itself is barely over 8 inches long, shorter than
most higher end cards, making this a great option for smaller PC towers. The HD
6670 is capable of running DirectX 11 as well as OpenGL 4.1 and OpenCL 1.1. This
lets you use and play demanding software and games without having to worry about
your card's performance stalling.
To get the best performance out of the HD 6670, set your display settings to
medium and tweak individual graphic settings in your game to see what it can
handle. While you can consistently get strong graphics with good frame rates at
medium settings, the HD 6670 doesn't hold up as well at high and ultra
settings at resolutions of 1280 x 1024 and higher.
A very positive feature is AMD's Eyefinity multi-display technology, which
lets you connect up to four monitors to a single card. Eyefinity allows you
to use multiple screens as if they were one, much like a television display at
your local electronics store. This gives you a type of peripheral vision and
enhances your game enjoyment considerably.
The greatest strength of the HD 6670 is its low power consumption. The HD 6670's
thermal design power, or TDP, is rated at 66 watts, and it draws about 12 watts
when idle. This is very low compared to other cards, and you will not have to
connect the HD 6670 to your power supply because it draws all the power it needs
from the PCIe slot. Unlike other entry-level budget graphics cards, you can
overclock the Radeon HD 6670, but the card quickly heats up with the added
stress. We caution you to be careful when overclocking any processor, as it can
be risky and possibly burn out your video card or cause damage to your system.
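To put those wattage figures in perspective, here is a quick back-of-the-envelope calculation of what the card costs to run. The 66-watt load and 12-watt idle figures come from the review above; the daily hours of use and the $0.12-per-kWh electricity price are hypothetical assumptions for illustration.

```python
# Rough sketch of the HD 6670's yearly electricity cost.
# 66 W load and 12 W idle are from the review; the usage hours
# and the $0.12/kWh rate below are hypothetical assumptions.

def annual_cost(watts, hours_per_day, price_per_kwh=0.12):
    """Estimate yearly electricity cost in dollars for a given power draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

gaming = annual_cost(66, hours_per_day=4)   # card running at full TDP
idle = annual_cost(12, hours_per_day=20)    # card sitting idle the rest of the day
print(f"Gaming: ${gaming:.2f}/yr, idle: ${idle:.2f}/yr")
```

Under these assumptions, even heavy daily use adds only on the order of twenty dollars a year to a power bill, which is the practical upside of such a low TDP.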