Okay, let's get this out of the way before we get started. Yes, I work for Sapphire, and they make AMD based video cards. Yes, my final build uses a Sapphire video card. Yes, I know someone will cry that I am not able to be fair in my evaluation of graphics cards; my advice is get over it. Over the years I have been accused of being a fanboy for both sides. I have always striven to give our audience the best information I have at hand and will not stop now. So now that we have that full disclosure out of the way, can we get on with a discussion?
Choosing a graphics card today is tough, and the reason why is that both nVidia and AMD have given us some GREAT cards to choose from. Okay, let me rephrase that: both have given us great chips. Neither company actually makes the cards; that is up to the various partners. If you read or listen to our past shows, you know that I have always recommended that if you want an AMD card you choose Sapphire, and if you want an nVidia card then EVGA is your first choice. That opinion was formed years ago after dealing with many video cards from many vendors, and it has not changed today. We have a lot of options for our video needs, and I am going to try to help you narrow those choices down.
For the purposes of our build we are setting our sights on gameplay at 1080 resolution with detail levels in the high to ultra range. With this in mind we can quickly narrow down our choices and walk through them from the lowest cost to the most expensive.
The lowest cost card I would look at is the HD 7790, a relatively new card on the market. Priced at around $150, this card can do a solid job of gaming at 1080 with most games set on high. A few of the more intense games might require the detail level to be lowered, but they are the exception. A little further up the food chain we have the GTX 650Ti Boost. This card offers a nice step up in performance from the 7790 with only a mild cost increase, so if you want a little extra oomph, it is worth a look.
Next we come to the HD 7850, our 2012 Golden Mic winner. If you thought it was an attractive buy when we named it for our award, now it is practically a steal. At under $200 it offers a great gaming experience for the money, and many games will run at 1080 with the detail set to ultra; about an even mix will need to back off to high. For an ITX build the card is small, and the Sapphire model we gave the award to cools amazingly well and is super silent. A perfect fit for an ITX build. Up the rung, around the $200 mark, is the GTX 660. Again, as with the last rung, this card gives a bump in performance over the 7850, but that bump comes with a price. From a size point of view the GTX 660 is a little larger physically, but it still easily fits in our ITX build.
With the next card up, things begin to spread out a bit. Priced at around $220, the HD 7870 gives a nice bump in performance over the GTX 660. Only a tiny bit longer than the HD 7850, this card is a great size for the ITX build and packs a lot of gaming horsepower, chugging through most games at ultra detail with a few needing to step back to high.
The final rung we hit is the top of the cards we will talk about. The reason for this is simple: at the 7870 we have pretty much made every game we play smooth and full of great detail; anything after this is gravy.
Our first entry at the top of our ITX build list is the GTX 660Ti. This card packs a lot of horsepower into a package costing around $280. It ran any game we threw at it at ultra levels and did so with seeming ease. The top dog card for this build is the HD 7950, tipping the price scale at around $290. The 7950 is physically bigger; it's actually a little longer than the 660Ti, but it still fits well in our ITX build. The 7950 brings more horsepower than the 660Ti as well as a larger memory buffer, 3GB instead of 2GB, along with more memory bandwidth. Now at 1080 this is not a big deal, as 2GB is plenty and both cards have enough horsepower to push the pixels around with ease. The advantage of the 7950 comes in when you push a ton of HD mods on a game like Skyrim, or decide to go for a 1440 display or bigger instead of 1080.
In this look we have covered a range of $150 and looked at 7 video cards. This is what I meant when I said choosing a card was tough. All 7 cards mentioned will give a great gaming experience, and each step up gives a little more detail to your games and horsepower for the future. The choice you make is based on balancing your budget against the gaming experience you desire.
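To make the ladder easier to see at a glance, here is a minimal sketch of the choice as code. The prices are the approximate figures quoted in this article (and will have changed since); the 650Ti Boost figure is an assumption based on the "mild cost increase" noted above, not a number from the article.

```python
# The seven-card ladder from this article, cheapest to most expensive.
# Prices are approximate, as of the article's date; the 650Ti Boost
# price is an assumed fill-in, not quoted in the text.
CARD_LADDER = [
    ("HD 7790",         150),
    ("GTX 650Ti Boost", 170),  # assumed figure
    ("HD 7850",         190),
    ("GTX 660",         200),
    ("HD 7870",         220),
    ("GTX 660Ti",       280),
    ("HD 7950",         290),
]

def best_card_for(budget):
    """Return the most capable card on the ladder that fits the budget,
    or None if even the cheapest rung is out of reach."""
    affordable = [entry for entry in CARD_LADDER if entry[1] <= budget]
    return max(affordable, key=lambda entry: entry[1]) if affordable else None

print(best_card_for(230))  # -> ('HD 7870', 220)
```

Since each rung up the ladder adds performance, "most expensive card within budget" is a fair stand-in for "best card within budget" here.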
The good news is that ANY of the cards mentioned here will give you a great experience, but there can be only one, as the saying goes. Based on the cost, the performance, and our stated goal of 1080, the HD 7870 is the best bang for our buck in this build. It will give you amazing gameplay while being reasonably priced.
Be sure to join us on March 27th at Rend Lake College. Doug and I will be there to talk to everyone, give some stuff away, and show off the ITX rig we have been talking about for the last few weeks. Come on by and see it in action firsthand.
We would like to thank the folks at EVGA and SAPPHIRE for providing the card samples we used in our testing. All discussion of card pricing was based on EVGA and SAPPHIRE cards on Newegg at the date of this article.
Show segments from show airing the weekend of April 20th, 2013
We have had the discussion on the show more than once about gaming laptops, and we have even had a few listener requests to talk about them. So when Dell and Alienware offered us the chance to take a high end gaming laptop for a test spin, we jumped at the chance. Equipped with an Intel i7 mobile processor and an nVidia GTX 680M for video, this brute packs a punch and is at the upper end of laptop performance.
The outside of the laptop is covered in a rubberized coating that is super durable and non-skid, and after over a month of testing it does not show wear easily. This coating gives the laptop a kind of stealth fighter look. From the outside you also become aware of the size of this brute. While this might technically be a laptop, it is hard to imagine that word fitting this product accurately. Tipping the scales at almost 13 pounds, this brute is less portable and more luggable.
Opening the laptop we are treated to a large 17” display that has arguably the best image quality of any laptop screen I have ever seen. In fact it looks nicer than many desktop monitors I have seen over the years. The cell phone pictures I took yesterday for this article do not do the display justice. With a 1080 resolution, this screen demands movies and high end gaming. Despite what the flash did to the picture, this display is actually really good about not having glare. In fact the bezel around the display, a glossy black, causes more glare than the screen.
The keyboard is fully backlit and is actually divided into three zones that can be individually lit. In fact all of the lighting can be modified thanks to Alienware's software package, which gives you complete customization of the bling on this beast. The lighting goes even further, with the software sensing a number of games and changing the lighting automatically when events within the games occur.
Once you get past the bling, the keyboard is a very nice membrane design with a real effort made to give the membrane some quality in its response. The result is a serious cut above the typical laptop keyboard and equal to many quality desktop keyboards. It has a full number pad, and while it still uses the chiclet square key design that is typical for laptops, it is large and open enough to make this a non-issue.
The touchpad has a rubberized feel to it and is not slick but not sticky at the same time, if that makes sense. It really must be used to understand what I mean. While still not as nice as a mouse for moving around the screen it is very serviceable for day to day use.
Alienware did not put all the bling into the looks of this laptop, however. For sound they put in a SoundBlaster Recon3D and let that power a pair of speakers produced by Klipsch. The result is the best sound I have ever heard from a laptop. Other companies have tried various combinations of sound hardware and speakers, and used some big names to do it, but they sound like crap compared to this setup. However, even the mighty folks at Klipsch cannot produce great speakers this small and stay reasonably priced. So while the sound is really good, the audiophile in us wants more. To accommodate us they put two headphone jacks on the side for sharing the sound; we put them to use with our Draco Signature headphones and were suitably impressed. The sound was deep and rich in music and movie playback.
When it comes to gaming machines, one of the things I love is tinkering with them. Adding more RAM, more HD space, or even an SSD is something we all think about. Alienware made sure there was an upgrade path for you, and it is easy to reach. The back panel uses a slide-on system like a computer case side panel and is held in place by two screws under the battery. Pop those screws and the back slides off to reveal the system RAM, two HD cages (only one occupied in our system), and the serious effort Alienware has made to cool this beast.
Dual fans with heat pipes provide the cooling for the CPU and GPU, and they do an awesome job. Even in a room at 80°F the system never showed signs of overheating, despite being pushed hard and despite some pretty hot hardware. This system is too big to ever really consider gaming on your lap, so heat on the bottom of the system is not a big deal, but we tested anyway and found that the bottom of this system under load was cooler than that of the other two laptops I tested. Now those were simpler and more generic builds, but they also used less powerful hardware that should generate less heat. This shows me that the cooling solution the people at Alienware came up with works well.
Okay, so this computer has great hardware and a lot of bling, but can it game? That has a mixed answer. We threw World of Tanks, Star Trek Online, Borderlands 2, and Civilization V at it to see how it fared under gaming load. All of the games played smooth and at a high detail level at 1080 resolution. Sounds good, right? Well, yes and no. While it provided a great gaming experience, it also highlighted the “laptop tax” that I have referred to in the past. Priced at $2500 for the build we looked at, the system was actually slower in frame rates than a $1200 DIY gamer build.
What does this all mean? Well, it means what I have been saying on the show for years is true: no laptop will ever equal a desktop system for gaming at the same price point. However, that is just common sense. A lot of the price of a laptop goes to the special components needed to fit the smaller design, and you pay for the portability.
When I shared my views with the folks at Alienware I was surprised, pleasantly so, to find they agreed with me. They agree that this kind of laptop is a pure niche market product. Now, if you have a job that keeps you on the road and in hotels all the time, say a trucker, an insurance adjuster, even a traveling scrub nurse (okay, I think you get the idea), and you are a gamer, then this type of portability is priceless. It means you get to enjoy your gaming with a great experience and still have it with you every night after work.
For the everyday consumer, however, this is not a product that really fits well. As a laptop it is great but has a short battery life; even just browsing the internet we only got about 2 hours, and if you are gaming, forget it, maybe 30 minutes. It is heavier than other laptops we have looked at, so much so that it actually requires a lot of shoulder shifting to carry in the bag when walking around.
However, if you are in the group of people I mentioned, that travel all the time and want to game, this beast stands out from the crowd and begs to be played on. The raw horsepower makes it more powerful than most cookie cutter systems sold today, so this could be more powerful than many people's home PCs. The horsepower, along with the fact that real gaming grade components are used, means this will play your games and deliver a great gaming experience. Add the bling factor that will make you the envy of a LAN party or be a great talking point with friends, and you have something that can fit your mobile lifestyle and looks cool doing it.
While there are other gaming laptops on the market that come in at lower prices, I think I would go to Alienware first. This design is zero compromise, with every aspect of the system well thought out and executed. The system is not just great hardware; it is well put together in an effort to maximize the geek and gaming experience, and it delivers in spades.
Show segment as aired on our weekend edition the weekend of 17 November 2012
Well, it has taken most of the year, but nVidia has finally given us the complete Kepler lineup of GPUs with the release of the GTX 650Ti. Our look at the GTX 650Ti is of a reference card provided by nVidia; this is the baseline that various companies will work from to release their models of this video card. The base design comes with a very simple cooler, 1 gig of memory, and 2x DVI and an HDMI connection. This is a basic budget gaming card; the target price for this series is around $140. This places it squarely between the GTX 650 and the GTX 660 cards.
The original target for this card was direct competition with the HD 7770, but recent price drops have put that card at a slightly lower price. Further price cuts have moved the 7850 to within striking distance of that price point and have changed the dynamic of this card's launch.
The $140 price point places the GTX 650Ti squarely in the mainstream card segment and at the upper end of the budget gaming card range. This is shown in the breakdown of the nVidia GTX 600 series cards. In our review of the GTX 650 we referred to that card as a gateway gaming card. The GTX 650Ti is a step up in gaming performance aimed at people wanting to take their gaming a bit further, or at the budget gaming DIY build.
In the briefing we got from nVidia, the 650Ti was referred to as the 650 turbocharged, and the naming may also give that impression. That impression, however, is not correct; the 650Ti is built on the same chip as the 660 and then toned down a bit. The result is a pretty solid chip for the price point.
For the purposes of our testing we put this board in our test rig with an Intel i5 3450 and 8 gigs of Kingston HyperX. This represents a solid build for a lower cost, mainstream gaming rig; a likely target for someone looking at this card. For our testing we used some of today's popular games, such as Borderlands 2, Skyrim, Star Trek Online, World of Warcraft, Mechwarrior Online, and Civilization V. All tests were run with the goal of achieving great playback with the most detail we could add. From a frame rate point of view we shot for 60 FPS with the settings as high as we could get. We tested at resolutions of 1920×1080, 1600×900, and 1280×720. nVidia made clear the goal of this card was to attain 1080 resolution with medium to high settings and some AA enabled.
The performance difference over the GTX 650 was obvious from the start. With the straight 650 we had been forced to lower settings quite a bit to get good playback at 1080; with the 650Ti the settings were able to give much more detail and still be very playable. In fact, averaged across all our testing, we saw a nice frame rate boost of about 75% over the lower priced 650.
When we compared against the competition's 7770, we again saw the 650Ti offering a nice boost in performance. However, there was little difference in the actual gaming experience, that is, until we started turning on features. When a game makes use of nVidia PhysX, the 650Ti steps up and the game experience is improved over that of the 7770 from a visual perspective.
With the price drop of the 7850, we felt we needed to compare it to the 650Ti as well. The 7850 has an obvious advantage in raw horsepower, but again the actual gaming experience did not change drastically until we turned on various nVidia features in games.
As we lowered the resolution the 650Ti was able to max out details and deliver some really exceptional play performance, pegging our vsynced setup at 60 FPS. At 1080 this was not always the case but we did see some very smooth playback, even with high settings.
The 650Ti is a strong card for the price point, but it hit the market with a bit of a limp. If AMD had not done the price cuts they did, this card would rule the price point; as it stands it is just a strong contender. Also, its price point represents an in-between area that can create some confusion for the new gamer.
The straight 650 comes in for less money, and IF you are going to run your gaming at lower than 1080 resolution, it is a strong buy for the price. The 650Ti gives you the ability to bump to that 1080 resolution for a price jump with a nice performance boost, but for another $50 to $60 you can jump to the 660 and get a nice jump again over the 650Ti.
The 650Ti delivers on the target nVidia claimed to be aiming for, giving a solid card that delivers a GOOD 1080 experience at under $200. When put against the original target, the 7770, the 650Ti is a clear win, but the recent price cuts have muddied the waters. When put up against the new prices of the 7850 there is no clear winner, but instead a very competitive matchup where you can choose between raw horsepower or features in some specific game titles.
At the end of the day the 650Ti is a solid card and worth the cost. It is a worthy candidate for new gamers or budget DIYers to consider.
Show segment as aired live 13 October 2012
When I look at the mobile phone world, and even at so many of the enthusiast sites and the advice they offer, I realize that old P.T. Barnum was on his game and dead on when he said, “There's a sucker born every minute.” In the world of technology, too many people are attracted to the latest shiny without ever considering whether their experience will truly be enhanced. This can come from buying a cell phone that only really gives a slightly bigger screen, turns off features we enjoy, and gives us new features that are broken, or from the person who rushes out each year to buy the latest $600 video card because they will get a few more frames per second. We, as a technological culture, are full of “suckers”.
Let's start with the giant elephant in the room, the new iPhone. The masses have rushed out like little bugs drawn to the bright and shiny new phone. Now I am not referring to people with a phone 3 to 4 years old that has worn out and who want to buy the newest. I am talking about the VAST number of people that will pay the full retail price just to say they have the new iPhone, despite their current iPhone still working perfectly for them.
What is interesting is that despite the phone only offering MINOR improvements over the previous model, they will claim it is life changing. In fact it will be barely noticed in real world use, something we see every day. Take for example the craze for two way cameras on phones. When the first phone with this feature released, the lemming rush was on and people flocked to have this MUST have feature. Well, this MUST have feature is now barely used, despite people still listing it as a reason to buy a new phone. In fact, of 26 different people I know (clients and friends) that have this feature, not a single one of them has ever actually used it except for the first few days they had the phone. Take it a step further: only one of those people even knows someone that uses the feature at all. It was a nice shiny, and people want to buy shiny.
In the PC world, one of the biggest lemming rushes I have seen was over the extra features on keyboards. I am talking about all those multimedia keys and web access keys, even the USB ports on keyboards. I remember when the first keyboards with multimedia keys hit the market; people drooled over them. However, within about a week of getting these new keyboards, people found they seldom, if ever, used those keys again. The truth was that while this was a neat feature, its impact on the daily use of the computer was pretty much non-existent. The current boondoggle in the keyboard world is the addition of USB ports on the keyboard. Various reviewers claim they are so important because they make it easy to connect USB devices? Really? Is it easier to blindly connect to the backside of the keyboard, or to take the time to tilt up the keyboard so you can see what you are doing, INSTEAD of just plugging into the front port on your case, which is easy to see? Neither of these features has had any real impact on the computing experience, and yet today many reviewers looking at keyboards will ding a keyboard for NOT having them.
Meanwhile, as we all go rushing to buy the latest shiny new device that will change our computing forever, the tech companies perpetuate this behavior with release schedules designed to make the geeks out there feel outdated within 6 months of their most recent purchase. Helping them keep this steady stream of cash flowing is a majority of the so-called reviewers out there, who perpetuate these buying sprees by posting material full of meaningless numbers and charts to ensure that the people looking at the material are lost. This is done not for direct cash, okay, sometimes it is, but also for the chance to get the shiny new toys to play with.
Do not despair, however; there is a way to counter this trend and even save a little money in the long run. Now, the way to do this is kind of technical, so I want you to read this carefully.
ONLY UPGRADE WHEN YOU NEED TO!
Okay, now I have gone and done it: I have given common sense advice. Well folks, at the end of the day this is what it boils down to. If you can use the apps you want and you can make phone calls, then WHY upgrade your phone? I mean, if it is old and dying, then there is a need. If there is an app you NEED for some reason that will only work on the new phone, then there is a need. However, if your old phone is doing the job and you were happy until some commercial showed you something shiny and the drool began, you can safely wait, save your money, and not hurt your experience in using your device.
The same is true in the PC world: if you wait until an upgrade will make a real difference, you will be amazed. Now I know this might seem to go against some advice I give to businesses about a 5 year upgrade cycle, but that is not the case. For a business, the 5 year cycle is a need, to make sure they are running the newest versions of the software they rely on and to ensure little downtime. For those of us in the real world, in our homes, waiting for the need is not that hard.
This works for the uber-geeks and the gamers as well. The latest video card might look cool, but do you need it? I mean, do you have a game that is not running right? If you were happy with your games before the new shiny hit, then you will be happy still without it. It was a nice change of pace at the latest nVidia briefing to actually see them embrace this idea. They did not tell us why the new 600 series was better than their 500 series. Instead they compared it to cards 3 and 4 generations back. The reason was that the 500 series still gives an awesome gameplay experience, while the older cards are beginning to get a little slow, so the upgrade would impact the user experience, not just frame rates. (BTW, bravo to the folks at nVidia for looking out for the consumers.)
Look, if you need a new phone or component because the old one is broken, then it is just common sense to buy the newest one. If your product is 3 or 4 years old and it does not do some of the new stuff you want to try, then it makes sense to buy the latest and greatest. However, most people are happy with what they have now, well, were, before a commercial changed their mind or some benchmark claimed they are not a true geek without the latest new hardware. It is to those people that old P.T. referred.
Do yourself a favor: sit down, have a cup of coffee, and use your current device. Has anything really changed? Is it not making phone calls, or are your games stuttering all the time? Think before you drop that money on the counter. Sure, it might be shiny, but does it really have enough to it to change the way your experience will feel? Being a “sucker” is not based on intelligence but rather on a choice. Do we choose to be lemmings led by meaningless drivel, or do we think the process through and then make choices based on what is best for our experience, not some corporate bottom line?
Well, we might be a little late to the party with this review, but it is better to be late than never, and definitely better to be late and do it right. When we were briefed on these two cards, I can tell you the 660 was exciting, but I felt the 650 was going to get overlooked. Looking at the reviews around the net I can see I was right, and I am glad we pushed for a 650 to review.
With this in mind, let's begin with the EVGA GTX 650. This is the baseline GTX 650 card, running at stock speeds with a single gig of memory. The 650 is designed to be a gateway gaming card. What that means is that this is the baseline card that someone new to computer gaming might look at. To fully understand this concept, let's put together a scenario that is going to represent a pretty large portion of new gamers.
Your buddy has been telling you for the last 2 months about how much fun he is having playing Champions Online (or any other MMO or online style game you want to name). Sure, he knows you play BF or MW on your Xbox, but the games on the PC offer more choices and options, and he sure wishes you would join him. So you load up the game of choice on the cookie cutter PC you got from Best Buy, and the game runs like CRAP. Your buddy explains that you need a better video card to enjoy the game. Now let's get real: do you think this scenario will end with a $200+ video card? Of course not. You want a basic, low cost card that will let you see what all the craze is about in PC gaming.
Enter the GTX 650 from nVidia. These cards are priced at around the $110 price point, a cost that would be reasonable to the gaming novice. To help that gaming novice along, nVidia made this card with the goal of giving a solid gaming experience at 1080 resolution with middle of the road detail settings in most games. The choice of 1080 for the target is obvious, as it is one of the most used resolutions in gaming and monitor design. The idea of hitting the middle detail levels at this resolution means trying to create realistic performance within the price point.
For our testing I wanted to play out the scenario listed above, so to test the 650 I got a cookie cutter PC using an Intel Pentium G630 processor with 6 gigs of RAM. The system was an Asus built computer that can be purchased at Best Buy, or a similar system, for around $450. This makes it the typical power level of the PCs found in most homes of people not yet into gaming.
As you can see in the picture, the 650 card is small and very compact, which fits well with its target. A lot of the cookie cutter designs that will allow an add-on card use smaller cases, and this small design fits nicely. The specs on the box claim this card needs a 400 watt PSU, and here we hit snag one: the cookie cutter PC we used had a 250 watt PSU. I went along with the scenario I had laid out, however, and pretended not to have the knowledge to foresee this issue. Good news: the card was functional on the 250 watt PSU, and I was even able to play Champions Online, World of Tanks, and a few other games on it. The bad news was that while gaming, the system was pushing its PSU right to its limit the entire time. The PSU held up for testing, but this is NOT a scenario I would recommend. It does show, however, that picking up a good 350 or 400 watt PSU will give you the juice you will need for this card.
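A rough back-of-the-envelope sketch shows why that 250 watt supply was living on the edge. The wattage figures below are ballpark assumptions for illustration (we did not measure actual draw): roughly 65 W for a budget CPU, around 64 W for a GTX 650 class card, and a generous allowance for the drives, board, and fans.

```python
# Rough PSU headroom check. All wattage figures are assumptions,
# not measurements from our testing.
def psu_load_percent(cpu_watts, gpu_watts, rest_watts, psu_rating):
    """Estimated system draw as a percentage of the PSU's rating."""
    draw = cpu_watts + gpu_watts + rest_watts
    return 100.0 * draw / psu_rating

# Cookie-cutter box: ~65 W CPU, ~64 W GPU, ~60 W for everything else.
print(f"{psu_load_percent(65, 64, 60, 250):.0f}% of a 250 W PSU")
print(f"{psu_load_percent(65, 64, 60, 400):.0f}% of a 400 W PSU")
```

With numbers like these the 250 W unit sits around three quarters loaded under gaming, while a 400 W unit runs under half load, which is the comfortable headroom the box spec is really asking for.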
What about the actual game play? The GTX 650 lived up to expectations and delivered a solid gaming experience at 1080 resolutions when the settings were middle of the road. I was able to push the settings up on a few games but the majority would begin to bog down. This is not a bad thing however since this card squarely delivers as promised on the target nVidia set for it.
The GTX 660 is the next step up the food chain of the nVidia 600 series. Priced at around $220, this card is again aimed squarely at the 1080 resolution market, this time with the goal of allowing for high detail levels. The GPU on this card is quite a bit more powerful than the 650's and a step down from the 660Ti's; it might share the 660Ti's name, but not its chip. It does, however, share the feature set of the 660Ti and the higher end cards in the lineup, including the ability to boost its clock speed automatically and to use SLI, both of which are lacking in the GTX 650. The model we got from EVGA is their Superclocked model with 2 gigs of RAM.
While the 650 is the gateway gaming card, the 660 is the mainstream workhorse. The price point keeps it within reach of most people, and the performance gives a nice boost when that new PC gamer wants to take things to the next level. For our testing purposes we used an Intel i5 3450 with 8 gigs of RAM. Again, our goal was to represent a machine that is likely in a scenario where this card would come into play.
We again threw our normal mix of online games at this card, but then added such games as Civilization V, Elder Scrolls, Mafia II, and a few others. The reason for the increased game selection is the change of target for this card. Someone getting this card is likely not new to PC gaming and not being mentored into it by a buddy. This is a gamer that has a more open selection of gaming experiences on the PC and wants to enjoy them. Every game we tested we ran at 1080 with the detail level set to high. As in our testing of the 660Ti and 670, we ran our tests with Adaptive VSync enabled.
From a pure gaming experience point of view, this card delivered in spades; every game had smooth playback and looked great. In fact, only the more hard core gamers I had look at the results could really see any difference between the gaming experience with the 660 and the 660Ti. Once we hit the benchmarks, however, the difference became a little clearer. With the 660Ti I could pretty much peg the frame rates at 60 FPS in our testing (remember, vsync is on). With the 660 the rates dropped to an average of around 47 FPS. Now that is a drop for sure, but remember you are saving around $100 on this card over a 660Ti, and the lower rates were still high enough that the gameplay was smooth.
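One way to weigh that trade-off is dollars per average frame. This is only a sketch: the FPS figures are the ones from our testing above, while the prices are assumed round figures ($220 for the 660 and $320 for the 660Ti) chosen to reflect the roughly $100 gap mentioned, not exact street prices.

```python
# Dollars per average FPS, a crude value metric. Prices are assumed
# round figures reflecting the ~$100 gap; FPS averages are from our testing.
def cost_per_frame(price, avg_fps):
    return price / avg_fps

gtx_660 = cost_per_frame(220, 47)    # ~47 FPS average with Adaptive VSync
gtx_660ti = cost_per_frame(320, 60)  # pegged at the 60 FPS vsync cap

print(f"GTX 660:   ${gtx_660:.2f} per average FPS")
print(f"GTX 660Ti: ${gtx_660ti:.2f} per average FPS")
```

Under these assumptions the 660 actually delivers its frames for less money; the 660Ti's advantage is that its frames come right at the vsync cap, where the experience is at its smoothest.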
These two cards together offer a serious one-two punch for the PC gamer. The 650 is a great entry level card, priced at the perfect point for its target, and offers a performance edge over the competition within its price point. The 660 allows a nice step up, still gives a solid budget value, and again has a performance edge within its price point.
This entire lineup of 600 series cards has been really interesting to watch as they have been released. Each step of the way, nVidia set clear targets for the cards in terms of the type of gamer and the price range they were shooting for, and each time they hit the bull's-eye dead center. This has been arguably the most impressive card line release I have seen. Each card delivers the best performance at its price point and has solid, easily defined lines between each step. The cards are aimed squarely at various ranges of the gamer market, and those ranges are clearly defined. Each of the cards is very efficient in its design, and the gaming experience is outstanding across the board.
Now we have talked about the cards, but we have not mentioned much about the partner that made them, EVGA. Each of these cards is very high quality in its build. They tend to be closer to the baseline design produced by nVidia than other partners, but EVGA manages to take those designs and tweak even more out of them. The cards come with a 3 year warranty, and this can be extended if you desire for a reasonable fee. I can tell you from a quality point of view I have not had a single EVGA product ever fail on me, something few other companies can brag about around our labs.
If you are looking for that gateway gaming card for you or a friend, I cannot recommend the EVGA GTX 650 strongly enough. You can get the 1 Gig card in a Superclocked model for $10 more and I would definitely recommend it. A 2 Gig model is also available, and while it might give a boost, I personally would not go that route with this type of card. The extra memory is nice, but the feature set you need to push to make it noticeable is really outside the target of this card. If you want to go the 2 Gig route, I suggest spending the extra and getting the EVGA GTX 660. The extra cost raises the performance and feature set to a whole new level and still remains at a reasonable price point.
As aired live 15 September 2012
When the GTX 670 was released we were impressed; the card was very fast and offered a great feature set. In fact the only downside we found was the cost. It was just hard to justify our listeners dropping $400 plus for a video card that, while awesome, was mostly wasted on the typical gamer setup of a 1080 resolution monitor. We said then that we were waiting excitedly for the 660Ti, and here it is.
The card we have for testing is the Super Clocked edition from EVGA. What they have done is up the base clock speed from the stock configuration of 915 MHz to 980 MHz. In essence they have taken the boost speed that Kepler offers at stock and made it the stock speed. They have also upped the boost speed from 980 MHz to 1059 MHz. This means the card we are testing comes with a nice overclock out of the gate.
Now if you looked at the above speeds and thought something sounded familiar, you are right. The 660Ti is essentially the same chip we saw on the 670 card. In fact it is identical except for a single aspect: the memory bus has been cut from 256 bits to 192 bits. At higher resolutions with features turned up this could result in a slowdown, but at 1080 resolutions the impact should be minimal.
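To put that bus cut in numbers, here is a quick back-of-the-envelope sketch. The 6008 MT/s effective GDDR5 memory clock is my assumption from the published reference specs for both cards, not a figure from the review itself:

```python
# Theoretical peak memory bandwidth = bytes per transfer x transfers per second.
# Assumed spec: both cards run GDDR5 at an effective 6008 MT/s.
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak bandwidth in GB/s for a given bus width and effective memory clock."""
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

gtx670 = bandwidth_gb_s(256, 6008)    # 256-bit bus -> ~192.3 GB/s
gtx660ti = bandwidth_gb_s(192, 6008)  # 192-bit bus -> ~144.2 GB/s
print(f"670: {gtx670:.1f} GB/s, 660Ti: {gtx660ti:.1f} GB/s "
      f"({gtx660ti / gtx670:.0%} of the 670's bandwidth)")
```

So the 660Ti keeps three quarters of the 670’s memory bandwidth, which squares with the review’s finding that the loss only matters above 1080.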
The package from EVGA is what we have come to expect from the premier video card partner for nVidia. The two power adapters and DVI converter are individually wrapped. The card itself is based off a reference design with some EVGA tweaking under the hood. You get a quick install guide, information on a 3 year warranty, some stickers, a nice case badge and even a poster to hang on the wall of your geek room. What is not shown here is that with this card’s launch various etailers will also be giving a free copy of Borderlands 2 with each card purchased. This is sweet because we are not talking a second tier or older game but one of the most anticipated games of this year.
An examination of the card itself may look familiar if you have seen a GTX 670. It should; for all practical purposes the card’s design is identical. The PCB is a shorter board with the fan section of the heatsink actually off the end of the PCB. For power it uses dual PCIe power connections, but do not be fooled, this card sips juice.
For testing purposes I put the card into my main gaming rig and tested it against a GTX 670 and an AMD 7950. Wait a second, this card costs $300 and you are testing it against a 7950? That’s right. In our briefing nVidia claimed it ran with the 7950 despite being put in the cost bracket of the 7870, so let’s see if they told us the truth. For our testing I used the rig listed below.
- Intel i7 3820 (stock)
- Sapphire Pure Black X79N
- Kingston HyperX 1600 RAM
- Kingston HyperX 3K 240 Gig SSD
- Thermaltake Level 10 GT Case
- Thermaltake Toughpower 850
While many reviews of this card will show you a bunch of benchmark scores, we look at video cards a bit differently on our show. We do run a number of benchmarks, but these are for comparison purposes and we do not give you the actual scores. The reason we do this is that the benchmark system as it exists today does little to tell you about the real world performance you will get with the cards. So we use the benchmarks as a reference and then use subjective, actual-game-play testing to reach our conclusions. This combination of information allows us to give you a simple-to-understand evaluation of the cards.
Let’s begin by looking at something that caught my eye right away when looking at the card’s stats. This card seems to be, for all practical purposes, identical to a GTX 670, so I began my comparison there. I wanted to start with something measurable, so I began my testing with the 3DMark11 test suite. For this comparison I used the default Extreme settings. We did not look at lower settings because, let’s be real, anyone dropping $300 on a video card is NOT going to want to game at lower than 1080 resolutions. When the test numbers came back we found that the GTX 660Ti was only 5.5% behind the GTX 670 overall. Now think about that for a moment: the GTX 670 costs $400 and the GTX 660Ti is coming in at around $300, so 25% less cost but only a 5.5% performance loss? WOW! Now in fairness the model we are testing is not a stock 660Ti, but the SC model from EVGA only costs $10 more.
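For the curious, the cost-versus-performance math above works out like this. This is a toy calculation using only the prices and the 5.5% gap quoted in this review:

```python
# Price/performance comparison using the figures quoted in the review.
price_670, price_660ti = 400.0, 300.0
perf_gap = 0.055                      # 660Ti trailed the 670 by 5.5% overall

savings = (price_670 - price_660ti) / price_670
relative_perf = 1.0 - perf_gap        # 660Ti performance as a fraction of the 670

# Performance per dollar, expressed relative to the 670.
perf_per_dollar_ratio = (relative_perf / price_660ti) / (1.0 / price_670)

print(f"Savings: {savings:.0%}")                       # prints "Savings: 25%"
print(f"Perf per dollar vs the 670: {perf_per_dollar_ratio:.2f}x")  # "1.26x"
```

In other words, dollar for dollar, the 660Ti delivers roughly 26% more performance than the 670 at these prices.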
Benchmarks however are next to meaningless to me; let’s look at gaming performance. Now I could take the time to list all the games we spent time looking at, but the list would be pretty long. It is safe to say I looked at MMOs, RPGs, FPS and RTS gaming. Of all the titles I looked at, most of them can hold 60 FPS like a rock when VSync is enabled. What that means is when I run the games at 1080 with the highest detail levels the games allow and VSync enabled (this usually locks in at 60 for most monitors), the game will run right at the 60 frame limit with only minor drops during game play when using a GTX 670. Well, the 660Ti delivered an identical showing. Skyrim, World of Tanks and many others all pegged the limit and ran as smooth as butter. In fact in NO game tested did I see any performance or experience drop when switching from the 670 to the 660Ti.
After all the testing was done it was clear the 660Ti is the 670 with lower memory bandwidth, and at 1080 resolutions this has no effect on game play worth mentioning. What about the claim that the 660Ti could run with the 7950? All of our testing backed this up. The 7950 is a very capable card, but at no point could it pull a lead over the 660Ti, which again costs less money.
Now I wanted to see how this scaled, so I put the card on an i5 3450 at stock speeds as well as a Phenom II 965 at stock speeds, and the card continued to deliver an outstanding gaming experience. This card is a true gem and continues to put nVidia in the strange place of finding its main competition is itself. With the majority of gamers playing at 1080 resolutions or lower, it is really hard to justify spending $400 for a GTX 670. It is a great card and gives a gamer a ton of headroom to make sure his games today and tomorrow will run well, but for most of us $400 is a tough pill to swallow. Along comes the 660Ti, which is pretty much identical in every way, even going so far as being truly identical in gaming experience at 1080 resolutions, and suddenly the choice of getting a 670 is a little harder to make.
With the Kepler release nVidia has truly raised the bar. The 680 delivered an amazing card at its price point when you considered not just the performance but the low power usage and great feature set. Then the 670 came along and the performance loss was TINY, yet it dropped the price $100 and kept that same great feature set and high level of performance. Now we have the 660Ti and the trend continues. This card drops another $100 off the price but takes only a minor performance hit, and considering the target audience, no performance hit at all over the 670.
With an amazing feature set, nVidia putting more emphasis on gaming and gaming support, a reasonable price, a great game for free AND great performance, this card is a no-brainer for the mainstream gamer wanting the best bang for their buck at the upper end. Sure you can go higher, but if you live in the real world and use a single gaming display at 1080, this is the top of the line, and buying higher nets you nothing of value, just less money for buying games.
As for the specific model provided to us by EVGA? The EVGA GTX 660Ti SC is an awesome card for the money. Only $10 more than the stock speed cards, it gives you a nice little performance boost. You also get EVGA quality, a great 3 year warranty, and the best video card tweaking software on the market, Precision X. If you want to take your gaming to the next level, then this is the card for you!
Show segment as aired live 18 August 2012
nVidia has for some time championed the cause of GPU compute for PCs. They have pushed out their CUDA programming system and shown how it can make the GPU in a computer do way more than put pictures on the screen. They have also discussed gaming with their cards and put some neat features on them, but when you spoke to them the push was always CUDA and GPU computing. That is, until now.
With the first release of the Kepler GPU there was a difference in the attitude of the people at nVidia. In the various briefings I have attended, and even the keynote by their CEO I listened to live, I have not heard the same drive toward CUDA or GPU computing. In fact I barely heard it mentioned at all. What I have heard talked about is computer gaming and nVidia renewing their commitment to it.
As an avid computer gamer, this push and the vigor I hear in their words excites me, so when we got the chance to take our first hands-on look at a Kepler based video card, to say I was excited was an understatement. However I was cautious as well; history has shown us hardware companies love to hype a product, so it is not until I get my hands on it that I actually believe anything they say.
Let’s begin by taking a look at the card itself. The particular card we are looking at is a reference design card provided by nVidia. It came in a rather simple black box, but as I began to open the box it became apparent that nVidia was serious about this being a gaming card. Inside the outer box cover and on the back of the box was printed what you see in the picture to the left.
Since this is a reference design card there was nothing in the box except the card itself. The card in front of us is not a fancy design; in fact from the front it looks very similar to the already released GTX 680. It has the same looking cooler, a design that has over the years become the stock look for most higher end video cards.
In this case the appearances are correct, the cooler we are seeing is the same one used on the GTX 680. The fan and shroud are designed to push the air through the shroud to the back of the case and out vents on the back of the card next to the various display ports. The shroud is actually very tightly enclosed so that air does not escape from the top or the inside thus forcing the air out of the case and removing the GPUs heat from the system.
The design looks very plain-Jane, but again, remember this is a reference design. This same design is in fact used on a lot of the cards on the market, basically just snazzed up with some artwork stuck to the shroud. The cooling solution works well; in our testing we found that the temp when gaming seldom went above 65C, and even with overclocking it did not rise above 70C. The testing was done in a Thermaltake Level 10 GT with the fans all set to low and only the stock fans in use.
Not only is the stock design doing a good job of cooling the card it is also quiet. It was impossible to pick the fan for this card out of the normal sounds coming from the computer unless the GPU was pushed hard with stress testing. Under normal game play the card is very quiet and cool, exactly what we expect and want from a cooling solution.
The Kepler design is super efficient in the way the card handles power and performance. Using a dynamic clock system that monitors the game you are playing, the card kicks the GPU speed up or down as needed to ensure smooth game play. Now kicking the speed up makes sense to a lot of people, but you might be confused by the kicking down. The simple truth is not all games need a card running full tilt to deliver great game play. This design allows the card to pull back on the power of the GPU, still give a great game play experience, and at the same time cut the heat and power usage. This aspect of the design was easy for us to test: we played games and watched the power levels. The 670 delivered on its promise and sipped juice, in many cases actually coming close to much lower powered cards in power draw.
The card comes stock with a good set of connection options: 2 DVI ports, an HDMI and a DisplayPort. Where previous models of nVidia cards required two cards to game on multiple monitors, this card out of the box directly supports three monitors for gaming. Plus you can hook up a fourth monitor for other uses such as watching your voice comms software or searching the internet for gaming hints while you play. The multi-monitor versatility built into this card is very impressive if that is the type of setup you enjoy using.
Flipping the card over I had a bit of a surprise: this card is much smaller than it seems. The fan area of the shroud is not actually over the card’s board but in a separate section. The board itself is tiny; in the picture on the right I wanted to give some perspective. As the picture shows, the 670’s board ends where the fan area starts. The card below it, for perspective, is a Sapphire HD 7850.
nVidia claims the reason for the smaller board is that the chip’s efficiency allows them to use smaller power components, and this allowed them to move the power from the tail end of the card to the business end. This simple move meant that less space was needed for the card’s components, and it has the added benefit of putting the heat producing power components near the exhaust, making them easier to cool.
Okay, so at this point I imagine you are all chomping at the bit: tell us Ed, how does it perform! Well let me put it this way: from a raw horsepower point of view the card performs so well that it steals the thunder from nVidia’s higher end GTX 680. In all of our subjective testing the GTX 670 was able to deliver maxed out detail. Some of the games we ran were Skyrim, Champions Online, Star Trek Online, Civilization, Batman: Arkham Asylum and Mafia II. We did run other games as well, but I think you get the idea. The system we used for performance testing was an i5 2500K on a Gigabyte Z68X-UD3H-B3 with 8 Gigs of Kingston HyperX DDR3 1600 and a Kingston HyperX 3K 240 Gig SSD. Our testing was done at a resolution of 1920×1080 and the system was a typical usage system. What I mean by that is it was not purely for benchmarks; we had Comodo AV, Steam, SkyDrive and Skype running in the background. Our goal is to test the card as you are likely to use it.
Now everyone knows we do not put much stock in benchmarks, and with tons of sites on the web doing extensive benchmark testing I do not see the need to rehash their material here. If you need specific benchmark scores, I suggest heading over to the review of this card done by our friends at Overclockers Club; the numbers they got were in line with our findings. I can say however that the performance is impressive. In Skyrim with the HD pack from Bethesda and maxed out in-game settings I was able to pin the frame rate counter at 60 FPS. That is with VSync enabled, and by pinned I mean it was never under 55 FPS and was mostly right at 60 FPS. I did the same in Champions and Star Trek Online with maxed out settings in each. No card we have ever had in the labs was able to deliver such a consistent frame rate in a game.
However, even with the impressive horsepower, this is not what has impressed me the most. The card is fast, no doubt; in fact it is safe to say it is the fastest card at its price point and is even close in performance to the top end cards from both nVidia and AMD. It is strange to see nVidia building its own competition. What impresses me more, however, is the effort made by nVidia to have a real impact on the gaming experience beyond pure frame rates.
Let’s begin with the efforts to refine anti-aliasing. nVidia introduced FXAA a while back, but with Kepler they have added the ability to force FXAA on, even in older games that did not directly support it. FXAA allows for better image quality with less of a performance hit. In Kepler this is being further enhanced with TXAA, a new method available only on Kepler based cards that allows for even higher levels of anti-aliasing with less performance hit. What this means for all of you is your games look better and you do not lose your smooth game play by using it.
Next we come to GPU Boost. This system monitors the workload on the hardware and dynamically changes the GPU clock speed to ensure it stays within its power/heat envelope yet gives maximum GPU speed. We have seen this feature before in CPUs and it only makes sense to see it move to the GPU. While tweakers can still overclock the card even more if they desire, there is little need, as the card at stock speeds does a lot of auto-overclocking. In fact I do not recall ever seeing my GPU running at stock speeds; it seems like whenever it was under load it was running jacked up. This improves the gamer’s experience by offering extra power when it is needed and reducing power when it is not, thus saving electricity.
We also see Adaptive VSync. Basically this feature allows VSync to work more effectively than the stock on and off options. VSync is the setting that ensures your game does not produce frames faster than the monitor can show them. A lot of gamers turn this off, but doing so risks something called tearing: the monitor cannot keep up and you get textures that tear on the screen. By keeping VSync enabled the gamer avoids this but suffers through some performance issues. Adaptive VSync takes out those potential performance issues by keeping VSync on when the frame rate is at the monitor’s refresh rate and easing it off when the frame rate drops below it. This has the effect of removing the downside of VSync while still giving you no reason to see tearing.
A similar tool we now have is Frame Rate Targeting. The reasoning is that some older games will take the GPU to max power and produce incredible frame rates, but let’s face it, above around 75 FPS it is hard to see any boost in real game play, so 300 FPS is just crazy. Rather than force you to waste the power and potential of your card, the Frame Rate Target lets you set a maximum frame rate your system will strive for, and it actually slows down the GPU once that target is met. Tied in with GPU Boost, this means you only use as much of the GPU as you need. In my opinion this can even replace Adaptive VSync, since it can be used to create the same effect: set your Frame Rate Target at 60 FPS and the GPU will hold the frame rate to 60, with none of the downside of VSync.
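The pacing idea behind a frame rate target can be sketched in a few lines. A real driver throttles the GPU clock rather than sleeping, but the budget-per-frame logic is the same; the function name and numbers here are mine, purely for illustration:

```python
import time

# Minimal frame-rate-target sketch: render a frame, then wait out whatever
# is left of the per-frame time budget before starting the next one.
def run_frames(render_frame, target_fps=60, frames=120):
    budget = 1.0 / target_fps            # seconds allotted to each frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                   # do the actual work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < budget:             # finished early: idle out the rest
            time.sleep(budget - elapsed)

# A trivially cheap "game" still paces out at ~60 FPS instead of thousands.
run_frames(lambda: None, target_fps=60, frames=60)
```

The payoff is exactly what the paragraph above describes: a game that could run at 300 FPS instead idles most of the time, cutting heat and power while the player sees no difference.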
Add to these features the old standards from nVidia of PhysX and their 3D Vision and you have a card lineup that is geared to give the gamers the best experience possible with their games.
Priced at $399 the GTX 670 is at the upper end of the gaming cards in cost. The card sits squarely in the second tier of the nVidia lineup for single GPU cards and is direct competition to AMD’s 7950 and 7970.
The GTX 670 really does change the gaming experience when put to full effect. The horsepower of the card lets you run your games at maxed out detail, and the features make sure the game looks great and the card is efficient in its use. In June nVidia is going to up their game for the user experience with GeForce Experience, which we will review. Right now, however, they offer great support for games on GeForce.com, which has a great section of optimized settings for various game and card combinations.
If you are a PC gamer, then nVidia has you squarely in its sights. They are looking to become the de facto gaming card for the PC. With the Kepler releases to date, and the features they offer aimed at improving the gaming experience, I would say so far they are hitting the target dead center.
nVidia GTX 670 Review as aired 12 May 2012
Buying for the geek in your life can be hard and we all know it. The typical geek gets what he wants when he wants it, and the only things left on his Christmas list are usually items way outside our Christmas budgets. Our current economy has people being forced to cut spending, so these items are even less possible to buy. But fear not, Computer Ed is on the case. I have put together a list of a few items that can fit most budgets and will make any geek happy.
Mousing Surface: We have talked about these on the show, and while they may not seem sexy, they are something that geeks appreciate when it comes to getting the most out of their computing experience. There are a lot of them out there, but only one I would want under my tree. The WoW!Pad is ultrathin and made of PVC, making it super durable. It is available in a few different sizes, round as well as the normal squared look, and ranges in price from $6 to $10 on Amazon. Personally I have used the largest pad for a long time now and still love it. The WoW!Pad also comes in some interesting styles with the Master’s Series, which features a number of great paintings on the pads. No cloth or thick bulky pad here, just clean, durable mousing goodness.
USB Keys: No self respecting geek can ever have too many USB keys, and my choice for years has been the Corsair Voyager line. This USB key is unique in that it has a rubber housing, which makes these drives incredibly durable. How durable, you ask? In our initial testing of these drives at release, the test drive worked perfectly after being dropped 12 stories, run over by a car, stepped on by yours truly, washed three times in a row and chewed on by a dog. All of these tests were done to the same drive, so the effect was cumulative, and it still kept working. Currently you can pick up a 16 Gig USB 2.0 model for under $20 on Newegg. Eight Gig models are consistently below $20, and 32 Gig models can be had below $50. The entire lineup of Corsair Flash Voyager models can be found on Newegg; check them out.
Warming the Heart: Everyone has a person in their lives who is always cold. You know the people I am talking about: they keep a space heater under their desk and even have it on low sometimes in the summer. They might be warm of heart, but their hands and feet are always cold. Well, we have something here to help the hands at least: USB Heating Gloves. These nifty little mitts have two heaters in each glove and are powered by the USB ports on your PC. The finger ends turn up to allow typing or can be turned down for maximum hand warming. At a reasonable price of $22 these are a great gift that can be fun and practical for the female geek in the house. But do not feel left out, guys, there is a model for men as well. You can find these at usb.brando.com.
Boys Just Want To Have Fun: Yeah, I know the song says girls, but the truth is so do boys, and nothing can be as fun for a guy as something wacky and militarily oriented. With that in mind, bring to his desk his very own USB Rocket Launcher.
I know this gets talked about every year, but seriously, these are cool. Priced at under $20 at Think Geek, one of these will make the little boy in your geek really happy this year. The device needs batteries, which is a bit of a bummer, but it can be controlled from the computer to rotate, aim and launch its foam warheads of fun across the room. These are great fun and your guy will love it, that is until mom gets hit in the head and a disarmament is forced on you.
Light Up Their Life: Most geeks, especially gamers, seem to have this affinity for using their computers in darkened rooms. I know I have it, and others I know prefer it as well. For me it stems from the fact that the screen pops more and lets me see my games’ details easier. Whatever the reason, this is just a fact of life. The overhead lights of a room are just too much, and while a small desk lamp is okay, it is difficult to get it to light just the area you want. The good news is that there are a ton of USB lighting systems out there, from traditional desk lamps to flexible neck devices. These flexible lamps are great for laptops, but I like them for some of the new mechanical keyboards. They can plug into the USB hub on something like a SteelSeries or Thermaltake mechanical keyboard and give good lighting for those dark gaming sessions. USB lighting can be priced anywhere from $10 to around $30 and comes in quite a few different styles; check out usb.brando.com for a wide selection.
Game On: Speaking of those dark night gaming sessions, let’s face it, we all have a gamer in our life. With budgets being what they are right now, many of the gamers we know have moved to the F2P model of gaming. While the game is free to play, there are always little items in game that the person might want, and getting the points for those items requires spending a little real cash. With this in mind, a great gift for those gamers can be some of the game store points. These point bundles come in various price packages ranging from $10 to around $30. Some games will require you to buy them through their site, but some, like Wizard 101, have point cards available in stores.
Protecting the Smart Phone: Can you seriously say you know a geek that does not have a smart phone on their hip? If so, you have found a species of geek that will be extinct in a few years. Today smart phones are the craze and we all seem to have them; we also all seem to drop them sooner or later. These are not just phones for us; they can often be our business or, even more important, our portable internet connection! With this in mind we should protect these investments, and nothing says safe to your smart phone like an Otterbox phone case.
These are the premier portable device cases, and while they are not the least expensive, they are the best protection your devices can have. Pricing ranges from as low as $20 to as high as $75 depending on the device you want to protect and the level of protection you want. I have used these, and tested a few; they really are the best protection your phone can have, bar none.
Check for these at your local cell service store, or if they do not have them then head over to the Otter Box site to look at their selection.
Gaming at the Next Level: Okay, we have kept the budget under control a bit, but sometimes it is better to get one uber gift than a lot of cool ones. If you have a budding gamer in your life, there is a chance he is suffering from video card envy. The problem is modern PC games need some umph in their video card to really enjoy them, and most budget PCs do not come with that umph. The good news is the video cards of today often pack some nice gaming power in a reasonably priced package. Video cards based on the AMD 6670 or the nVidia 550Ti can be had in the $100 to $130 price range. These will take a budget PC to a whole new world when it comes to gaming, allowing for good gaming performance without breaking the bank. While you can get these cards locally, I would suggest looking at sites like Amazon and Newegg first to get the best prices.
I could keep going, listing all sorts of other devices for the geek in your life, but we only have so much time before Christmas and I do not want you to spend all of it reading a wall of text. The good news is that you can make your geek happy this year without spending a lot of money. If you have geek gift questions, be sure to email them in or call into the live show. Also listen to the show for more gift ideas over the next week. And of course keep listening and enter our holiday giveaway to maybe win something cool for your geek, or like our Facebook page for a chance to win all year long.
Remember, the family geek is one of the most under appreciated people in the family, at least we often feel that way, so take some time this Christmas to show him how much he means to you. Let him enjoy the new toys he gets this year before you ask him to make your computer work for you.
This is the tagline that AMD now proudly touts to everyone that will listen. Fusion, for those that do not know, is a CPU with a GPU built into it. Why is this so important, after all Intel has already done it, right? Well, my take on this is a bit different from the many pundits who attended this year’s CES and spent their time wetting themselves over a netbook sized product being able to play a video game. While Fusion looks pretty today, it is tomorrow I am looking at to get my geek excitement rolling. Let’s, however, begin with a bit of a history lesson.
A long, long time ago in a PC era far, far away we had the 286 processor. This chip was mighty for its day and delivered outstanding performance, but lacked the ability to handle floating point calculations well. That did not matter, though, because the industry told us floating point was for the real uber geeks and science types. After all, our 286 could run X-Wing, the big game of the day, just fine.
The next generation of chips came along, the 386, and gave us more power but still no floating point. However, for an additional fee we could add a floating point co-processor to give us that extra umph with floating point calculations. The geeks were all over this, and I knew few DIYers who did not add the FPU. In fact, the first system I ever built from scratch was a 386 based computer; I recall happily using a 386DX40 (an AMD chip) to stave off the need to move to a 486 and skip straight to the Pentium.
While the 386 had a readily available FPU add-on, it was still an add-on used only by the real geeks of the world and the science types. However, computer gaming had begun to show it could benefit from an FPU, and since the gamers of the day were adding them, the games started using them more.
So Intel, always looking to make a buck, saw its first real chance to gouge the “enthusiast” market of the day and built two models of the next generation, the 486. The SX series was less expensive but did not have the FPU built in, or we could opt to spend more and get the DX model, which came with a shiny FPU built right into the chip. From that point forward the FPU has been in our CPUs. In fact, we are at the point today where the current generation of “enthusiasts” likely cannot imagine the FPU as a separate chip.
Flash forward to a few years ago, 2007, and a dark lab in the nVidia research park. Out of this lab came the first betas of something known as CUDA. It seems some whiz kid there had come up with a neat idea: video cards did not have to be just for putting things on the screen, they could be used in other capacities! (As a disclaimer, I do not know if nVidia was the first to conceive this or not, but this is my story and I will tell it like I want.) So the push began to make the GPU more than it was before. Of course, as happened back in the day, we were told by many that we did not need this extra ability, that only the uber enthusiasts or the science community would make use of it. (Sadly I must admit I was one of those in the early days, but over the last year I have changed my position.)
Fast forward to today and we see that demand growing. nVidia has leveraged their system to the point it can be used to enhance physics in gaming environments and assist in various scientific calculations. AMD has seen the advantage grow enough that they are now investing in building an infrastructure that is more open than CUDA to do essentially the same thing. Microsoft, seeing where this is going, has jumped into the game with the addition of DirectCompute to DX11.
We see this technology in use in the computing world with various cooperative computing projects, most notably Folding@Home. There are of course games supported by nVidia that make use of PhysX (their physics model), but to me the biggest sign of where this is heading came when Firaxis used DirectCompute in the release of Civilization V to handle texture compression. They leveraged the parallel horsepower of the GPU, a style of computing the traditional CPU has trouble with, to allow for a more complex compression system that runs faster thanks to the way the GPU calculates information.
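To see why the GPU helps with something like texture work, it helps to look at the shape of the computation. Here is a minimal sketch in Python with NumPy (purely illustrative, not the actual Firaxis code): the same per-pixel calculation written as a one-at-a-time loop, the way a traditional CPU program runs, and as a single data-parallel expression, the form that DirectCompute, CUDA, and OpenCL hand off to the GPU's many cores.

```python
import numpy as np

# A small fake RGB frame; a real texture would be far larger.
rng = np.random.default_rng(0)
pixels = rng.random((64, 64, 3))

def brightness_loop(img):
    """Serial style: one pixel at a time, like a traditional CPU loop."""
    out = np.empty(img.shape[:2])
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            r, g, b = img[y, x]
            out[y, x] = 0.299 * r + 0.587 * g + 0.114 * b
    return out

def brightness_parallel(img):
    """Data-parallel style: every pixel in one expression, the kind of
    work a GPU spreads across hundreds of cores at once."""
    return img @ np.array([0.299, 0.587, 0.114])

# Both give the same answer; only the shape of the computation differs.
assert np.allclose(brightness_loop(pixels), brightness_parallel(pixels))
```

On a CPU the two forms differ only in speed, but the second maps directly onto the GPU programming model: no pixel depends on any other, so all of them can be computed at once.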
You see, while the CPU is a powerful brute, there are other ways of calculating data that the CPU is not as efficient at. Wait, that sounds a lot like the 286 days, doesn't it? With this in mind it only makes sense that the GPU begin to make its way into the CPU. Instead of existing as a co-processing unit that only a few people will use, it can now be considered a part of every PC with the coming of the Fusion chip.
This inclusion of a fully functional GPU that works with the various GPU computing environments means that programmers now have another tool in their hands to make programs do more and do it faster. I had the opportunity to see a program, still in development, that harnesses this power for facial recognition. Imagine the possibilities for the non-gamer. The genealogist with a few thousand photos could identify a person in just a few of them and then sit back while the computer goes through all the pictures and finds the person they are looking for. Oh sure, this can be done now, but it would take hours; I am talking about doing it in minutes!
The integration of the FPU into the CPU all those years ago was a big deal then, but today it is common fare. The same will happen with the GPU being added today, but I feel for the wrong reasons. Yes it is cool that the new chip will allow for lower cost PCs and gaming on smaller systems. Yes it is neat that the APU, as it has been dubbed, uses less power and runs cooler. But forget graphics for a minute and imagine the new possibilities. This is something that will touch everyday users, not just geeks.
Now in fairness I am looking 5 years and more down the road. The APU that AMD has released is just the birthplace, not the realization, of the vision. There is still work to be done on the programming end, and AMD needs to get more aggressive about its efforts to get DirectCompute and OpenCL into the mainstream. However, we are beginning to head down that road.
When I say Fusion is the future, I am saying that we are seeing today a fundamental change in the way the CPU functions. A change that is bigger than the various new processor designs we have seen over the years. We are seeing the potential of the CPU allowing programmers to be even more inventive in what they create, because they will now have a standardized, whole new set of programming options open to them.
I believe what we are seeing today is more than the ability of a single chip to put graphics on the screen or even play games. Just like with the 486DX, we are seeing the beginning of a new processor design: the integration of a new co-processor into the CPU that will open up its capabilities.
Fusion Segment as it Aired Live on 9 January 2011
Usually this blog and show cover basic mainstream product lines, but we had a unique opportunity this week when fortune brought us two cards from the “luxury” end of the video card market for a direct head to head comparison. The above-$300 price range is typically the territory of the extreme gamer, the heavy duty folder, or the middle aged geek having a mid life crisis. With this in mind I took up the challenge of putting the EVGA GTX 570 SC head to head with the newly released XFX HD 6970. Both of these cards come in very close to the $350 price point and in my opinion represent the highest point that the majority of us would ever consider. Oh, do not get me wrong, there are more expensive cards, but most people have a hard time spending this much, let alone more.
Since this blog and my show are directed at the more mainstream user, we will be testing the products at the 1080 resolution with a 1055T processor to reflect a more mainstream approach, and we will use part of this review to compare what this higher cost “luxury” level component offers and whether it is worth the cost over a more typical mainstream card.
The two cards I am looking at today are both off-the-shelf models, not the typical engineering samples we usually get for review. The EVGA GTX 570 is a fairly new release, with nVidia getting these chips out so they could steal some of AMD's thunder for the release of the 6900 series. The XFX 6970 is built using the recently released AMD chip and is the current high end card of the AMD lineup.
Right out of the gate it is easy to begin comparisons. The cards actually have a similar look as far as fan placement and the plastic shroud. The 570, however, is almost a full inch shorter than the 6970. Both cards use top mounted power connectors, with the 570 using dual 6-pin connectors while the 6970 requires an 8-pin and a 6-pin.
Moving around to the connection end of the cards, we find the 6970 has dual DVI as well as dual DisplayPort connectors plus an HDMI; the 570 sports dual DVI connectors and an HDMI connector. Looking closer we see that the 6970 has a smaller heat exhaust port than the 570; we will see what this means for heat in a bit.
Since these cards are really meant for one thing, gaming, I began my testing by firing up a few games that I have lying around. For the MMO market we tested using STO; for the FPS market I fired up Mafia II and Medal of Honor; I also took a look at F1, Supreme Commander II, and Civilization, as well as Mass Effect 2. I figure this gave me a solid look across the various play styles. As we neared the end of the review I was able to squeeze a few more titles in for the subjective testing.
Across the board the system was set with default driver settings, game settings at maximum everything, and the resolution set to 1920×1080. I can tell you the results were a pure joy to behold, in that everything I threw at these cards played smooth and looked great on both. As for the benchmark numbers, it was a dead heat for all intents and purposes. The 570 won a few more than the 6970, but not all of them.
I threw a few tessellation tests into the mix because this was the early claim of superiority from AMD on the initial DX 11 releases, and they had since fallen behind a superior system from nVidia; could the 6900 series recover? Well, it was a nice boost over the old 5800 series and shows AMD is moving in the right direction, but with tessellation running full bore the 6970 just could not catch the 570.
As I said earlier, one of the big draws besides gaming or trying to make yourself feel techier is the use of the parallel processing power for folding or distributed computing. Folding@Home is the single biggest use for this I can find looking through the various enthusiast web sites, so I fired it up and put these two cards to the test. The difference was STAGGERING, with the 570 destroying the 6970 in this test. Now I know people are going to point out that the Folding software is heavily optimized on the nVidia end, and it shows. However, as I pointed out, this is a PRIME buying consideration for this level of card, and thus the reasoning does not matter to the consumer; the results do, if this is something that is important to them.
As we noticed looking at the cards, the nVidia card used dual 6-pin power while the AMD required an 8-pin and a 6-pin. What does this mean for power consumption? Nothing. When we looked at the power draw of the system using both cards at idle, the numbers were nearly identical, with the 6970 drawing a couple of watts less. At load the difference stayed, with the 6970 actually using a little less power.
We had also noticed that the 6970 had a smaller exhaust port than the 570; our concern was that this would result in less heat expulsion and perhaps higher temps. Our suspicions were right. The 6970 was actually 4°C warmer under load, but that was not all. The smaller exhaust area seemed to also result in more of the warm air escaping into the case, with the overall internal case air temp rising 3°C. Despite using less power, the 6970 was leaving more heat around.
Both cards had very quiet fans. Despite running full tilt, neither card pushed its fans above 50% of their speed, and both were whisper quiet.
Our subjective testing revealed what I expected after my first test run: there was no noticeable difference in gaming experience with either card. Well, at least not from a performance viewpoint. We did find some issues with a few games that required certain settings be lowered or turned off for the 6970 to run without crashes. This brings up a point that will be addressed in the next few weeks in a different article.
The final comparison we performed was to look at these two cards against their mainstream counterparts, the GTX 460 and the 6870. The result was that at 1080 the lower cost cards delivered a nearly equal gaming experience across the games we tested. Notice, however, that it was only nearly equal: the higher settings gave the games a slightly better appearance. Among my testers, the unanimous choice was nonetheless to save the money based on the level of difference they saw.
With all this in mind it is hard to call a clear winner between these two cards. The price points are within $10 to $20 of each other, and at this range that is not a difference worth noting. The heat and power consumption numbers are very close, and nothing really stood out on either card from these numbers enough to make a clear choice. With those areas in a virtual tie, we then move to the intangibles.
The 6970 offers a greater level of freedom when it comes to hooking up displays. Using a single card to drive a larger number of displays, as many as six, is nice but still hampered by cost and, let's face it, room. However, in the rarified air of the “luxury” users this is still a very viable consideration.
The 570 brings to the table a mature GPU computing environment with CUDA. While DirectCompute and OpenCL might be the future for GPU computing, that future is still a ways off; CUDA is a much richer and more mature environment at this time. PhysX cannot be ignored either. While Bullet is making inroads, we still must deal with the current situation, and PhysX is out there in the wild and usable today. Finally we come to 3D. I am not a fan of this technology, as many know, but the truth is that it is here, the masses want it, and that is the reality. The other reality is that nVidia is a good bit ahead of AMD in this technology and its implementation.
While these are both great cards and ANYONE would be happy using them, I have to give this horse race to the GTX 570 by a nose. The raw performance is close enough that the cards are dead even at that stage, but when it comes to making use of the technology that is out in the wild today, the GTX 570 has a solid edge.
As for using this level of “luxury” card instead of a mainstream one, I am torn. While the subjective reviewers thought it was not worth the cost, I am not so sure. The extra $150 gets you nicer image quality and a chance to hold off a bit longer on the next upgrade. Remember, however, that the only reason to really do this is gaming, pure and simple. Other uses of the card do not see any real improvement over a solid mainstream card; okay, you can fold a bit more.
AMD made some great strides with the 6900 series and I think they are heading in the right direction, but they got sucker punched by nVidia. The 6900 was supposed to go head to head with the GTX 400 series, and in that fight they would have had a clear win. However, nVidia pulled the GTX 500 series rabbit out of its hat and then threw a serious roundhouse with the GTX 570 at the last second, catching AMD off guard. Where the 6900 should have been a leader, it is now in a tight race.
Horse Race Segment Aired 19 December 2010