NVIDIA's Fermi GF100 Facts & Opinions
NVIDIA's "Fermi" next generation GF100 GPU is not here yet. Nope, we do not have hardware. But NVIDIA has given us an in-depth look at the specifics behind the architecture as it relates to gaming. NVIDIA certainly remembered us gamers and the fact that we like lots and lots of polygons.
Brent’s GF100 Thoughts:
All of the work put into making the GF100 a geometry powerhouse makes me raise my eyebrows in extreme interest. On paper it sounds like the GF100 may trump one of the Radeon HD 5000 series’ strong "future proof" check boxes: DX11 Tessellation usability. The Radeon HD 5000 series may have been first to market with this technology, and game developers are using these cards for development right now, but the GF100 has the potential to increase the actual usability of that feature, at least on paper.
However, right now there are unknown variables about the GF100. While we have a good idea about the inner workings of the GPU, there are other factors we simply do not know yet. Not much is really known about pixel shader performance. Without knowing how the clock domains are set up, and what the actual clock speeds are, we can’t even guess at what pixel shading performance will actually be like.
Then there are the variables that will make or break this GPU. We don’t know what the cost of video cards based on this GPU will be. We don’t know how much power this GPU is going to demand and what kind of power supply will be required to operate it. We don’t know the kind of heat output it will generate and what kind of cooling solutions will be required. We don’t know NVIDIA’s production and yields of this GPU. Lastly, we don’t know what availability is going to be like.
When all is said and done, the GF100 sounds impressive in terms of feature set and performance for gaming, but we are far away from being able to tell you what kind of gaming experience it is going to bring to the table. We are also far away from being able to tell you if it is a good value or not based on other physical factors. We’ve been burned before.
NVIDIA has an uphill battle to fight with the GF100, a lot is riding on it, and the Radeon HD 5000 series is certainly the mark it has to beat. Only a handful of GF100s are actually in game developers’ hands at this point, while Radeon HD 5000 series cards have been in their hands since last summer. Don’t think AMD will remain silent when the GF100 is launched. Don’t expect to see GF100 until late March either.
With GF100 hardware on the horizon, and information on the GF100 as it relates to gaming now somewhat known, the question is, "Do I wait for GF100 or do I purchase a Radeon 5000 series card now?" In my opinion, the answer is quite simple right now. With all these unknown variables I would buy a Radeon 5000 series video card right now and enjoy gaming with the fastest current GPU for gaming, and enjoy an Eyefinity experience. If GF100 is released, and it turns out to offer more than the Radeon HD 5000 series for the factors that matter most to me, then I would sell my Radeon HD 5000 series video card and upgrade to the GF100. If however, it turns out it doesn’t offer what I need, then I would rest happy that I made a good buying decision. Either way, it is still gaming full of win.
Kyle’s GF100 Thoughts:
To be perfectly honest, I have not been looking forward to GF100. My expectations have been terrible at best. I think more than a few folks share my expectations. The bad news is NVIDIA has certainly had to weather more than a few "negative" months in the hardware and gaming communities. NVIDIA has not produced, and has simply gotten man-handled by AMD’s 5800 series. The situation would be much worse for NVIDIA had TSMC actually been able to build 40nm 5800 GPUs for AMD the way AMD expected. The good news is that a buying public with lowered expectations is harder to disappoint, so maybe all of this will work out for Team Green. Still, there are plenty of folks sitting around with expectations of "I waited for it this long, it better not suck." And "suck" in the hardware community means coming out second and not totally smashing the competition. Funny, but it feels that way sometimes.
The whole set of PolyMorph (terrible name) out-of-order engine changes is most likely brilliant. NVIDIA was specific that this is the reason GF100 is late to market. Levels of AA and screen resolution (outside of multi-display Eyefinity and the upcoming NV Surround) are no longer a gaming issue for the most part, but we all know we could use more polys on the screen, and this is exactly what NVIDIA is looking to deliver. I think it is a great idea. ATI has been traveling down this road for years now, but has never really reached a true destination on the desktop. If we see NVIDIA trump the now-AMD in this department, it will be a big slap in the face.
GF100 still supports 3D Vision. I have to say, I still don’t like it. I have committed to NVIDIA to spend more time with it in the upcoming months so as to give it more of a chance. Please take the poll in our forums and let us know your thoughts as well.
GF100 will also support NVIDIA Surround (another terrible name, not to be confused with anything Surround Sound related), which is its answer to AMD’s Eyefinity multi-display gaming technology. The fact of the matter is that NV Surround is a knee-jerk reaction to the competition’s success in this arena. GF100 will NOT support 3 displays off a single-GPU video card. This obviously raises the price of entry for NV Surround tremendously. On the upside, NVIDIA is going to extend the NV Surround feature to work with some of the current generation of GeForce video cards, surely the GTX 285 cards; beyond that I am unsure. And with an NV Surround compatible SLI GeForce configuration you will be able to easily use three DVI panels, which can be a hassle to do with Eyefinity. Multi-monitor gaming is "the next big thing," and I don’t care how you get there, but you do want to get there.
NVIDIA’s omission of any discussion of die size, power usage, and clocks is very disconcerting. It was explained that these facts were skipped over so as to not give AMD an informational and competitive advantage. While I am sure you could argue this as true, it is uncharacteristic for NVIDIA to "hide its light under a bushel." Let’s face it, these guys will send out self-serving PR every time the paper gets changed in the executive washroom. I would suggest that these specifications are not being disclosed because NVIDIA does not want to see its stock take a beating over building a huge power pig of a GPU, hence forcing another TP change. GF100 is going to be big and hot and require plenty of airflow. NVIDIA’s stock has been heading steadily down since its CES announcements, and I don’t think a day of "paper launches" is going to bring it back around.
After spending time with NVIDIA on GF100, I have actually come away feeling more positive about it performing quite well. When I saw the GF100 playing games, it was doing so at a very fast framerate. I think GF100 is going to be faster than the 5870. Is GF100 going to be a better value? I am not so sure about that. The fact is the 5850 is the best value in "real" gaming right now, and this is where NVIDIA is going to have to compete. I am not sure NVIDIA is going to be able to do that for quite a while.
The Bottom Line
We have come away more excited about GF100 (Fermi) than we have ever been. The design of the Fermi architecture is very innovative and NVIDIA engineers get big kudos for thinking outside of the box, or in this case, the traditional graphics pipeline. This thing has got several magazine covers in its future and an engineering award or two for sure. Hopefully the next green GPU "cover" you will see here will be on GF100 performance, value, desirability, and whether or not those concepts got baked in with the last GPU spin.
"Rumored Fermi Videos" - Yes, these are in fact authentic videos of a GF100 triple card setup in action: 3 X GPGPU for the Ray Tracing Demo, a single GPU for all others. The systems are all 3.2GHz Intel Core i7 boxes, and the comparison box used in the Far Cry 2 benchmark shots is a GTX 285. The videos were leaked accidentally by PCPer.com.