This is the tagline that AMD now proudly touts to everyone who will listen. Fusion, for those who do not know, is a CPU with a GPU built into it. Why is this so important? After all, Intel has already done it, right? Well, my take on this is a bit different from that of the many pundits who attended this year's CES and spent their time wetting themselves over a netbook-sized product being able to play a video game. While Fusion looks pretty today, it is tomorrow that gets my geek excitement rolling. Let's begin, however, with a bit of a history lesson.
A long, long time ago in a PC era far, far away we had the 286 processor. This chip was mighty for its day and delivered outstanding performance, but it lacked the ability to handle floating-point calculations well. However, that did not matter, because the industry told us that floating point was for the real uber geeks and science types. After all, our 286 could run the big games of the day just fine.
The next generation of chips came along, the 386, and gave us more power but still no floating point. However, for an additional fee we could put in a floating-point co-processor to give us that extra oomph when dealing with floating-point calculations. The geeks were all over this, and I know of few DIYers who did not add the FPU. In fact, the first system I ever built from scratch was 386 based; I recall happily using a 386DX40 (an AMD chip) to stave off the need to move to a 486, skipping straight to the Pentium instead.
While the 386 had a readily available add-on FPU, it was still just that, an add-on, used by no one except the real geeks of the world and the science types. However, computer gaming had begun to show it could benefit from an FPU, and since the gamers of the day were adding them in, the games started using them more.
So Intel, always looking to make a buck, saw its first real chance to gouge the "enthusiast" market of the day and built two models of the next generation, the 486. The SX series was less expensive but did not have the FPU built in; alternatively, we could opt to spend more money and get the DX model, which came with a shiny FPU built right into the chip. From that point forward the FPU has been part of the CPU. In fact, we are at the point today where the current generation of "enthusiasts" likely cannot imagine the FPU as a separate chip.
Flash forward to a few years ago, 2007, and a dark lab in the nVidia research park. Out of this lab came the first betas of something known as CUDA. It seems some whiz kid there had come up with a neat idea: video cards did not have to be just for putting things on the screen, they could be used in other capacities! (As a disclaimer, I do not know if nVidia was the first to conceive this or not, but this is my story and I will tell it like I want.) So the push began to make the GPU more than it had been before. Of course, as happened back in the day, we were told by many that we did not need this extra ability, that only the uber enthusiasts or the science community would make use of it. (Sadly I must admit I was one of those voices in the early days, but over the last year I have changed my position.)
Fast forward to today and we see that demand growing. nVidia has leveraged their system to the point that it can be used to enhance physics in gaming environments and assist in various scientific calculations. AMD has seen the advantage grow enough that they are now investing in building an infrastructure that is more open than CUDA to do essentially the same thing. Microsoft, seeing where this is going, has jumped into the game with the addition of DirectCompute to DX11.
We see this technology in use in the computing world with various cooperative computing projects, most notably Folding@Home. There are of course nVidia-supported games that make use of PhysX (their physics engine), but to me the biggest sign of where this is heading came when Firaxis used DirectCompute in the release of Civilization V to handle texture compression. They leveraged the parallel horsepower of the GPU, a style of computing the traditional CPU struggles with, to allow a more complex compression scheme that runs faster thanks to the way the GPU crunches data.
You see, while the CPU is a powerful brute, there are other ways of calculating data at which the CPU is not as efficient. Wait, that sounds a lot like the 286 days, doesn't it? With this in mind, it only makes sense that the GPU begin to make its way into the CPU. Instead of existing as a co-processing unit that only a few people will use, it can now be considered a part of every PC with the coming of the Fusion chip.
This inclusion of a fully functional GPU that works with the various GPU computing environments means that programmers now have another tool in their hands to make programs do more and do it faster. I had the opportunity to see a program, still in development, that harnesses this power for facial recognition. Imagine the possibilities for the non-gamer. A genealogist with a few thousand photos could identify a person in just a few of them and then sit back while the computer goes through all the pictures and finds that person everywhere else. Oh sure, this can be done now, but it would take hours; I am talking about doing it in minutes!
The integration of the FPU into the CPU all those years ago was a big deal then but is common fare today. The same will happen with the GPU being added now, though I feel it is being celebrated for the wrong reasons. Yes, it is cool that the new chip will allow for lower-cost PCs and gaming on smaller systems. Yes, it is neat that the APU, as it has been dubbed, uses less power and runs cooler. But forget graphics for a minute and imagine the new possibilities. This is something that will touch everyday users, not just geeks.
Now, in fairness, I am looking five years and more down the road. The APU that AMD has released is just the birthplace, not the realization, of the vision. There is still work to be done on the programming end, and AMD needs to get more aggressive in its efforts to push DirectCompute and OpenCL into the mainstream. However, we are beginning to head down that road.
When I say Fusion is the future, I am saying that we are seeing today a fundamental change in the way the CPU functions, a change bigger than the various new processor designs we have seen over the years. We are seeing the potential for programmers to be even more inventive in what they create, because they will now have a standardized, whole new set of programming options open to them.
I believe what we are seeing today is more than the ability of a single chip to put graphics on the screen or even play games. We are seeing the beginning of a new processor design, just as with the 486DX: the integration of a new co-processor into the CPU that will open up its capabilities.
Fusion Segment as it Aired Live on 9 January 2011