I have a Computer Engineering degree from a pretty well respected university, so you might think my opinion on this subject would actually have some sort of merit. I have been thinking about it a lot, and I really don’t think anything I can say on this subject is any more relevant than what a 6-year-old would say, but I’ll give it a shot.
Recently, a couple of things have happened between Intel and nVidia, a little bad blood you could say. Essentially, Intel has claimed that in the not-so-distant future the graphics card as a separate PC component will become unnecessary, as Intel works to move graphics processing onto the die of the CPU itself. Now before I go into nVidia’s rebuttal, I would like to focus on this claim for a moment and analyze it.
I believe that in the future chip manufacturers will absolutely be able to bring the graphics processor onto the CPU chip; in fact, there is no doubt in my mind they could. Could it be as good as a separate card? Absolutely, probably better. Now while I have no doubt in my mind that this will happen, the question remains: will it put the third-party GPU out of the picture? I tend to think not, for several reasons.
We are rapidly approaching a point where the speed of the desktop CPU is so fast that it need not be any faster. When I say desktop, I am of course referring to the consumer desktop market. If you have the latest quad-core Intel CPU in your desktop, I cannot think of many common everyday tasks that would require a new CPU anytime in the foreseeable future. The only thing, in my experience, that has EVER warranted a PC upgrade within the last 5-7 years has been keeping up with gaming demands. This means that if Intel integrated a graphics processor into their CPUs powerful enough to play the latest games at launch, consumers would be forced to upgrade their entire PC as soon as that graphics processor became outdated, which would probably happen quickly (in under a year, as is the case with current graphics technology). Since the CPU is the core of the system, they would have to upgrade whether or not they actually needed a faster CPU. Consumers don’t like this. I would not like this. Upgrading an entire system simply because you cannot upgrade it on a component-by-component basis is very unfair to consumers.
Along the same line of thought, I tend to think the third-party GPU as an add-on card in an existing system is quite convenient. If the powers that be could simply settle on one interface fast enough to support the demands of at least a few years to come, a decently built system with the right CPU/RAM/graphics interface would likely be able to support several generations of graphics cards. Upgradability at a fraction of the cost of an entirely new PC is essential.
We are getting to a point where graphics cards will not be pushing the limits of the rest of the system as they have in the past. I do not think you will see as many cards coming out that are bottlenecked by other components in a PC. There was a time when you could upgrade your video card past what your system could handle; I tend to think the most recent systems, specced out properly, should last much longer. The only reason I have upgraded anything recently was not to get a faster CPU; in fact, Intel has been kind enough to keep the same damn socket for almost 3 years now. I upgraded to obtain things like PCI Express 2.0, faster bus speeds, and DDR3 memory. I probably didn’t even need any of it, but it was a convenient excuse.
This leads me to comment on what nVidia said in rebuttal to Intel’s claims of the third-party GPU going the way of the dinosaur:
Basically the CPU is dead. Yes, that processor you see advertised everywhere from Intel. Its run out of steam. The fact is that it no longer makes anything run faster. You don’t need a fast one anymore. This is why AMD is in trouble and its why Intel are panicking. They are panicking so much that they have started attacking us. This is because you do still [need] one chip to get faster and faster – the GPU. That GeForce chip. Yes honestly. No I am not making this up. You are my friends and so I am not selling you. This shit is just interesting as hell. [More Here]
I think what they are saying here is a bit harsh and presumptuous; to suggest that Intel is dead, or even in decline, is simply ridiculous. Intel is kicking ass right now. They are destroying AMD, entering the hand-held market, and making better processors cheaper and faster than ever before. However, I do think nVidia is correct to defend itself. They should be careful though: the best motivation is always the motivation to prove someone else wrong, which is exactly the motivation nVidia is giving Intel here.
The GPU as a separate entity, while it might stand in the way of progress in the grand scheme of the computing world, is a necessity in the economy. Competition breeds progress, and in turn makes that progress more accessible to the rest of us.
Ever since ATI merged with AMD (a move I thought was good, having been a big ATI/AMD fan), I have thought about the possibility of nVidia merging with Intel. After seeing this little bout between the two companies I have become doubtful of that possibility, but if it ever happened, I have a feeling they would be an unstoppable force in the market, which would probably lead to some sort of evil monopoly. Until that happened, though, I have a feeling you would see some real innovation there.
My dream is to one day build a gaming PC that won’t become outdated within 3 days of installing an OS on it; and if it does become outdated, I would much rather drop a $500 graphics card in it and keep on playing than have to upgrade the entire system from the ground up. If the rest of the system does not need to be upgraded, it will be companies like ATI and nVidia who allow me to do that, and for that alone, I am rooting for them.
Between the current bus speeds, DDR3 memory, and all the other new super-fast bells and whistles the latest technologies have brought us, what they should really do is just put a socket on the motherboard itself for a GPU instead of changing the damn interface every few months. PCI-e 2.0? Jesus, looks like it’s time to upgrade, again.