Introduction
Microsoft just released Windows Vista Service Pack 1, but that doesn't mean they are resting on their laurels. Among other things, Service Pack 1 includes support for DirectX 10.1, something ATI is profoundly happy to mention time and time again, as only their graphics cards currently support DirectX 10.1.
Isn't it odd that NVIDIA has yet to include support for DirectX 10.1, even in their latest releases, the GeForce 9600 GT and the GeForce 9800 GX2? In fact, they have virtually thumbed their noses at that "insignificant" update. Well, they have their reasons. Six years ago, at SIGGRAPH 2002, NVIDIA's chief scientist, David Kirk, was already talking about ray tracing:
“I’ll be interested in discussing a bigger question, though: ‘When will hardware graphics pipelines become sufficiently programmable to efficiently implement ray tracing and other global illumination techniques?’. I believe that the answer is now, and more so from now on! As GPUs become increasingly programmable, the variety of algorithms that can be mapped onto the computing substrate of a GPU becomes ever broader.
As part of this quest, I routinely ask artists and programmers at movie and special effects studios what features and flexibility they will need to do their rendering on GPUs, and they say that they could never render on hardware! What do they use now: crayons? Actually, they use hardware now, in the form of programmable general-purpose CPUs. I believe that the future convergence of realistic and real-time rendering lies in highly programmable special-purpose GPUs.”
Since then, NVIDIA has taken quiet but real steps in moving towards ray tracing as the future of real-time 3D rendering for games. The G80 and subsequent G92/94 architectures, for example, have been designed for general purpose programming. NVIDIA has even created a hybrid CPU/GPU ray tracing renderer called Gelato to make use of the new GPGPUs (General Purpose GPUs).
Even ATI has not been sitting still. Their bluster about the advantages of DirectX 10.1 over DirectX 10 aside, the people at ATI are not stupid. They have seen the writing on the wall. This is one of the reasons ATI chose to merge with AMD.
Needless to say, Intel has been actively working on ray tracing. It is to their advantage if ray tracing takes off because, unlike rasterization, ray tracing works best on multi-core processors. It also obviates the need for the GPU, which has stolen much of the limelight in recent years.
In October 2007, Jeffrey Howard wrote two Research@Intel articles (first article, second article) about Intel's ray-tracing work. The first described Daniel Pohl's work modifying Quake IV to run on the Intel ray-tracing engine. Using just an 8-core Intel processor, Daniel was able to achieve almost 100 fps at a resolution of 1024x1024.
With further optimizations and even faster processors, it's not unthinkable that even the latest crop of games would be able to run at incredible frame rates without the help of any GPU. The best thing about ray tracing, according to Jeffrey, is that it's extremely scalable: replace the 8-core processor with a 16-core processor and you get roughly twice the frame rate, and so on.
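Jeffrey's scaling claim follows from ray tracing being embarrassingly parallel: every pixel's ray is computed independently, so a frame can be split across however many cores you have. Here is a minimal sketch of that idea; the `trace_pixel` stand-in is hypothetical, not Intel's engine.

```python
# Toy illustration of why ray tracing scales with core count:
# each pixel is independent, so rows can be farmed out to worker
# processes with no shared state between them.
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 64

def trace_pixel(x, y):
    # stand-in for a real ray cast; any pure per-pixel function works
    return (x * 31 + y * 17) % 256

def render_row(y):
    return [trace_pixel(x, y) for x in range(WIDTH)]

def render(workers):
    # split the frame row-by-row across 'workers' processes
    with Pool(workers) as pool:
        return pool.map(render_row, range(HEIGHT))

if __name__ == "__main__":
    # the image is identical no matter how many workers render it;
    # in the ideal case, doubling workers halves the wall time
    assert render(2) == render(4)
```

In practice scaling is not perfectly linear (the scene data still has to be shared and rays diverge), which is why the near-linear figures Intel reports are notable.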
That leads us to Microsoft. Even they know that the release of DirectX 10.1 is not going to make Windows Vista any more attractive to gamers. Numerous articles have been written about the image quality differences between DirectX 10 and DirectX 9 and they all tell the same story - it just isn't all that significant.
Now, ray-tracing would certainly be something else altogether. Take a look at the example on the right (from Intel). See the rendering difference between a rasterized image and a ray-traced image? Notice the more realistic reflections and shadows in the ray-traced image. Would you switch to Windows Vista if doing so allows your games to look THAT realistic? Hell, yeah!
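The realism comes from the per-pixel work a ray tracer does: it fires a ray through each pixel, tests it against the scene, and spawns secondary rays from the hit point for reflections and shadows. A toy version of the core test, ray against sphere (purely illustrative, not Intel's renderer):

```python
# Minimal ray-sphere intersection: the basic test a ray tracer
# runs for every pixel's ray. Solves the quadratic |o + t*d - c|^2 = r^2.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest hit in front of the ray, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2*a)   # nearest of the two roots
    return t if t > 0 else None

# A ray shot down the z-axis at a unit sphere 5 units away hits at t = 4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

Reflections and shadows then fall out naturally: from the hit point you cast one more ray toward the light (shadow) or along the mirrored direction (reflection), which rasterization has to approximate with tricks.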
That brings us to the crux of the matter - DirectX 11...
Still, ray tracing is too intensive for current systems. Sure, in the future we might have 8 cores or more, but they should be trying it with a game like Crysis. Also, ray tracing CANNOT be hardware accelerated, last time I checked. I seriously doubt that NVIDIA or even AMD would allow for it. Then again, AMD owns ATI, so they might find a way to incorporate it.
StarBound wrote:Vista... DX11 woot! System buster epic fail.
Of course it can be hardware accelerated, but it's still a while before we see it in games. Intel's Larrabee is supposed to have some capabilities, and SIGGRAPH had a distributed system doing ray tracing in real time (not 30 fps, but it's a start, and it probably cost a fortune). NVIDIA and AMD are striving to incorporate more programmability and general-purpose functionality, which brings them one step closer to ray tracing on board. Ray tracing is highly parallel, which makes it a prime candidate to make use of multi-core CPUs; we should be hitting 32 cores by 2010 (the Sandy Bridge microarchitecture, the successor to Nehalem). Only 2 years before we start arguing on PCF about whether highly clocked 16-core chips are better than 32 slower cores.
Computer arithmetic works fastest on powers of two (a left shift is faster than a MUL instruction, a right shift faster than a DIV). Nothing really prevents us from having 3-, 6-, 19- or any arbitrary number of cores except the designer's manufacturing facilities and the CPU architecture. E.g. Intel cannot do a 3-core CPU easily because their Core 2 cores are linked together in pairs; their quads are really just two duals stuck together in the same package. A native 7- or 8-core CPU might have really bad yields, making it impractical for today's fabs.
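The shift-versus-multiply point is easy to demonstrate: shifting by k bits is exactly multiplication or integer division by 2^k, which is part of why powers of two are so convenient in hardware (whether the shift is actually faster depends on the CPU):

```python
# A left shift by k multiplies by 2**k; a right shift by k is
# integer division by 2**k (for non-negative integers).
for x in (3, 6, 19, 24):
    assert x << 3 == x * 8      # LSH by 3 == multiply by 2^3
    assert x >> 1 == x // 2     # RSH by 1 == integer divide by 2
print("shift identities hold")
```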
The PS3 has 1 PPE core and 7 SPE cores; the full Cell has 1 PPE and 8 SPEs. Two more asymmetrical arrangements, depending on how you view them.
B0r0m1r wrote:1 noobish question: Why is it always 1, 2, 4, 8, 16, 32... but not, like, 20 or 24?
I know AMD has a 3-core CPU.
It's a quad core with one core disabled, so 4 - 1 = 3.
Damn, I wanted to say that, but let me give more detail. Unlike Intel, which has two Core 2 Duos slapped together, AMD has 4 independent cores, giving a higher chance for one of the cores to be faulty. Seeing this, AMD decided to sell those "faulty three-legged inbred CPUs" to the public and call it something cool like a 3-core system instead of a "crippled 4-core CPU".
Those images you posted, are they not the same ones used to illustrate the difference between DX9 and DX10, pre-Vista release? I will only believe such an increase in image quality when I see it at acceptable frame rates in games, cuz where else would the average Joe use it?
Let me correct you. The Intel quads are two Core 2 Duo dies stuck together in the same package. The Phenom is 4 cores on the same die, NOT four separate chips!
What happens is that, when sorting dies, Intel can bin a defective Core 2 dual die (one where a core is dead) as a single-core CPU and replace that die with a working one.
Because AMD's cores are all on the same die, if one or two cores are dead they have a choice of binning it as a 3- or 2-core part.
And there is nothing wrong with this approach; all chipmakers work like this, including GFX chip manufacturers.
I suggest you quit your IT support job and go spout your ignorance as an Incredible Connection employee.
One moment as I remove the holy broom of supreme knowledge from your behind. OMG, you are a know-it-all. I never said it was wrong, and so my wording was wrong; shoot me, but what you said is what I meant.
And don't go and insult people and tell them to quit their jobs and go work for IC when you have no idea whatsoever who or what he is. That, my friend, is a sign of ignorance...