Mobile Ray Tracing By Intel


bnolsen

Real-time ray tracing for mobile may be coming soon... and it should have been here already.

http://blogs.intel.com/research/2008/02/re...g_in_your_p.php

Ray tracing is the future: the execution units are simpler, and there is no more "hack on hack on hack" of rasterization techniques. Less hardware to do far more. And Intel has nothing to lose by pushing this technology; ATI & nvidia have everything to lose.

At work we have a high-performance ray caster which scales almost linearly with CPUs. Our basic implementation is about 96% efficient with 8 cores. It's not a traditional center-perspective system: each and every input scanline has a unique position/attitude fix.
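
For the curious, the near-linear scaling falls out of the fact that every scanline (or pixel) is independent, so disjoint row ranges can be handed to worker threads with no locking at all. A minimal sketch of the idea (illustrative only, not our actual code; trace_scanline is a made-up stand-in):

CODE
#include <cstddef>
#include <cstdint>
#include <thread>
#include <vector>

// Stand-in for the real per-scanline trace. In the system described above,
// each scanline carries its own position/attitude fix, so the row index is
// all a worker needs to reconstruct its rays. Here we just fill a pattern.
static void trace_scanline(int y, uint32_t* row, int width)
{
    for (int x = 0; x < width; ++x)
        row[x] = static_cast<uint32_t>((x ^ y) & 0xFF) * 0x010101u;
}

// Rows are interleaved across threads; no two rays write the same pixel,
// so there is no synchronization anywhere in the hot loop. That is where
// the near-linear scaling on 8 cores comes from.
static void render_parallel(uint32_t* fb, int width, int height, int threads)
{
    std::vector<std::thread> workers;
    for (int t = 0; t < threads; ++t)
        workers.emplace_back([=] {
            for (int y = t; y < height; y += threads)
                trace_scanline(y, fb + static_cast<std::size_t>(y) * width, width);
        });
    for (auto& w : workers)
        w.join();
}

int main()
{
    std::vector<uint32_t> fb(800 * 480);
    render_parallel(fb.data(), 800, 480, 8);  // the 8-core case above
    return 0;
}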

The Pandora is probably safe for a couple of years before this stuff takes off. At that point, though, rasterization will be pretty much "that old stuff".
 
Cool, that means Pandora2 (squared) will have something like this in it.
 
I've never seen a ray-tracing engine with normal-map support. Everything looks blocky, especially in Q4RT. Will game devs have to use hi-res models?
 
bnolsen said:
The Pandora is probably safe for a couple of years before this stuff takes off. At that point, though, rasterization will be pretty much "that old stuff".
That's a bet... The professional game devs I have talked to about this told me it wouldn't happen for many years.
 
Hmm, "new" technology straight from 1970?

I am just a little bit skeptical. (Are you aware that Wolfenstein 3D and Descent were raycast? AND that they ran at 320x200 pixels?)

QUOTE
How is this possible, you might ask?

It’s because Ray-Tracing draws a scene in 3D by tracing rays of light from the pixels on the screen, to the surfaces of objects in view. And in the case of a UMPC, when one is viewing 3D space from the viewable area of a 4.5” LCD screen, fewer rays are required, and hence, the CPU requirements are substantially less. For example, you might prefer viewing a high definition (1280x720 resolution) display on your PC, but with the much smaller viewable area on a Sony* VAIO* UX Micro PC, smaller resolutions may be quite acceptable (such as 480x272, for example). Using this lower resolution, it would only require 8% of the CPU requirements that had been needed to render in high definition. To put this into perspective, a 480x272 screen is two and a half times the resolution of the Nintendo* DS (per display, at 256x192).


I notice that we have an 800x480 pixel screen, therefore close to 3 TIMES as many pixels as their example (2.94 times as many pixels: 384,000 vs 130,560).
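
The arithmetic is easy to check: with one primary ray per pixel, cost scales directly with pixel count. (By that math their own example is generous, too: 130,560 / 921,600 is about 14%, not 8%.)

CODE
#include <cstdio>

int main()
{
    const double hd      = 1280.0 * 720;  // 921,600 pixels
    const double umpc    =  480.0 * 272;  // 130,560 pixels
    const double pandora =  800.0 * 480;  // 384,000 pixels

    // One primary ray per pixel: cost ratios are just pixel ratios.
    std::printf("UMPC vs HD:      %.1f%%\n", 100.0 * umpc / hd);  // ~14.2%
    std::printf("Pandora vs UMPC: %.2fx\n",  pandora / umpc);     // ~2.94x
    return 0;
}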

In theory Ray-Tracing could do all this, but we are definitely not at that stage yet, and for sure not with handhelds.

I take issue with this statement as well:
QUOTE
Moore’s Law working in our favor

Moore’s Law works in favor of Ray-Tracing, because it assures us that computers will get faster - much faster - while monitor resolutions will grow at a much slower pace. As computational capabilities outgrow computational requirements, the quality of rendering Ray-Tracing in real time will improve, and developers will have an opportunity to do more than ever before.


Displays are moving to 2560x1600 (WQXGA), and mobile displays are moving to higher resolutions too: we have 800x480, as does the Eee PC, and most UMPCs get 1024x768. His argument is seriously flawed; the handheld market is increasing display resolution faster than the regular market is.

I think he may be right, but for toys, not serious mobile computing or gaming options (and the Pandora is the most serious option I know of).


If you check out the tile-based rendering in the Pandora's PowerVR SGX, you will see that it is a much more up-to-date rendering scheme that is really good at what it does (think a 5th-gen chip versus the Dreamcast's 3rd-gen chip; yes, we already have 3D performance, thank you).

http://www.beyond3d.com/content/articles/38/

http://guru3d.com/review/guillemot/3dproph...00/index2.shtml

http://csdl2.computer.org/persagen/DLAbsTo...SD.2004.1333306

http://www.google.com/search?hl=en&q=t...based+rendering

This seems like a way to push Intel processors on people by promising them "leet graphics", when in truth they have seen the light and licensed the PowerVR SGX for all of their "non-toy" mobile platforms.
 
nubie said:
This seems like a way to push Intel processors on people by promising them "leet graphics", when in truth they have seen the light and licensed the PowerVR SGX for all of their "non-toy" mobile platforms.
Yeah, and what's even funnier is that Intel is using the SGX for its upcoming (1-2 years) SoC targeting UMPCs ;)
 
The real problem facing mobile graphics isn't processing power; it is memory latency, bandwidth, and size, and that is exactly what tile rendering attacks: it "bins" the rendering into manageable chunks whose memory accesses and usage can be optimized.
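
For anyone unfamiliar with the term, "binning" means sorting the scene's triangles into small screen-space tiles up front, so each tile can then be shaded entirely in on-chip memory and written out exactly once. A rough sketch of the first pass (illustrative structures only, not the actual SGX pipeline):

CODE
#include <algorithm>
#include <vector>

// A triangle's screen-space footprint; real bins hold shading state too.
struct Tri { float x[3], y[3]; };

constexpr int TILE = 32;  // tile sizes on chips of this class are small

// Pass 1: append each triangle to every tile its bounding box touches.
std::vector<std::vector<const Tri*>>
bin_triangles(const std::vector<Tri>& tris, int w, int h)
{
    const int tx = (w + TILE - 1) / TILE;
    const int ty = (h + TILE - 1) / TILE;
    std::vector<std::vector<const Tri*>> bins(tx * ty);
    for (const Tri& t : tris) {
        const int x0 = std::max(0,      int(*std::min_element(t.x, t.x + 3)) / TILE);
        const int x1 = std::min(tx - 1, int(*std::max_element(t.x, t.x + 3)) / TILE);
        const int y0 = std::max(0,      int(*std::min_element(t.y, t.y + 3)) / TILE);
        const int y1 = std::min(ty - 1, int(*std::max_element(t.y, t.y + 3)) / TILE);
        for (int j = y0; j <= y1; ++j)
            for (int i = x0; i <= x1; ++i)
                bins[j * tx + i].push_back(&t);
    }
    return bins;
}
// Pass 2 (not shown): rasterize each bin into a TILE x TILE on-chip buffer,
// then write the finished tile to external memory once. External traffic is
// one bin-list read plus one tile write instead of scattered framebuffer hits.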

Not that I am knocking the tech, just its application to the current hand-held market. With built-in LED DLPs coming to hand-helds, won't it be too late by the time the tech hits?

I suppose it will have a benefit for ultra-super-micro mobile (i.e. the $14 Motorola at Walmart will have this for its 3D menu icons and its Tetris/Bubble Bobble knock-off on its 200x200 screen), but I don't think the Pandora crowd is queuing up for this kind of tech, or will be in 2 years.
 
No reason why the ray tracing couldn't be done at 400x240 and run through an AA filter after the fact.
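
The after-the-fact filter could be as simple as tracing at half resolution and averaging neighbouring samples while upscaling 2x to the 800x480 panel. A crude sketch of that idea (grayscale for brevity; a real filter would work on RGB):

CODE
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Trace at 400x240, then 2x-upscale while blending adjacent samples:
// a cheap stand-in for "an AA filter after the fact".
std::vector<uint8_t> upscale2x(const std::vector<uint8_t>& src, int w, int h)
{
    std::vector<uint8_t> dst(std::size_t(2 * w) * std::size_t(2 * h));
    for (int y = 0; y < 2 * h; ++y)
        for (int x = 0; x < 2 * w; ++x) {
            const int sx = x / 2, sy = y / 2;             // nearest source pixel
            const int nx = std::min(sx + (x & 1), w - 1); // neighbours to blend
            const int ny = std::min(sy + (y & 1), h - 1);
            const int sum = src[sy * w + sx] + src[sy * w + nx]
                          + src[ny * w + sx] + src[ny * w + nx];
            dst[std::size_t(y) * (2 * w) + x] = uint8_t(sum / 4);
        }
    return dst;
}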

Also, the 8-CPU system Intel ran used general-purpose CPUs, not hardware designed and optimized for ray tracing. The article's Moore's-law argument was kind of stupid: real-time ray tracing today can benefit hugely (by orders of magnitude) from better algorithms and from hardware built around how ray tracing runs.

Could the Pandora DSP unit be abused to run ray tracing?

I've been following real-time ray tracing for a while. A university in Germany worked with programmable FPGAs and was getting amazing efficiency per clock and per transistor compared with nvidia/ATI parts.

Ray tracing by itself doesn't need a vastly fast bus or vastly fast memory, since fill rate isn't as important with this method.

The big deal is that Intel is throwing huge amounts of money at it today. With ray tracing being a true physical model, I would bet hard money that they'll be able to match current high-end rasterized parts while spending a tiny fraction of the R&D.
 
At least this is something that can be done with the additional cores that'll continue to pile up in CPUs. Technologies like this that are highly scalable per CPU without linearly increasing memory requirements are going to be more and more valuable.
 
bnolsen said:
Could the Pandora DSP unit be abused to run ray tracing?

Given it has no floating-point, I don't think you would get acceptable results.
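
In principle it could be done in fixed point, since ray intersection is mostly multiplies and adds, but you would be fighting range and precision the whole way. Purely for illustration, here is what a 16.16 fixed-point ray/plane test looks like (a sketch, not anything tuned for the actual DSP):

CODE
#include <cstdint>

// 16.16 fixed point: integer-only math, as a DSP with no FPU would need.
typedef int32_t fx;
static fx fx_mul(fx a, fx b) { return fx(((int64_t)a * b) >> 16); }

struct Vec3 { fx x, y, z; };
static fx dot(Vec3 a, Vec3 b)
{
    return fx_mul(a.x, b.x) + fx_mul(a.y, b.y) + fx_mul(a.z, b.z);
}

// Ray/plane hit for a plane through the origin with normal n:
//   t = -dot(n, origin) / dot(n, dir)
// The 64-bit widening before the divide is the unavoidable chore; squared
// terms (as in ray/sphere tests) overflow 16.16 very easily, which is a big
// part of why fixed-point ray tracing is so painful.
static bool hit_plane(Vec3 o, Vec3 d, Vec3 n, fx* t_out)
{
    const fx denom = dot(n, d);
    if (denom == 0) return false;                  // ray parallel to the plane
    const int64_t t = (int64_t)(-dot(n, o)) * 65536 / denom;
    if (t <= 0 || t > INT32_MAX) return false;     // behind the ray, or overflow
    *t_out = fx(t);
    return true;
}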

QUOTE
I've been following real time ray tracing for a while. A university in germany worked with programmable FPGAs and were getting amazing efficiencies comparing clock rate & transistor count with nvidia/ati parts.

Ray tracing by itself doesn't need a vastly fast bus or vastly fast memory since fill rate isn't so important with this method.

The big deal is that intel is throwing huge amounts of money at it today. With ray tracing being a true physical model I would bet hard money that they'll be able to match current high end rasterized parts with spending a tiny fraction of the r&d.
Ray tracing needs large amounts of RAM to hold pre-processed data in order to run efficiently. And I am not sure ray-traced images always look better than good rasterized images, unless you start adding stuff that doesn't fit nicely in RT, such as global illumination.
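
To put rough numbers on that: the pre-processed data is typically a kd-tree or BVH over the triangles, and even a compact node layout costs tens of bytes per primitive. A back-of-the-envelope sketch (the layout is illustrative, not any particular implementation):

CODE
#include <cstdint>
#include <cstdio>

// A fairly compact BVH node: bounding boxes for both children plus child
// links. Many real layouts land in this ballpark (tens of bytes per node).
struct BvhNode {
    float    bounds[2][6];  // min/max xyz for each of the two children
    uint32_t child[2];      // child node index, or leaf triangle range
};

int main()
{
    const long long tris  = 1000000;      // a modest game-sized scene
    const long long nodes = tris;         // roughly one node per 1-2 triangles
    const long long bvh   = nodes * (long long)sizeof(BvhNode);
    const long long geom  = tris * 3 * 3 * (long long)sizeof(float); // 3 verts/tri

    std::printf("node size: %u bytes\n", (unsigned)sizeof(BvhNode)); // 56
    std::printf("BVH:       %lld MB\n", bvh  >> 20);                 // ~53 MB
    std::printf("geometry:  %lld MB\n", geom >> 20);                 // ~34 MB
    return 0;
}

Call it 80+ MB before textures for a million triangles; on a handheld that is a serious chunk of the RAM you have in total.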

Also note that Intel just bought a studio that has been developing a rasterization-based engine for years. I am ready to bet they did that because they learned the hard way that they would not move game studios from rasterization to ray tracing for many years.

Don't get me wrong, I am all in favour of RT; I can at least understand it and its algorithms. Its time has not come yet :)
 
Does it look like the PowerVR architecture is already using hybrid Ray-Tracing?

http://www.beyond3d.com/content/articles/38/3

QUOTE
The Hidden Surface Algorithm is based on ray-tracing principles. It works like this: PowerVR sends a ray into the 3D scene for each pixel it has to render, and this ray intersects several polygons in the scene. Now, using advanced maths, the PowerVR can determine how far along the ray each intersection lies, starting from the origin. So we have a ray that starts at the origin (the eye of the observer); this ray goes through the image plane, more precisely through the pixel of the image plane that is being rendered. The ray then continues into the 3D scene, where it will probably intersect several objects. For each intersection, the mathematics figures out how far the ray has travelled from the start point. This means that we get some kind of depth/Z-value, but along the ray, not along some fictional Z-axis. So for each intersection we know how far it is from the start point. It should be clear that the intersection with the smallest value is closest to the observer and thus will determine the color of the pixel.
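
Stripped of the marketing language, the loop being described is just "keep the nearest hit along the pixel's ray". Something like this (illustrative only; intersect_distance is a hypothetical stand-in, and the real hardware does this per tile):

CODE
#include <limits>
#include <vector>

struct Poly;  // whatever per-polygon data the tile holds

// Hypothetical helper: distance along pixel (px, py)'s ray to polygon p,
// or a negative value if the ray misses it entirely.
float intersect_distance(const Poly& p, int px, int py);

const Poly* nearest_surface(const std::vector<const Poly*>& polys, int px, int py)
{
    const Poly* best   = nullptr;
    float       best_t = std::numeric_limits<float>::max();
    for (const Poly* p : polys) {
        const float t = intersect_distance(*p, px, py); // depth along the ray
        if (t >= 0.0f && t < best_t) {                  // nearest hit so far
            best_t = t;
            best   = p;
        }
    }
    return best;  // this polygon's colour decides the pixel
}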


It seems to me that the PowerVR is just a way to batchify the pixels into easily streamable chunks so that the processor can eat it in pieces.

The Ray-tracing algorithm is apparent. Does this make the PowerVR a ray tracing graphics processor already? Hmm.
 
nubie said:
Does this make the PowerVR a ray tracing graphics processor already?
Generally it doesn't naturally model a lot of things ray tracing would, such as shadows, reflection, and curved geometric surfaces (yes, I know there are techniques to do all of these things with polygons, and even hardware acceleration to assist them, but it's not the same as having them modeled naturally as a side effect of the input data rather than more explicitly).

However, one thing that this allows it to do, which ray tracing also can but conventional polygon rasterization cannot, is order-independent translucency. This particular feature is a serious headache for Dreamcast emulation on normal PC graphics cards.
 
Exophase said:
However, one thing that this allows it to do, which ray tracing also can but conventional polygon rasterization cannot, is order-independent translucency. This particular feature is a serious headache for Dreamcast emulation on normal PC graphics cards.
What is this? I tried searching for info on it, but all I could find was a demo that required DirectX 10 to run -_-
 
nubie said:
Does it look like the PowerVR architecture is already using hybrid Ray-Tracing?

No, but suffice it to say that rasterization and tracing of primary rays are basically equivalent algorithms anyway; only their performance characteristics (and thus hardware requirements) differ. Visibility testing is rarely the bottleneck, so the exact algorithm used here doesn't make a difference to applications.

Snu said:
What is this? I tried searching for info on it, but all I could find was a demo that required DirectX 10 to run -_-

Usually, with rasterization, applications have to sort transparent objects by depth and render them back-to-front to get a correctly blended result. This object sorting may be expensive, and complex geometry might have to be split into several parts.

The Dreamcast graphics chip was capable of performing this depth sorting automatically, per pixel, so you would get the correct result without the application having to do anything. This feature is not supported by the PowerVR SGX, though.
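
In other words, on ordinary hardware the application runs something like this every frame before its transparent pass (a minimal sketch; draw() and the depth values stand in for whatever your engine uses):

CODE
#include <algorithm>
#include <vector>

struct Transparent {
    float depth;  // distance from the camera, recomputed every frame
    // ... mesh, textures, blend state ...
};

static void draw(const Transparent&) { /* blend into the framebuffer */ }

// Sort far-to-near so each object blends over everything already behind it.
static void draw_transparent_pass(std::vector<Transparent>& objs)
{
    std::sort(objs.begin(), objs.end(),
              [](const Transparent& a, const Transparent& b) {
                  return a.depth > b.depth;
              });
    for (const Transparent& o : objs)
        draw(o);
}
// The sort breaks down when transparent surfaces interpenetrate: then the
// geometry itself must be split, which is exactly what the Dreamcast's
// per-pixel sorting avoided.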
 
Take a look at what can be done: http://ioquake3.org/2008/03/04/raytracing-ioquake3/
It looks like the game runs at a rather low res (512x384) @ ~15 FPS on a 3800+ X2.

BTW, here is what the author said on the Icculus Q3 mailing list:

QUOTE
But I have to be honest: I don't think that raytracing will replace rasterization anytime soon. As long as rasterization can deliver the great illusions we see in recent AAA games, there's no need for accurate simulations of light transport. Faking reflections/refractions/etc. will be fine for the average gamer. It's an interesting field to do research in though. :)
 
I was looking at ray tracing before I posted those articles a while back and I came across:

OpenRT

Google has this description:
QUOTE
A new interface that is supposed to become the OpenGL of interactive ray tracing. Contains publications, gallery and some links.
 