PseudoGL


Pleng

Last night I had a thought.

Would it not be possible to create a pseudo OpenGL graphics driver that renders OpenGL graphics in software and then pumps the output to the standard graphics driver? Similar, in a way I guess, to how pseudo drivers are used to mimic CD-ROM drives from ISO files, and pseudo network drivers bridge network connections between VMware and the host OS.

Now, of course, essentially turning OpenGL into a software renderer would be painfully slow, and it probably wouldn't be much good for 3D graphics. But there are also 2D graphics libraries that rely on OpenGL rendering. These may well work with a software GL renderer. And perhaps, in time, if such a driver were made to work, it could later be offloaded to the DSP.

Any thoughts?
 
I think the wrapper-library approach is better .. you can still use OpenGL calls, but they're translated to GLES1.1/2.0 calls, accordingly. There isn't any real reason to do this at the 'driver' level, either.

But I thought there was hope on the horizon for a full OpenGL 2.0 driver for PowerVR SGX chipsets, somehow? Seems to me that it would be a lot nicer to have hardware-level support for OpenGL 2.0 from the vendor, or .. ?
 
Sure, a wrapper is nice, but do we have one that traps EVERY OpenGL call? Hmm, I suppose even then it would be easier to beef up the current wrappers than to implement a whole software solution from scratch.

Where did you hear about a full OpenGL driver? That would be tits.
 
I read about it first here:

http://en.wikipedia.org/wiki/PowerVR

Then I saw this thread, which I possibly totally misunderstood (and apologize for in advance if that is the case):

http://www.gp32x.de/board/index.php?/topic/57766-powervr-sgx-support-for-opengl-2-0/
 
Hm, isn't MesaGL what you are looking for? I could be mistaken though: http://www.mesa3d.org/
 
Hmm, yes, it looks like MesaGL would do it. If we got MesaGL included in the base OS, does this not mean we would be able to run OpenGL apps, albeit rendered in software?
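
If Mesa were built with its off-screen interface (OSMesa), the original idea of rendering GL in software and then handing the pixels to the normal display path would look roughly like this. Just a minimal sketch, assuming an OSMesa-enabled Mesa build and the Pandora's 800x480 screen; how the finished buffer actually gets blitted out is left out:

Code:
/* Software OpenGL rendering into a plain memory buffer via OSMesa. */
#include <stdlib.h>
#include <GL/osmesa.h>
#include <GL/gl.h>

int main(void)
{
    const int width = 800, height = 480;   /* Pandora LCD size */

    /* A pure software GL context: no GPU or vendor driver involved. */
    OSMesaContext ctx = OSMesaCreateContext(OSMESA_RGBA, NULL);
    if (!ctx)
        return 1;

    /* All rendering lands in this ordinary malloc'd RGBA buffer. */
    void *buffer = malloc(width * height * 4);
    if (!buffer || !OSMesaMakeCurrent(ctx, buffer, GL_UNSIGNED_BYTE, width, height))
        return 1;

    /* Normal OpenGL calls from here on. */
    glClearColor(0.0f, 0.0f, 0.3f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();

    /* 'buffer' now holds the finished frame; this is the point where it
       would be pumped to the standard graphics driver / framebuffer. */

    OSMesaDestroyContext(ctx);
    free(buffer);
    return 0;
}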

And torpor, reading that thread, I don't really see any indication of full GL drivers coming our way :(
 
I guess we'd have to ask TI to get the drivers licensed for us, and so on .. something for Craig/MWeston, perhaps? Though, like you, I don't have high hopes ..
 
I also think I read somewhere that full OpenGL would have quite a negative impact on battery life?
 
Pleng said:
I also think I read somewhere that full OpenGL would have quite a negative impact on battery life?

There are a number of these wrapper libraries around, all in various states of use.

This is pre-coffee for me, but for instance there's (I think) TinyGL, which is used for the Quake ports; given it wraps Quake 1/2/3 so well you'd think it would be useful for other apps, but in general... not :)

Likewise, there is a similar wrapper made by Adventus, but I'm not sure if any projects use it; it looks pretty promising. Again, not a full GL implementation obviously (GL is _huge_, which is why they made GLES), but it's certainly in the right direction. Not in active development, though.

Mesa is the big heavyweight in this sort of arena, able to run GL on GLES and vice versa and to use all sorts of drivers; it's a bit of a pig to build and isn't super fast, but it ought to work.

There are also 'live emulators' which take GL calls and do all sorts of magic to make them work, such as the one in the PVR SDK; not really usable for us, though.

In the end .. few of these types of things would make games playable, but you never know .. Q3 runs great, for instance.

<rant, get it off my chest>

For me, my personal annoyance is at how scattered the tools for just developing GLES are; i.e. really, you want to dev in a VM on your desktop or on your native desktop, but developing GLES on your desktop is basically a nightmare of random tools depending on your video card, your OS and drivers, the chipset you're targeting, etc. It'll be 'lots better' in months/years, since WebGL is basically GLES and so every desktop will need GLES support soon .. but who knows what the NVIDIAs and ATI/AMDs will do .. support old cards, or push for new-card adoption with drivers only for new cards, etc. So right now you use a mix of GLES emulators, hackneyed desktop GLES drivers, wrappers, or, like 99% of projects, you write multiple render paths .. GL + GLES, or even GL + GLES1 + GLES2 + DX, etc. It's painful :) (Not to mention whether you use SDL, Qt, roll your own UI, WebKit, etc. .. getting GL and GLES to work with those is all pain..) ... Being a developer doing GLES _right now_ stinks :)

</rant>

jeff
 
Is the issue with the full GL driver from TI/Imagination Technologies the licensing cost? If so, how much are we talking about here? Is this something we could set up a bounty for?
 
skeezix said:
For me, my personal annoyance is at how scattered the tools for just developing GLES are ... Being a developer doing GLES _right now_ stinks :)

Doesn't this apply to most aspects of Linux development?
 
skeezix said:
For me, my personal annoyance is at how scattered the tools for just developing GLES are ... Being a developer doing GLES _right now_ stinks :)

I seem to remember the plan is to support ES on the desktop when OpenGL 4.0 is released.
 
My experience is that software OpenGL implementations are not really optimized for performance, so they work out even poorer than you'd expect them to. Using a fallback like this for 2D would probably not work out very well, especially if the reason they chose OpenGL for 2D is because they're doing a lot of affine transformations and blending.

The wrapper approach should work out pretty well. It'd be great if Adventus is around more and could offer some assistance to the people looking for it. For things that aren't using shaders at all, an OpenGL 1.x to ES 1.1 wrapper would make a lot of sense too, and would be much more direct.
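
To make the GL 1.x to ES 1.1 wrapper idea concrete: the core trick is buffering the immediate-mode calls (glBegin/glVertex/glEnd, which ES doesn't have) and replaying them as a vertex array, which ES 1.1 does have. A minimal sketch only; the wrap* names and the fixed-size buffer are made up for illustration, and a real wrapper also has to handle things ES lacks, such as GL_QUADS:

Code:
/* Sketch of an immediate-mode shim on top of OpenGL ES 1.1 vertex arrays. */
#include <GLES/gl.h>

#define WRAP_MAX_VERTS 4096

static GLfloat wrap_verts[WRAP_MAX_VERTS * 3];
static GLsizei wrap_count;
static GLenum  wrap_mode;

/* Stands in for glBegin(): remember the primitive type, reset the buffer. */
void wrapBegin(GLenum mode)
{
    wrap_mode  = mode;      /* GL_TRIANGLES, GL_LINES, ... (no GL_QUADS in ES) */
    wrap_count = 0;
}

/* Stands in for glVertex3f(): append one vertex to the buffer. */
void wrapVertex3f(GLfloat x, GLfloat y, GLfloat z)
{
    if (wrap_count < WRAP_MAX_VERTS) {
        wrap_verts[wrap_count * 3 + 0] = x;
        wrap_verts[wrap_count * 3 + 1] = y;
        wrap_verts[wrap_count * 3 + 2] = z;
        wrap_count++;
    }
}

/* Stands in for glEnd(): replay the buffered vertices as an ES 1.1 vertex array. */
void wrapEnd(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, wrap_verts);
    glDrawArrays(wrap_mode, 0, wrap_count);
    glDisableClientState(GL_VERTEX_ARRAY);
}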
 
skeezix said:
....
For me, my personal annoyance is at how scattered the tools for just developing GLES are; i.e. really, you want to dev in a VM on your desktop or on your native desktop, but developing GLES on your desktop is basically a nightmare of random tools depending on your video card, your OS and drivers, the chipset you're targeting, etc. .....

No, it is not. You just download an emulator library that runs on top of OpenGL and you are basically done.
 
This is why I say, compile-onboard! :) Cross-compiling and testing in a foreign environment is one thing, but having direct access to the target environment from your code/test/run cycle is another thing entirely ..
 
Yup, and squinting at those tiny digits on that tiny screen and getting thumb-ache from those rubber keys is awesome too :p
 
Ermm, I have my Pandora connected by USB networking to my main Linux machine, where I mount the Pandora disk using sshfs, so I can edit files on the Pandora filesystem with my 24" screen, normal keyboard and mouse, then press a single button on the Pandora to build the project and test it .. no worries there, mate! :) It works bloody brilliantly; I can even type ":make" in my vim windows and get compiler messages back over the ssh connection, smooth as butter. I don't even notice that the Pandora is doing the actual compiling ..
 
That sounds like it'd be more difficult than I had setting up my cross compiling environment :p
 
<scattered reply, brain is overloaded and fried and in a conf call, but replying anyway, sorry! :>

torpor -- some of us want our apps to work on multiple devices .. including desktops :) But yeah, I've found in practice it's just easier to either cross-compile and run directly on the panda, or just build on the panda for the GLES render path. For my current game I'm trying to stick to a tiny, tiny subset of GL and GLES so that the same render path more or less works for both, since it just makes life easier and I can concentrate on the 'business logic' and worry about making it prettier on both render paths later .. but such a PITA :) For my part, I keep a full distro with every tool under the sun on an SD card so I can do crazy stuff there without disk space issues, but I boot from NAND most of the time for compatibility testing, since you want stuff to work on standard Pandoras, etc. Same process I've been doing the whole time really, but a nice fast SD card is essential :)

warmi -- and yes, it is; 'just get a wrapper lib' is not so easy, depending on your target devices, language of choice, feature set needed, target chipset, chipset you're building on, you name it. PVR's GLES emulator for GL systems is pretty feisty, but it's only one. Consider stuff like recent NV drivers sometimes supporting GLES on the desktop, but I think last I checked my chipsets weren't supported; certainly some of the GLES systems I want to target don't have any useful GLES emulator/wrapper. Stuff like doing it with Python and trying to use pyGL, for example, etc. You have layers upon layers upon layers of fast-moving stuff, so it's a right PITA .. there's not much current, accurate documentation, since everything is a work in progress. It's not to say it's always like this for everyone, but for many combinations of source dev system and target device it's a total mess :) Just sayin'.

jeff
 
WizardStan said:
That sounds like it'd be more difficult than I had setting up my cross compiling environment :p


Silly! Of course it isn't: USB-Net is easy to set up out of the box, mount -t sshfs ip_of_pandora ./pandora_mount_point, etc ..

Geeze. Do I have to write a tutorial? ;)
 