Qt 4.x as a game engine


Talking about this stuff motivates me to actually try to finish something for once.
I know what you mean. I code loads of small prototypes and proof-of-concepts and game mechanic tests, but actually making a "finished" game is so much more work, and then you have to do all the boring stuff too :)

I seriously recommend not bothering with all those questions. Just write a couple of Qt-ified classes and gather some artwork. When you actually hit a performance wall, we can all have some fun trying to fix it or annoy some Nokia people to write us a better framework :p


If you take a look at the original Quake 3 sources from id, you wouldn't believe what a crappy C++ design style they used. You can actually see what they bolted onto the engine at the last minute. There are maybe 3 functions that are true genius. And still it's one of the top games ever.


I've wasted a lot of time on exercises in over-engineering, only to give up and start over.


Take a look at the giant list of non-working, abandoned Linux games. Beautiful code, but no game.
I don't mean we should optimize early or over-engineer. I mean that we should try to compare different approaches, so we can avoid the same pitfalls. There's more than one way to do stuff, and more often than not the chosen method has a flaw that forces you to rewrite big portions of the game to fix it. I'm a pretty hands-on kind of guy, an experimentalist if you will, though I can also appreciate a good architecture and well-thought-out abstractions. I just don't want to waste time on bad solutions that have hidden pitfalls.


I don't see an obvious good solution to updating the game world. QTimer gives no guarantees whatsoever on update frequency, which is why I've usually used a QTime to get a delta time to use with updates. QTimer silently discards timeout events if the system can't keep up with them (link). This leads to jerky updates, possible tunneling and nondeterministic updates.
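
For reference, this is roughly what I mean by the QTimer + QTime combination (a minimal sketch; GameWidget and updateWorld() are made-up names, not code from an actual project):

Code:
#include <QWidget>
#include <QTimer>
#include <QTime>

class GameWidget : public QWidget {
    Q_OBJECT
public:
    GameWidget(QWidget *parent = 0) : QWidget(parent) {
        elapsed.start();
        connect(&timer, SIGNAL(timeout()), this, SLOT(tick()));
        timer.start(16); // aim for ~60 updates/s; no guarantees, as noted above
    }
private slots:
    void tick() {
        // restart() returns the milliseconds since the last restart,
        // so discarded timeouts just show up as a bigger delta.
        int deltaMs = elapsed.restart();
        updateWorld(deltaMs / 1000.0f);
        update(); // schedule a repaint
    }
private:
    void updateWorld(float dt) { /* move things by dt */ }
    QTimer timer;
    QTime elapsed;
};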


Probably the "Qt-est" way of doing the updates would be to have the objects update themselves using the "advance" method. The "advance" methods of QGraphicsItems are called when the parent QGraphicsScene's "advance" method is invoked (link). However, the "advance" method does not give any idea on how long has passed to the QGraphicsItems, so the items need a way to keep track of time themselves. The two obvious solution would be to use a QTime per item to keep track of time, but then you're even worse off than with a single QTime in the QGraphicsScene, because now you get not only different update times per frame, but different update times per item.


One way is to use a QTimeLine per game item to control the animations. This gives nice bonuses like ease curves for animations and the like. However, QTimeLines are much better suited to GUI animations than game world updates. They give no actual benefits over a QTimer/QTime and don't provide a time delta.


A separate game loop gives good, stable updates. You can still use time deltas and do all you can to get constant intervals between updates. I have not, however, seen a good "Qt" implementation of a game main loop. As has been pointed out, you still have to run the Qt main loop.


The above is mostly rambling and I usually use the QTimer+QTime route in my projects, but it's not nearly optimal (in terms of end result, performance aside). I just want to raise some conversation to get opinions and learn new, maybe better, ways to do the common stuff. Game world updating is just one thing. Collisions, particle management (QLinkedList has a mutating iterator that lets you delete items from a QLinkedList while iterating over it, very handy for particles), network multiplayer (I recently coded a quick test of real-time networked multiplayer using Qt) and the like have different approaches with different pros and cons and different levels of "Qt-ness". We all make our solutions as we go along, but it might be beneficial to share those solutions to find the best ones for common problems, so everyone doesn't have to reinvent the wheel by going from a triangle to a square to a pentagon to a circle.
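
To illustrate the particle thing, here's roughly what I mean with the mutating iterator (Particle is just a made-up struct):

Code:
#include <QLinkedList>

struct Particle { float x, y, vx, vy, life; };

void updateParticles(QLinkedList<Particle> &particles, float dt)
{
    QMutableLinkedListIterator<Particle> it(particles);
    while (it.hasNext()) {
        Particle &p = it.next();
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.life -= dt;
        if (p.life <= 0.0f)
            it.remove();   // safe to remove while iterating
    }
}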

If you have released a Qt4-based game, you are either a KDE programmer or a Nokia employee, probably both.


Us mere mortals are just wetting our feet.
They're not that rare, are they? It's a very promising platform for games. I would guess there are Qt4 games released by others as well?


PS: This post is very much flow-of-thought. I just wanted to get some of my reasoning out in the open. Apologies for incoherent rambling.
 
QTimer gives no guarantees whatsoever on update frequency, which is why I've usually used a QTime to get a delta time to use with updates. QTimer silently discards timeout events if the system can't keep up with them (link). This leads to jerky updates, possible tunneling and nondeterministic updates.
Won't you have tunneling on any frame where your time delta is really high? In that case, I would just put a maximum on the time delta. If the game can't get enough CPU, it slows down a little and misses an update, but it doesn't affect the game's perception of time. If the game has more than enough CPU, it becomes smoother by using a smaller delta.
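
Something like this, in whatever slot does the per-frame update (a sketch; the names and the 50 ms cap are arbitrary):

Code:
#include <QtGlobal>
#include <QTime>

// 'elapsed' would be a member QTime in a real game class; names are made up.
static QTime elapsed;
static const int MAX_DELTA_MS = 50;   // arbitrary cap, roughly 3 frames at 60 FPS

void tick()
{
    // Clamp the delta so one starved frame can't teleport objects through walls;
    // the game just slows down instead.
    int deltaMs = qMin(elapsed.restart(), MAX_DELTA_MS);
    float dt = deltaMs / 1000.0f;
    // updateWorld(dt);
    (void)dt;
}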


I don't think dropping a frame is an issue, but here's a thing I thought of that might work if you really care deeply about this:


I considered a different approach with a single-shot timer. Before starting a frame, the game records the current time. Once the frame is done, the game records the time again and determines how long it took, like in an FPS counter. It starts a single-shot timer (probably a member, don't recreate the QTimer each frame) with (idealTimePerFrame - actualTimeOfLastFrame) interval, with a minimum of 0. According to http://doc.qt.nokia.com/4.6/qtimer.html#interval-prop, a QTimer with an interval of 0 will still process any events before calling timeout(), so even if the game becomes CPU-bound, it won't freeze Qt's event processing, and if a frame takes 21 milliseconds to process instead of 20, the update will be called immediately instead of waiting 19 milliseconds for the next timeout().


The timer could probably be a continuous one, but since the interval is timed from the end of one frame to the start of the next, it will have setInterval() and start() called regularly, so a continuous one would be restarted every frame anyway. It would just as well be single-shot.


You could actually just call QTimer::singleShot, which is static, but that would probably create and delete a QTimer every frame. It might not be a disaster but it would be unnecessary.


It seems like QBasicTimer would also work for this. According to http://doc.qt.nokia.com/4.6/qbasictimer.html#start, each call to QBasicTimer::start will change the interval and reset the timer.
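
Putting the above together, a sketch of what that pacing could look like (class and method names are invented, and the 20 ms budget is just an example):

Code:
#include <QObject>
#include <QTimer>
#include <QTime>

class FramePacer : public QObject {
    Q_OBJECT
public:
    FramePacer(QObject *parent = 0) : QObject(parent), idealMs(20) {
        timer.setSingleShot(true);
        connect(&timer, SIGNAL(timeout()), this, SLOT(frame()));
        timer.start(0); // kick things off
    }
private slots:
    void frame() {
        QTime frameClock;
        frameClock.start();          // record the time before the frame

        // updateWorld();            // whatever the game does each frame
        // render();

        int remaining = idealMs - frameClock.elapsed();
        if (remaining < 0)
            remaining = 0;           // overran the budget: fire again as soon as pending events are processed
        timer.start(remaining);      // re-arm the single-shot timer with what's left of the budget
    }
private:
    QTimer timer;
    const int idealMs;
};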
 
I am working towards an 8-degree of freedom RTS


Currently all the fixed items have their own timer and very small animations.


A small random seed makes sure they don't all update at the same time.


You shouldn't notice the slow animation, because 20 objects running at 4 FPS can result in 80 dirty rectangles being invalidated over a one-second period, so there's a lot going on on the screen to distract your eyes. Without the seed they would all update on the same 4 FPS ticks, and you would easily notice when the animation is a couple of milliseconds off; also, 20 objects updating together at 4 FPS is boring to look at.


Well, I have one animated sprite for now, but this technique makes large groups of objects look like they run at much higher animation rates; you really have to focus on one item to see the slow updating.
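
Something along these lines; not my actual code, just the idea (names made up):

Code:
#include <QObject>
#include <QTimer>
#include <QtGlobal>

class FixedItemAnimator : public QObject {
    Q_OBJECT
public:
    FixedItemAnimator(QObject *parent = 0) : QObject(parent) {
        connect(&timer, SIGNAL(timeout()), this, SLOT(nextFrame()));
        // Random phase offset of up to one full animation period (250 ms at 4 FPS),
        // so the items don't all repaint in the same event loop pass.
        // (Call qsrand() once at startup if you want it to differ between runs.)
        QTimer::singleShot(qrand() % 250, this, SLOT(startAnimating()));
    }
private slots:
    void startAnimating() { timer.start(250); }   // 4 FPS from here on
    void nextFrame()      { /* advance the sprite frame and update() the item */ }
private:
    QTimer timer;
};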


I am using this logic:


80 dirty rectangles per second means fewer drawing calls than 4 frames of full rendering, so I should be good on performance.


Lots of overlapping animated objects are not recommended, though, as that would update the same rectangle too often.


I have no idea what FPS a QGraphicsView scrolls at; I am assuming as fast as possible.


That's where I am at.


For moving items, I just plan on letting the animation framework do the thinking for me; it seems to do smooth animations in every demo.


Sounds like a good plan for now.


EDITED for clarity
 
Won't you have tunneling on any frame where your time delta is really high? In that case, I would just put a maximum on the time delta. If the game can't get enough CPU, it slows down a little and misses an update, but it doesn't affect the game's perception of time. If the game has more than enough CPU, it becomes smoother by using a smaller delta.


I don't think dropping a frame is an issue, but here's a thing I thought of that might work if you really care deeply about this:
Well, yes, there is always the possibility of tunneling with deltas. The thing I care about is at least trying to keep a steady framerate. I'm not really that concerned about off-chance tunneling, but about general jerkiness. QTimers are really not accurate enough to provide a truly steady framerate, no matter how powerful the system is. They are good enough for most games, I'm not arguing that. I would, however, like to know if other people have other means with different pros and cons, so I can choose the best tool for a given job instead of always using just one.


Dflemster, how would you go about implementing your game loop + sleep version?


I'll probably continue using a QTimer/QTime+delta combination for most real-time projects and QTimeLines for event-based gameplay, if no better alternatives surface. As I said, I'm just curious :)

I considered a different approach with a single-shot timer. Before starting a frame, the game records the current time. Once the frame is done, the game records the time again and determines how long it took, like in an FPS counter. It starts a single-shot timer (probably a member, don't recreate the QTimer each frame) with (idealTimePerFrame - actualTimeOfLastFrame) interval, with a minimum of 0. According to http://doc.qt.nokia.com/4.6/qtimer.html#interval-prop, a QTimer with an interval of 0 will still process any events before calling timeout(), so even if the game becomes CPU-bound, it won't freeze Qt's event processing, and if a frame takes 21 milliseconds to process instead of 20, the update will be called immediately instead of waiting 19 milliseconds for the next timeout().


The timer could probably be a continuous one, but since the interval is timed from the end of one frame to the start of the next, it will have setInterval() and start() called regularly, so a continuous one would be restarted every frame anyway. It would just as well be single-shot.


You could actually just call QTimer::singleShot, which is static, but that would probably create and delete a QTimer every frame. It might not be a disaster but it would be unnecessary.


It seems like QBasicTimer would also work for this. According to http://doc.qt.nokia.com/4.6/qbasictimer.html#start, each call to QBasicTimer::start will change the interval and reset the timer.
This would probably be a better alternative. A little less chance of tunneling at least, because the delta would be smaller if a frame ran just over the ideal time. On the other hand, it would make the delta at most one frame length shorter, so it wouldn't help that much. On the jerkiness side, this solution would probably do better than a repeating QTimer if frames were dropped constantly, because it keeps the framerate as high as possible when frames are dropped instead of attempting to sync to the desired framerate.


All in all, the above solution seems a bit better than the one with just a normal repeating QTimer. I may have to look into this a bit more :)
 
If you're using continuous collision detection, you should never get tunneling. I know Box2D has this, and I use it for the bullets in my platformer, though I haven't ported it back to the player and enemy physics yet.
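
In case it helps, the Box2D side of that is basically just the bullet flag on the body definition (a sketch, with arbitrary sizes):

Code:
#include <Box2D/Box2D.h>

b2Body *createBullet(b2World &world, const b2Vec2 &pos, const b2Vec2 &velocity)
{
    b2BodyDef bodyDef;
    bodyDef.type = b2_dynamicBody;
    bodyDef.bullet = true;              // enable continuous collision detection against other dynamic bodies
    bodyDef.position = pos;
    bodyDef.linearVelocity = velocity;

    b2Body *body = world.CreateBody(&bodyDef);

    b2CircleShape shape;
    shape.m_radius = 0.05f;             // small, fast object
    body->CreateFixture(&shape, 1.0f);  // density 1
    return body;
}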
 
Dflemster, how would you go about implementing your game loop + sleep version?
My name is difficult to spell, isn't it! :D


Anyways, I've done a lot of research related to writing games in a reactive style (y'know, the model where game objects subscribe to reactive events and produce mapped updates - extremely efficient because there's no need to call huge amounts of update methods; you do minimal work for maximum profit) so it's been a long time since I actually used the normal imperative update-draw-repeat method.


But it's pretty basic (Read the POTENTIAL IMPROVEMENTS section for the really interesting stuff - I'm too lazy to do some serious coding ATM):



Code:
unsigned int currentTime();          //implementation omitted
void sleep(unsigned int time);       //implementation omitted

const float WANTED_FPS = 60.0f;
const unsigned int WANTED_DELTA = (unsigned int)(1000.0f / WANTED_FPS);

class UpdateContext {
public:
  UpdateContext(unsigned int d, bool p) : delta(d), paused(p) {}
  const unsigned int delta;
  const bool paused;
  //...
};

class RenderContext {
public:
  //Texture, TextureIdentifier and Foo are placeholders, like the free functions above.
  virtual Texture loadTexture(const TextureIdentifier &identifier) = 0;
  virtual void renderSprite(const Texture &texture, const Foo &bar) = 0;
  virtual void clearScreen() = 0;
  virtual void doWhatever() = 0;
  //...
};

//Multiple inheritance? Bad practice? Not for purely virtual classes...

class Updatable {
public:
  virtual void update(const UpdateContext &context) = 0;
};

class Renderable {
public:
  //The thing mustn't modify itself when rendering; enforce this here with const.
  virtual void render(RenderContext &context) const = 0;
};

int main(const int, const char **) {
  bool running = true;
  unsigned int lastTime = currentTime();

  setStuffUp();
  //I don't bother with std::vector or QList or whatever; just raw NULL-terminated arrays of pointers for this example
  Updatable *const *updatables = initUpdatables();
  const Renderable *const *renderables = initRenderables();
  //(There might be duplicates: something can be both renderable and updatable. But don't enforce that something HAS to be both!!)
  //You can use MetaClasses in Qt to check whether something is Renderable etc., so you can load a huge amount of objects from some unknown location and sort them into the arrays above, for example.

  while(running) {
    unsigned int newTime = currentTime();
    unsigned int delta = newTime - lastTime;
    lastTime = newTime;              //don't forget this, or the delta grows forever

    const UpdateContext updateContext(delta, false);
    for(int a = 0; updatables[a]; a++)
      updatables[a]->update(updateContext);

    RenderContext &renderContext = getRenderContext();
    for(int a = 0; renderables[a]; a++)
      renderables[a]->render(renderContext);

    //Only sleep for whatever is left of this frame's budget.
    unsigned int frameTime = currentTime() - newTime;
    if(frameTime < WANTED_DELTA)
      sleep(WANTED_DELTA - frameTime);
  }

  //POTENTIAL IMPROVEMENTS:
  //  - Run two while loops on different threads: one for rendering and one for updates.
  //  - Use different UPS and FPS values. Set the rendering thread to lower priority. There'll be no racing since Renderable uses a visitor pattern with a constant this pointer.
  //  - Add function pointer fields to UpdateContext like "void (*pause)(bool);" and "void (*shutdown)(bool);" that update the respective booleans (and eventually flip "running").
  //  - Add function pointer fields to UpdateContext like "void (*addUpdatable)(const Updatable *);" and "void (*addRenderable)(const Renderable *);"
  //  - Add other "ables" like "Inputable" and "Collidable" that follow the same pattern.
  //  - Don't use "able" as a suffix! Come up with better names.

  tearStuffDown();
  return 0;
}

[/sloppycodethatsnottestedatallandmighthavesyntaxerrorsforalliknow]
 
Thanks for answering, but more than the actual loop I wanted to know how you integrate it with Qt. Qt has its own main loop to run, so you can't stay in a loop forever or you freeze Qt's event system. Would you use a separate thread? Or would the sleep function allow Qt to run its own stuff? How would you time the return to the main loop?


Be that as it may, I still found the code you wrote interesting. It's way too seldom that I see design patterns used knowingly in code, which is sad.
 
Thanks for answering, but more than the actual loop I wanted to know how you integrate it with Qt. Qt has its own main loop to run, so you can't stay in a loop forever or you freeze Qt's event system. Would you use a separate thread? Or would the sleep function allow Qt to run its own stuff? How would you time the return to the main loop?


Be that as it may, I still found the code you wrote interesting. It's way too seldom that I see design patterns used knowingly in code, which is sad.
Using the pattern above, I would have an Updatable object whose update method only contained a call to "QCoreApplication::processEvents();". And then I'd just never call "QApplication::exec()". I like being in control of the whole program flow, and I wouldn't surrender it to Qt.
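
A minimal sketch of what that would look like, reusing the Updatable interface from my earlier post (the rest of the names are placeholders):

Code:
#include <QApplication>

class QtEventPump : public Updatable {
public:
    virtual void update(const UpdateContext &) {
        // Let Qt deliver its pending events once per game tick
        // instead of handing the whole main loop over to exec().
        QCoreApplication::processEvents();
    }
};

int main(int argc, char **argv)
{
    QApplication app(argc, argv);   // still needed for widgets, painting, etc.
    // ... build the updatables list so it contains a QtEventPump ...
    // ... then run the game loop from the earlier post; never call app.exec() ...
    return 0;
}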
 
Using the pattern above, I would have an Updatable object whose update method only contained a call to "QCoreApplication::processEvents();". And then I'd just never call "QApplication::exec()". I like being in control of the whole program flow, and I wouldn't surrender it to Qt.

I have to say, for some reason this has never crossed my mind :D . Thanks dflemstr, once again you gave me something to think about :)
 
Has anyone tried QtOpenGL by chance?


i.e., hellogl has an ES example.


I wonder if our Pandora Qt is compiled with GL support built in? (Or if not, how successful we might be in building our own Qt with GLES support built in, and then just including that Qt version in the pnd-file.)


(And by extension PyQt -> QtOpenGL would be neat, but at this number of layers of abstraction it just gets to be hell to port :)


jeff
 
Has anyone tried QtOpenGL by chance?


i.e., hellogl has an ES example.


I wonder if our Pandora Qt is compiled with GL support built in? (Or if not, how successful we might be in building our own Qt with GLES support built in, and then just including that Qt version in the pnd-file.)


(And by extension PyQt -> QtOpenGL would be neat, but at this number of layers of abstraction it just gets to be hell to port :)


jeff

The default libqt4-x11 package in Ångström is compiled with ES support. But then again, someone might have chosen a different Qt version when baking the image (that has happened a lot lately...), so yeah.


Also: Qt makes your GL code portable automatically. So if you follow the ES-2.0 spec perfectly in your code, you'll be able to run your unmodified code under GL-2.0, GL-3.0, GL-3.1, GL-4.0 etc as well (The Qt shader compiler includes "#define highp\n#define lowp\n#define mediump" etc for GL-2.0 compatibility in your shaders).
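
To illustrate (a sketch; the shader source and names are just an example):

Code:
#include <QGLShaderProgram>

// Write the shader with ES-style precision qualifiers and let
// QGLShaderProgram deal with the difference: on desktop GL the
// highp/mediump/lowp qualifiers are defined away, on ES they are real.
static const char *fragSrc =
    "varying mediump vec2 texCoord;\n"
    "uniform sampler2D tex;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(tex, texCoord);\n"
    "}\n";

bool buildProgram(QGLShaderProgram &program)
{
    if (!program.addShaderFromSourceCode(QGLShader::Fragment, fragSrc))
        return false;
    // ... add a matching vertex shader the same way ...
    return program.link();
}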
 
I need to write up a minimal-line Hello World experiment here, see if it works; don't suppose you know of a QtOpenGL experiment that produces useful performance metrics by chance?


In general, OpenGL ES seems to be a total mess unless you write it all from scratch in C++; doing it in Python or random other languages is all doable, but information is a scattered mess :/ I'm tempted to look at PyQt -> QtOpenGL but it might just be too tenuous :)


jeff
 
Damn! I started porting over some simple OpenGL code to QtOpenGL and it was going okay on the desktop and in my VM; mostly I just wanted to plop in QGLFormat::whatEverItWas() that returns the Enum of the features, so I could list them off on desktop and pandora and compare.


Went to look in the firmware libs, and we don't include QtOpenGL libs with QT4, so we're probably building without that. *darn!*


I've not looked much into it yet .. really hoping to avoid doing a full OE build on my laptop just to get a set of QT libs for myself :)


Then again, just noticed:


http://www.angstrom-distribution.org/repo/?pkgname=libqtopengl-dev


Maybe I can yoink those and with luck they might even be friendly with our libs (you never know). I'll look into it more tomorrow.


I'm really liking the idea of going PyQt -> Qt -> QtOpenGL :)


jeff
 
Went to look in the firmware libs, and we don't include QtOpenGL libs with QT4, so we're probably building without that. *darn!*


Try building Qt from source, on the Pandora, with all the GL headers present on the system. Qt should autodetect that it should build the GL ES backend.


If things fail, try the N900 sources.

 
Damn! I started porting over some simple OpenGL code to QtOpenGL and it was going okay on the desktop and in my VM; mostly I just wanted to plop in QGLFormat::whatEverItWas() that returns the Enum of the features, so I could list them off on desktop and pandora and compare.


Went to look in the firmware libs, and we don't include QtOpenGL libs with QT4, so we're probably building without that. *darn!*
Nope, the recipe contains the information to create the qtopengl ipk file (Ångström has it and it works fine). I even asked ED to add this to the firmware (see the minitube thread on the other boards) but I got no support :(


There are a few Qt 4.7 builds around for pandy. And I know mine does include the ES2 backend.
 
Do you have your Qt 4.7 with ES2 anywhere I can leech it from?


When you spend most of your time at work, the rare time at home is precious; leeching ftw :)


jeff
 
Fiddling with sebt3's Qt build right now; if I nab the Qt 4.6 sample (got the Qt 4.6 src handy), the normal widgets render OK on the Pandora, but all I get is random fuzz garbage in the QtOpenGL widget portion.


If I build an older version of the Qt ES sample (old enough that I can link against the Qt I have installed in my VM for x86), then I can run it just fine on the desktop, but I also get just static in the QtOpenGL portion of the UI on the Pandora.


I get no runtime errors (I confirmed the glwidget object in the sample is instantiated, for instance), and no dyn-link errors (linking against GLES_CM seems to cure all the OpenGL-related complaints; doesn't seem like I need to link against EGL.so directly). (I'm a total OpenGL ES noob, so forgive me on that count.)


Anyone happen to have random ideas?


Note I've not yet had time to do my own full Qt 4.7 build or the like.


jeff
 
I just upgraded a laptop to Xubuntu 10.10, which comes with Qt 4.7 RC1.


just playing around in QML for the moment


Has anyone run a QML app on the Pandora yet?
 
Hi all (I'm new to this forum). I have already programmed an asteroids-like game using SDL (C++) and Swing (Java). Now I need to program this game using Qt and the Graphics View Framework. The problem is that, looking at the documentation (the examples "Colliding Mice" and "Ported Asteroids"), I haven't found what I'm looking for: in these examples there is a timer, but I can't find a "classical" approach to the game loop like:


loop:
{
    updateGameLogic()
    renderFrame()
    waitUntilFrameEnd()
}


and this is a big problem for me, because I've always programmed games this way.


Fortunately, reading this forum I have found some useful information; for example:

Using the pattern above, I would have an Updatable object whose update method only contained a call to "QCoreApplication::processEvents();". And then I'd just never call "QApplication::exec()". I like being in control of the whole program flow, and I wouldn't surrender it to Qt.

This is very interesting, and if possible I would like to know more about it. Has someone tried this method or something similar? It would be amazing to see some working code to use as inspiration for my game...


Sorry for my bad English and thanks in advance.
 