Code::Blocks with C/C++ Compiler

Discussion in 'C/C++' started by ptitSeb, Sep 3, 2012.

  1. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,180
    Location:
    France, near Lyon
Qt 5.11.0 has been built for more than a week (without WebEngine, which doesn't want to build because libc is too old).
I was struggling with QtWebKit... The build doesn't want to finish and gets killed (with 4GB of swap) on Document.cpp (and yes, I have disabled the monolithic build). I have a last try running; not sure what I will try if it still fails (probably cutting up the file, but that complicates things).

    Not sure I want to build a new Qt for a minor bump in the version, unless there is something uber important in the ChangeLog.
     
  2. canseco

    canseco Very Active Member

    Joined:
    Jun 1, 2004
    Messages:
    883
    Location:
    Spain
I think QtWebKit is not worth it anymore, as I only see 3 programs still using it in my Manjaro Linux desktop repos (KdenLive, K3b and Marble).

Shame about glibc being old, because of these fixes:

    - Fix build with GCC 8.1

    [QTBUG-68752] Fix compilation with opengl es2

    https://code.qt.io/cgit/qt/qtwebengine.git/tree/dist/changes-5.11.1/?h=v5.11.1
     
  3. Magic Sam

    Magic Sam Forever Homebrew

    Joined:
    Aug 10, 2007
    Messages:
    2,117
    Location:
    Innsmouth, MA
    Hi all :)

@ptitSeb : I'm trying to build an OCR program (tesseract) on the Pandora with your latest Code::Blocks release (GCC 8.1), but it fails with the following error:

Do you understand what I'm doing wrong?

    Cheers, Magic Sam
     
  4. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,180
    Location:
    France, near Lyon
Yep, you are using -fsingle-precision-constant (it's on by default in CFLAGS and CXXFLAGS because it's faster on Pandora), and this messes up the template resolution... Either force the template, or remove the flag from your CFLAGS (or add -fno-single-precision-constant to the CFLAGS). Because it's OCR and may need the full double precision, I suggest the second option.
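For illustration, here is a minimal sketch of the usual failure mode (not the actual tesseract error; the function name is made up), with both workarounds shown as comments:

Code:
#include <algorithm>

// Hypothetical helper, just to show the flag's effect on templates.
double clamp_confidence(double c)
{
    // Under -fsingle-precision-constant the literal 1.0 is parsed as a
    // float, so this line fails template deduction (T=double from 'c'
    // vs T=float from the literal, "no matching function"):
    //   return std::max(c, 1.0);

    // Workaround 1: force the template argument explicitly...
    return std::max<double>(c, 1.0);
    // Workaround 2: build with -fno-single-precision-constant and keep
    // the original call.
}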
     
    levi and Magic Sam like this.
  5. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    363
    Location:
    Germany
Hey, just a quick question, so I'll put it in here:

The gigahertz Pandora's GPU clocks about twice as fast, but what is the frame rate difference in practice, on average? I only have a good old CC Pandora myself and I was finally gonna release something for it. The CC one doesn't quite cut it, with 40-45ish fps in the most demanding scene of the game, so I would default to frameskipping on. If a gigahertz one manages just fine, I'd like to autodetect the model and default to the full framerate for a better experience, though.
I'd also like to avoid auto-frameskip in order to maintain a consistent framerate during the game, so there's that.

    I could of course build a PND with just that scene and upload it for someone to test, but if there's past experience that points towards a good enough improvement I could skip this step.

    Thanks!
     
  6. levi

    levi Still fresh, damnit!

    Joined:
    Oct 6, 2008
    Messages:
    9,943
    Location:
    Somewhere off the coast of the EU
Maybe ptitSeb could hazard a guess, but it's a very difficult question to nail down positively. Firstly, where is your code spending most of its time in a profiler? If it's missing 60fps mostly because of the render step or the game logic (especially any of the fiddly bits inside that), it varies. There's also the RAM speed to consider on the different units, because of the way the graphics unit writes the framebuffer back to RAM on these ARM systems.
     
  7. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,180
    Location:
    France, near Lyon
Well, the Gigahertz is faster, but not 2x faster overall (more like 50% in some cases, compared to my CC @ 800MHz).

But as @levi mentioned, it will depend on where your bottleneck is. Using an ssh window, try to have a look at "sudo perf top" and see if it's capping at 100% CPU (then it's probably CPU limited) or if the CPU idles a bit (then it's probably GPU limited). The CPU on the Gigahertz is also faster than the CC's (more cache), but you won't double the speed...
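To narrow it to just the game process rather than the whole system, something like this should also work (the binary name is a placeholder):

Code:
# profile a single process; replace YourGame with the executable's name
sudo perf top -p $(pidof YourGame)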
     
  8. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    363
    Location:
    Germany
    I'm already pretty sure I'm bottlenecked by the GPU, but I re-checked everything.

According to the engine's internal performance counters, it's spending about 50µs in the game logic code, and around 20000µs per frame in the rendering code. With frameskip on it takes just ~3000µs-~4000µs there, possibly since it's not running behind anywhere.

The scene in question uses multiple somewhat large textures with alpha transparency. I'm aware of how taxing this is on that thing, but resizing them or using compressed ones isn't an option for me here.

I didn't copy it off (had some issues getting the CLI tools to run over ssh, eh), but perf top looked like this:
    Code:
    25.50% libmikmod.so
    7.86% libGLES_CM.so
    5.40% kernel
    2.00% ProjectTWC
The last thing is the game's executable.
I'll think about replacing the tracker module audio with streamed audio for this, since I did that for another platform already anyway.
Overall, just running top, the total CPU usage of the process maxed out around 37% us. My Pandora runs at 969 MHz.

    So yeah, I think this is on the GPU.
    Guess I'll prepare something if someone here is willing to run it real quick. Not today though.
     
  9. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,180
    Location:
    France, near Lyon
    If you have a lot of alpha transparency, try to use GL_BLEND instead of GL_ALPHA_TEST if possible.
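In GLES 1.1 terms, a minimal sketch of that swap might look like this (where exactly it goes depends on your render loop):

Code:
/* Alpha test forces a per-fragment discard, which is expensive on the
   SGX's tile-based renderer; plain blending is usually cheaper. */
glDisable(GL_ALPHA_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

Note that unlike alpha test, blending depends on draw order, so overlapping transparent sprites need to be drawn back to front.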
     
  10. levi

    levi Still fresh, damnit!

    Joined:
    Oct 6, 2008
    Messages:
    9,943
    Location:
    Somewhere off the coast of the EU
    Wasn't there a slowness issue with tracker music in some of your ports? I may be misremembering, and I can't remember what you did to fix those cases anyway to be honest.
     
  11. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    363
    Location:
    Germany
Sorry for the delay, I couldn't quite find the time to actually package it up properly.

    Already doing that.


    So if anyone has a bit of time and a 1 GHz Pandora, could they try running this for me? This is a limited build of the game, starting right in the problematic area.
Menus are disabled, and a debug overlay with fps and frametimes is enabled. Pressing Start will exit the game.
There's nothing that really has to be done as far as controls go. You can move around, but interactions and such were removed for this.

I'm only really interested in whether it reaches 60 fps just fine. The frametimes are hard to read as they are displayed right now, I know.
    Thanks!
     

    Attached Files:

    undexsym and KidPaddle like this.
  12. KidPaddle

    KidPaddle Member

    Joined:
    May 3, 2009
    Messages:
    100
    Location:
    Germany
FPS is between 58 and 59
BGStep: mostly 0µs, sometimes more
Obj Steps: 30µs
Render: 2900-3000µs
Cleanup: 0µs
Total: not stable, cannot read it clearly, seems to be around 5000µs
Used: 30-33%

Game objects are really small, I had to get very close to the screen to recognize anything.

    Thomas
     
    Last edited: Nov 15, 2018
  13. Pickle

    Pickle Mega GP Mania

    Joined:
    May 30, 2006
    Messages:
    5,398
    Location:
    Detroit, Michigan
    elvissteinjr,

Which TI SGX driver are you using?
Are you using vsync?
Are you using SDL/SDL2?
If not, are you using EGL directly?
     
  14. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    363
    Location:
    Germany
It's been a while since I messed with the drivers. The choice on CC isn't that great. I suppose it was the default one, as I just went through the newer ones and the only one that even started the game froze it after a couple of seconds of worse performance. Huh.
VSync is requested via SDL_GL_SetSwapInterval(1). I don't see any tearing on my unit, despite GLShim telling me to set an environment variable to enable it for real (guess I'll still do that), so I suppose it's somewhat working? I'm not using the powervr.ini trick.
SDL2; I'm using Code::Blocks and the libraries of this PND to build it. I think it's set up to go through GLShim by default, though my code only uses GLES 1.1 functions.


These render times look much better than on my unit. Thanks! The 2 frames missing here and there may be from VSync not working as intended, as there's a lot of room in the frame timing.
The screen size is really not ideal here, yeah. The game's resolution was designed to scale nicely to modern 16:9 resolutions; the Pandora's 800x480 is a nasty in-between. Uneven scaling factors are a sin on pixel art, and adapting the game to a resolution low enough to do 2x on it doesn't quite work out in other parts of the game (I know it seems like I'd be just fine in what I've uploaded).


So it looks like I can disable frameskipping on the 1 GHz Pandora.
...probably should've asked earlier, but what would be the easiest way to reliably detect that?

    Edit: Going with checking if "/etc/powervr-esrev" is 5, which should be suitable for what I want to check.
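A minimal sketch of that check (the helper name is made up; the thread only establishes that a value of 5 in /etc/powervr-esrev marks the GHz unit):

Code:
#include <stdio.h>

/* Hypothetical helper: returns 1 on a 1GHz Pandora, 0 otherwise.
   Assumes /etc/powervr-esrev holds a single integer, with 5 meaning
   the GHz unit as discussed above. */
static int is_ghz_pandora(void)
{
    int rev = 0;
    FILE *f = fopen("/etc/powervr-esrev", "r");
    if (!f)
        return 0;               /* file missing: assume an older unit */
    if (fscanf(f, "%d", &rev) != 1)
        rev = 0;
    fclose(f);
    return rev == 5;
}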
     
    Last edited: Nov 16, 2018
  15. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,180
    Location:
    France, near Lyon
VSync set with SDL_GL_SetSwapInterval(1) doesn't really work. You need to do the "powervr.ini" trick to have a flicker-free swapbuffer.

SDL2 is built for both GL and GLES; because it finds a libGL, GLES is ignored unless you use special environment variables. To get a true ES1.1 context, use:
    Code:
    SDL_VIDEO_GLES2=1 SDL_VIDEO_GL_DRIVER=libGLES_CM.so
    
    (yep, it's SDL_VIDEO_GLES2, even for GLES1.1)
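If you'd rather not depend on a launch script, a minimal sketch of setting the same variables from inside the program (they have to be set before the window/context is created, since that's when SDL loads the GL library):

Code:
#include <stdlib.h>   /* setenv() */

int main(void)
{
    /* Same effect as the command line above; the final argument 1
       means "overwrite if already set". */
    setenv("SDL_VIDEO_GLES2", "1", 1);
    setenv("SDL_VIDEO_GL_DRIVER", "libGLES_CM.so", 1);

    /* ... SDL_Init(), window and context creation, etc. ... */
    return 0;
}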
     
  16. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    363
    Location:
    Germany
    Hi, as soon as GLES 1.1 is forced, creation of the window fails.

    Optional settings aside, the executed context creation code in this case boils down to this:

Code:
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

/* RGB565 color buffer, no depth buffer */
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);

SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 0);

/* request an OpenGL ES 1.1 context */
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);

SDL_CreateWindow(title, SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 800, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_FULLSCREEN);
No error string from SDL, but it prints "SDL: Forcing GLES2" before window creation.
This works on other devices (e.g. Android or even Steam Link) and doesn't look much different from what's recommended on the wiki either. Am I missing something?

    Edit: It's working after using the latest beta of the Code::Blocks PND. Whew.
     
    Last edited: Nov 25, 2018
