Code::Blocks with C/C++ Compiler


Qt 5.11.0 has been built for more than a week now (without WebEngine, which doesn't want to build because libc is too old).
I was struggling with QtWebKit... The build doesn't want to finish and gets killed (even with 4 GB of swap) on Document.cpp (and yes, I have disabled the monolithic build). I have a last try running; not sure what I will try if it still fails (probably cutting the file up, but that complicates things).

Not sure I want to build a new Qt for a minor version bump, unless there is something uber important in the ChangeLog.
 
Hi all :)

@ptitSeb : I'm trying to build some OCR program (tesseract) on the Pandora with your latest Code::Blocks release (GCC 8.1), but it fails with the following error:
Code:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h: In member function ‘double TBOX::x_overlap_fraction(const TBOX&) const’:
rect.h:463:65: error: no matching function for call to ‘max(float, double)’
return std::max(0.0, static_cast<double>(high - low) / width);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:61,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:219:5: note: candidate: ‘template<class _Tp> const _Tp& std::max(const _Tp&, const _Tp&)’
max(const _Tp& __a, const _Tp& __b)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:219:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:463:65: note: deduced conflicting types for parameter ‘const _Tp’ (‘float’ and ‘double’)
return std::max(0.0, static_cast<double>(high - low) / width);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:61,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:265:5: note: candidate: ‘template<class _Tp, class _Compare> const _Tp& std::max(const _Tp&, const _Tp&, _Compare)’
max(const _Tp& __a, const _Tp& __b, _Compare __comp)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:265:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:463:65: note: deduced conflicting types for parameter ‘const _Tp’ (‘float’ and ‘double’)
return std::max(0.0, static_cast<double>(high - low) / width);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:62,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3462:5: note: candidate: ‘template<class _Tp> _Tp std::max(std::initializer_list<_Tp>)’
max(initializer_list<_Tp> __l)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3462:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:463:65: note: mismatched types ‘std::initializer_list<_Tp>’ and ‘float’
return std::max(0.0, static_cast<double>(high - low) / width);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:62,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3468:5: note: candidate: ‘template<class _Tp, class _Compare> _Tp std::max(std::initializer_list<_Tp>, _Compare)’
max(initializer_list<_Tp> __l, _Compare __comp)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3468:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:463:65: note: mismatched types ‘std::initializer_list<_Tp>’ and ‘float’
return std::max(0.0, static_cast<double>(high - low) / width);
^
rect.h: In member function ‘double TBOX::y_overlap_fraction(const TBOX&) const’:
rect.h:485:66: error: no matching function for call to ‘max(float, double)’
return std::max(0.0, static_cast<double>(high - low) / height);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:61,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:219:5: note: candidate: ‘template<class _Tp> const _Tp& std::max(const _Tp&, const _Tp&)’
max(const _Tp& __a, const _Tp& __b)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:219:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:485:66: note: deduced conflicting types for parameter ‘const _Tp’ (‘float’ and ‘double’)
return std::max(0.0, static_cast<double>(high - low) / height);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:61,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:265:5: note: candidate: ‘template<class _Tp, class _Compare> const _Tp& std::max(const _Tp&, const _Tp&, _Compare)’
max(const _Tp& __a, const _Tp& __b, _Compare __comp)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algobase.h:265:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:485:66: note: deduced conflicting types for parameter ‘const _Tp’ (‘float’ and ‘double’)
return std::max(0.0, static_cast<double>(high - low) / height);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:62,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3462:5: note: candidate: ‘template<class _Tp> _Tp std::max(std::initializer_list<_Tp>)’
max(initializer_list<_Tp> __l)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3462:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:485:66: note: mismatched types ‘std::initializer_list<_Tp>’ and ‘float’
return std::max(0.0, static_cast<double>(high - low) / height);
^
In file included from /mnt/utmp/codeblocks/usr/include/c++/8.1.0/algorithm:62,
from ../../src/ccutil/genericvector.h:23,
from boxword.h:23,
from blamer.h:26,
from blamer.cpp:21:
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3468:5: note: candidate: ‘template<class _Tp, class _Compare> _Tp std::max(std::initializer_list<_Tp>, _Compare)’
max(initializer_list<_Tp> __l, _Compare __comp)
^~~
/mnt/utmp/codeblocks/usr/include/c++/8.1.0/bits/stl_algo.h:3468:5: note: template argument deduction/substitution failed:
In file included from boxword.h:24,
from blamer.h:26,
from blamer.cpp:21:
rect.h:485:66: note: mismatched types ‘std::initializer_list<_Tp>’ and ‘float’
return std::max(0.0, static_cast<double>(high - low) / height);
^
make[2]: *** [Makefile:504: blamer.lo] Error 1

Do you understand what I'm doing wrong?

Cheers, Magic Sam
 
Yep, you are using -fsingle-precision-constant (it's on by default in CFLAGS and CXXFLAGS because it's faster on the Pandora), and this messes up the template resolution... Either force the template, or remove the flag from your CFLAGS (or add -fno-single-precision-constant to the CFLAGS). Because it's OCR and may need the full double precision, I suggest the second option.
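For illustration, the "force the template" route looks like this (a minimal sketch; clamp_nonneg is a hypothetical stand-in for the tesseract code in rect.h):

Code:
#include <algorithm>

// With -fsingle-precision-constant, the untyped literal 0.0 becomes a float,
// so std::max(0.0, some_double) is a call to max(float, double) and template
// deduction fails. Spelling out the template argument restores a single type:
double clamp_nonneg(double d) {
    return std::max<double>(0.0, d);  // fine even with the flag enabled
}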
 
hey, just a quick question so I'll put it in here:

The gigahertz Pandora's GPU clocks about twice as fast, but what is the frame rate difference in practice, on average? I only have a good old CC Pandora myself, and I was finally going to release something for it. The CC one doesn't quite cut it, with 40-45ish fps in the most demanding scene of the game, so I would default to frameskipping on. If a gigahertz one manages just fine, I'd like to autodetect the model and default to the full frame rate for a better experience, though.
I'd also like to avoid auto-frameskip in order to maintain a consistent frame rate during the game, so there's that.

I could of course build a PND with just that scene and upload it for someone to test, but if there's past experience that points towards a good enough improvement I could skip this step.

Thanks!
 
Maybe ptitSeb could hazard a guess, but it's a very difficult question to nail down positively. Firstly, where is your code spending most of its time in a profiler? Whether it's missing 60 fps mostly because of the render step or the game logic (especially any of the fiddly bits inside those) makes a difference. There's also the RAM speed to consider on the different units, because of the way the graphics unit writes the framebuffer back to RAM on ARM systems.
 
Well, the Gigahertz is faster, but not 2× faster all in all (more like 50% in some cases, compared to my CC @ 800 MHz).

But as @levi mentioned, it will depend on where your bottleneck is. Using an ssh window, try to have a look at "sudo perf top", and see if it's capping at 100% CPU (then it's probably CPU limited) or if the CPU idles a bit (then it's probably GPU limited). The CPU on the Gigahertz is also faster than the CC's (more cache), but you won't double the speed...
 
I'm already pretty sure I'm bottlenecked by the GPU, but I re-checked everything.

According to the engine's internal performance counters, it's spending about 50 µs in the game logic code and around 20,000 µs per frame in the rendering code. With frameskip on it takes just ~3000-4000 µs there, possibly since it's not running behind anywhere.

The scene in question uses multiple somewhat large textures with alpha transparency. I'm aware how taxing this is on that thing, but resizing or using compressed ones isn't an option for me here.

Didn't copy it off (had some issues getting the CLI tools to run over ssh, eh), but perf top looked like this:
Code:
25.50% libmikmod.so
7.86% libGLES_CM.so
5.40% kernel
2.00% ProjectTWC
The last entry is the game's executable.
I'll think about replacing the tracker module audio with streamed ones for this since I did that for another platform already anyways.
Overall, just running top, the total CPU usage of the process maxed out around 37% us. My Pandora runs at 969 MHz.

So yeah, I think this is on the GPU.
Guess I'll prepare something if someone here is willing to run it real quick. Not today though.
 
Wasn't there a slowness issue with tracker music in some of your ports? I may be misremembering, and I can't remember what you did to fix those cases anyway to be honest.
 
Sorry for the delay, couldn't quite find the time to actually package it up properly.

If you have a lot of alpha transparency, try to use GL_BLEND instead of GL_ALPHA_TEST if possible.
Already doing that.
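(For anyone else reading along, that switch in GLES 1.1 boils down to the following; a sketch, not taken from the game's actual code:)

Code:
#include <GLES/gl.h>

// Swap per-fragment alpha testing for standard alpha blending:
void use_blending(void) {
    glDisable(GL_ALPHA_TEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}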


So if anyone has a bit of time and a 1 GHz Pandora, could they try running this for me? This is a limited build of the game, starting right in the problematic area.
Menus are disabled, debug overlay with fps and frametimes is enabled. Pressing Start will exit the game.
There's nothing really that has to be done as far as controlling goes. You can move around, but interactions and such were removed for this.

I'm only really interested in whether it reaches 60 fps just fine. The frametimes are hard to read as they are displayed right now, I know.
Thanks!
 

Attachments

  • Benri_Test.pnd (11.5 MB)
FPS is between 58 and 59
BGStep: most 0us, sometimes more
Obj Steps: 30us
Render: 2900-3000us
Cleanup: 0us
Total: not stable, can not clearly read, seems to be around 5000us
Used: 30-33%

Game objects are really small, had to go very close to the screen to recognize something.

Thomas
 
which TI SGX driver are you using?
Been a while since I messed with the driver. The choice on CC isn't that great. I suppose it was the default one, as I just went through the newer ones, and the only one that even started the game froze it after a couple of seconds of worse performance. Huh.
are you using vsync?
VSync is requested via SDL_GL_SetSwapInterval(1). I don't see any tearing on my unit, despite GLShim telling me to set an environment variable to enable it for real (guess I'll still do that), so I suppose it's somewhat working? I'm not using the powervr.ini trick.
are you using SDL/SDL2?
SDL 2, I'm using Code::Blocks and the libraries of this PND to build it. I think it's set up to go through GLShim by default, though my code is only using GLES 1.1 functions.


FPS is between 58 and 59
BGStep: most 0us, sometimes more
Obj Steps: 30us
Render: 2900-3000us
Cleanup: 0us
Total: not stable, can not clearly read, seems to be around 5000us
Used: 30-33%

Game objects are really small, had to go very close to the screen to recognize something.

These render times look much better than on my unit. Thanks! The 2 frames missing here and there may be from VSync not working as intended, as there's a lot of room in the frame timing.
The screen size is really not ideal here, yeah. The game's resolution was designed to scale nicely to modern 16:9 resolutions. The Pandora's 800x480 is a nasty in-between. Uneven scaling factors are a sin on pixel art, and adapting the game to a low enough resolution to do 2x on it doesn't quite work out in other parts of the game (I know it seems like it would be just fine in what I've uploaded).


So it looks like I can disable frameskipping on the 1 GHz Pandora.
...probably should've asked earlier, but what would be the easiest way to reliably detect that?

Edit: Going with checking if "/etc/powervr-esrev" is 5, which should be suitable for what I want to check.
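In code, that check could look something like this (a sketch, assuming as above that the file reads "5" on a 1 GHz unit; untested):

Code:
#include <fstream>

// True on a 1 GHz Pandora, assuming /etc/powervr-esrev contains "5" there.
bool is_1ghz_pandora() {
    std::ifstream f("/etc/powervr-esrev");
    int rev = 0;
    return (f >> rev) && rev == 5;
}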
 
VSync set with SDL_GL_SetSwapInterval(1) doesn't really work. You need to do the "powervr.ini" trick to have a flicker-free swapbuffer.

SDL2 is built for both GL and GLES; because it finds a libGL, GLES is ignored unless you use special environment variables. To get a true ES1.1 context, use:
Code:
SDL_VIDEO_GLES2=1 SDL_VIDEO_GL_DRIVER=libGLES_CM.so
(yep, it's SDL_VIDEO_GLES2, even for GLES1.1)
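If you'd rather set them from code, the equivalent before SDL_Init() would be something like this (a sketch; only the two variable names come from the post above):

Code:
#include <stdlib.h>

// Must run before SDL_Init() so SDL's video setup sees the variables.
void force_gles11_driver(void) {
    setenv("SDL_VIDEO_GLES2", "1", 1);                  // yes, even for GLES 1.1
    setenv("SDL_VIDEO_GL_DRIVER", "libGLES_CM.so", 1);
}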
 
Hi, as soon as GLES 1.1 is forced, creation of the window fails.

Optional settings aside, the executed context creation code in this case boils down to this:

Code:
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

// 16-bit RGB565 framebuffer, no depth buffer
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);

SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 0);

// request an OpenGL ES 1.1 context
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);

SDL_CreateWindow(title, SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 800, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_FULLSCREEN);

No error string from SDL, but it prints "SDL: Forcing GLES2" before window creation.
This works on other devices (e.g. Android or even Steam Link) and doesn't look much different from what's recommended on the wiki either. Am I missing something?

Edit: It's working after using the latest beta of the Code::Blocks PND. Whew.
 
Hello again,

I'd like to detect when the lid is closed and opened. I've found "op_lidstate", but polling that is far less than ideal.
Any good options? I loosely remember reading before that it also acts as a key, but I can't find any reference for that, nor does it actually generate a key event in SDL.
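(For reference, if the lid is exposed as a standard Linux SW_LID switch on one of the evdev nodes, which I haven't verified on the Pandora, a minimal reader would look like this; the device path is hypothetical:)

Code:
#include <fcntl.h>
#include <linux/input.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    // Hypothetical node; check /proc/bus/input/devices for the real one.
    int fd = open("/dev/input/event4", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }
    struct input_event ev;
    while (read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
        if (ev.type == EV_SW && ev.code == SW_LID)
            printf("lid %s\n", ev.value ? "closed" : "opened");
    }
    close(fd);
    return 0;
}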
 
Nothing? Okay, not super important, so let's drop that then.

More importantly, I'm running into issues when running my game from a PND. It's running fine for the most part, except that all mouse input goes past the window to the desktop or whatever is behind the fullscreen application, and the game leaves black dirty rectangles on the screen after exiting. Any ideas? The exact same directory, but not inside a PND, runs without these issues.

Edit:
Hope nobody spent too much time trying to help and then gave up or something... the fault lies with me, obviously. The main difference between running as a PND and directly was the different config directory, and as it turned out, the game didn't run in fullscreen mode when running from the PND. Some default config shenanigans were wrong for the Pandora, and well, the fullscreen/windowed option is removed from the in-game settings since it doesn't work anyway, so I didn't notice it from there.
The behavior in the broken windowed mode is still very weird, to be honest, but oh well.
One step closer to release. I swear this isn't quite worth it, considering what I'm putting out.
 
@ptitSeb Do you have a preferred method of launching the Dev command line from your PND while ssh'ing into the Pandora?
 