GP32 Timer Problems


Pirotic

Yet another post by me! (Every time I fix something, another problem crops up.)

For a while now I've noticed the frame-rate counter wasn't accurate, and after adding some debug displays I noticed the tick counter is running at super speed.


For example, the FPS counter is on a timer, e.g.

GpTimerOptSet(0,1,0,FPStimer);
GpTimerSet(0);

However, the FPStimer function is being called about 2.5 times a second :p

Also, my frame-rate limiter uses the GpTickCountGet() function, and despite telling it to wait 1000, it only waits about a third of a second.
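In case it helps to see it, here's roughly how those two bits are wired up - just a sketch, the frames_drawn / current_fps / wait_one_second names are made up, and I'm assuming GpTickCountGet() returns millisecond-style ticks and the timer callback takes no arguments:

#include "gpstdlib.h"   /* assumed SDK header for GpTimerOptSet() / GpTickCountGet() */

volatile int frames_drawn = 0;   /* incremented once per rendered frame */
volatile int current_fps  = 0;   /* what the on-screen counter displays */

/* callback registered with GpTimerOptSet(0,1,0,FPStimer) - meant to fire once per second */
void FPStimer(void)
{
    current_fps  = frames_drawn;
    frames_drawn = 0;
}

/* naive limiter: spin until 1000 ticks have passed */
void wait_one_second(void)
{
    unsigned int start = GpTickCountGet();
    while (GpTickCountGet() - start < 1000)
        ;   /* at 99MHz this exits after only about a third of a second */
}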

Could it be that the tick counter is linked to the clock speed, and that it's running too fast because my application uses 99MHz rather than the default 60?

GpClockSpeedChange(99000000, 0x3a002, 1);

The official SDK makes no mention of this, so any advice would be appreciated.
 
I've been having similar problems with no solution yet :(

If it is linked to the clock speed, it's a little strange, as you've increased the clock speed by 65% but the timer is running 150% too fast... :blink:
 
Maybe, but when I lower the clock speed the timer does indeed get closer to the correct timing.

But like you said, it's not as simple as working out the percentage the clock speed is above the default and subtracting that from the timing, as it doesn't seem proportional.
 
The timer is based on the standard 40MHz CPU speed (I believe that's the standard...).

Any CPU speed above that will increase the timer speed.

What I do is simply have delay variables:

#define delay40mhz 50
#define delay133mhz 320

int realdelay;
int usedmhz;
/* somehow let the user (OR YOU) select the mhz */

if (usedmhz == 133) realdelay = delay133mhz;
else if (usedmhz == 40) realdelay = delay40mhz;

Etc... and then use the variable "realdelay" wherever you have timers.
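As a rough sketch of how that could be used (wait_ticks is just an invented name, assuming the tick counter is what speeds up with the clock):

/* wait for a clock-speed-dependent number of ticks instead of a fixed count */
void wait_ticks(int realdelay)
{
    unsigned int start = GpTickCountGet();
    while (GpTickCountGet() - start < (unsigned int)realdelay)
        ;   /* realdelay is chosen per clock speed, so the real-time wait stays about the same */
}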
 
Fixed it now - figured I should explain so anybody with the same problem can solve it too.

My problem was that I don't use delays to manage the frame rate. Instead, I work out how many ticks each frame should take to run at the desired FPS, and if a frame takes less time than that, I wait for the correct number of ticks to pass before drawing the next frame, so it doesn't speed up at all.

I created this value:
speed_up = ((1.0 / clock_speed_default) * clock_speed);

then applied it to this value, which is how long each frame should "take":
interval = (1000 / FRAMELOCK) * 2.475;

That worked for me - the one-second animation now actually takes a second. Weirdly, the timer was trickier to fix.
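In case it helps, here's roughly what the limiter loop looks like with that correction applied - just a sketch, FRAMELOCK is my target frame rate define, and 2.475 happens to be 99MHz divided by the 40MHz default mentioned above:

#define FRAMELOCK 60   /* target frame rate */

void main_loop(void)
{
    /* ticks each frame should occupy, scaled up because the tick counter runs fast at 99MHz */
    unsigned int interval    = (unsigned int)((1000.0 / FRAMELOCK) * 2.475);
    unsigned int frame_start = GpTickCountGet();

    for (;;)
    {
        /* ... update and draw one frame here ... */

        /* if the frame finished early, wait out the remaining ticks */
        while (GpTickCountGet() - frame_start < interval)
            ;
        frame_start += interval;   /* advance by a full interval so the pace stays locked */
    }
}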

I tried:

GpTimerOptSet(0,2.475,0,FPStimer);

but according to that my app is running at 4fps, which it clearly isn't :p

So instead I changed the timer back to:

GpTimerOptSet(0,1,0,FPStimer);

but then made it simply multiply the FPS by 2.475, and it's now running at the correct speed and displaying the correct frame rate.
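So the callback ends up looking something like this (just a sketch, the counter names are placeholders):

volatile int frames_drawn = 0;   /* incremented once per rendered frame */
volatile int current_fps  = 0;

/* still registered with GpTimerOptSet(0,1,0,FPStimer), which fires about 2.475 times too often at 99MHz */
void FPStimer(void)
{
    /* each call only covers roughly 1/2.475 of a real second, so scale the count back up */
    current_fps  = (int)(frames_drawn * 2.475);
    frames_drawn = 0;
}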

Hope that helps!

Something else I was chuffed with: until now I thought my program was actually running at only 17fps max. After I fixed my code up it's running silky smooth at 60fps with everything on. Can't wait to release the next demo now ;)
 
That makes sense. Although it's good to know the default clock speed is 40MHz and not 60 as I previously thought - all the numbers add up correctly now. :lol:
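Worth spelling out the arithmetic: 99MHz / 40MHz = 2.475, which matches both the 2.475 correction factor above and the roughly 2.5 timer callbacks per second seen earlier in the thread.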
 