Don't Panic! Games cannot use even 256MB of RAM.


An Open Pandora game can only use roughly 22MB of RAM per frame if it wants to run at 30 FPS (11MB for 60 FPS).


So developers cannot realistically make 512MB of RAM a requirement; the game would start to crawl before actively using even 128MB.


according to http://www.pengutron...0100702_en.html


the performance of the BeagleBoard (DDR-133) is 254.05MB/sec (look at that, it's almost 133x2 = 266MB/sec)


the Pandora has faster RAM (DDR-333), so let's assume the same approximation holds, with an unattainable peak of 660MB/sec.


In reality the CPU has to do other things as well, so you never actually reach that figure, plus higher clock speeds usually need longer initial access delays.


so, if you want to have a game running at 60fps or 30fps that's


660MB/sec / 60 fps = 11MB per frame usable


660MB/sec / 30 fps = 22MB per frame usable


and that's assuming, in theory, the CPU did nothing but read from and write to RAM (no 3D math, no game logic code)


or, if you prefer: touching all 256MB of RAM during a game loop would make the game run below 2.6 frames per second.
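
To make the arithmetic concrete, a quick back-of-the-envelope sketch in C (the 660MB/sec figure is just the assumed theoretical peak from above, not a measurement):

/* Per-frame memory budget from an assumed sustained bandwidth.
 * Back-of-the-envelope only; 660MB/s is the theoretical DDR-333 peak
 * assumed above, not a benchmark result. */
#include <stdio.h>

static double mb_per_frame(double bandwidth_mb_per_s, double fps)
{
    return bandwidth_mb_per_s / fps;
}

int main(void)
{
    const double peak = 660.0;                                  /* MB/s */
    printf("60 fps: %.1f MB/frame\n", mb_per_frame(peak, 60));  /* ~11  */
    printf("30 fps: %.1f MB/frame\n", mb_per_frame(peak, 30));  /* ~22  */
    printf("touching 256MB/frame: %.2f fps\n", peak / 256.0);   /* ~2.58 */
    return 0;
}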


some more info: http://en.wikipedia....orking_set_size
 
The Beagleboard doesn't have DDR-133 RAM, and even if it did, your calculation seems to assume the bus is 8 bits wide. It also neglects that copy bandwidth includes both read and write operations (in typical code you tend to see more reads than writes). The theoretical maximum bandwidth is the same as Pandora: 166MHz * 2 * 4-bytes = 1.33GB/s. It gets about half of this. The biggest reason is that the Cortex-A8 doesn't have any automatic prefetching, so you end up dominated by latency rather than bandwidth. Switch to NEON code with preloads and the performance goes up. Cache configuration (write-through vs write-back, write allocation on/off, etc.) makes a difference too. You get the best performance if you can utilize the preload engine, which is like DMA to/from L2.
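
To illustrate what "preloads" buy you, here's a minimal sketch of a copy loop that issues software prefetch hints a few cache lines ahead of where it reads; the line size and prefetch distance are placeholder values, not tuned for the OMAP3, and real NEON/PLD routines do this in assembly:

#include <stddef.h>
#include <stdint.h>

/* Copy with software prefetch hints: pull data toward the cache a few
 * lines ahead of where we're reading. The 64-byte line and 256-byte
 * prefetch distance are placeholders, not tuned values. */
void copy_with_prefetch(uint8_t *dst, const uint8_t *src, size_t n)
{
    for (size_t i = 0; i < n; i += 64) {
        __builtin_prefetch(src + i + 256, 0);   /* 0 = prefetch for read */
        size_t chunk = (n - i < 64) ? n - i : 64;
        for (size_t j = 0; j < chunk; j++)
            dst[i + j] = src[i + j];
    }
}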


And of course other things access the RAM. However bad the CPU is at it, that doesn't mean the DSP is as bad, much less the display controller or GPU, which are designed to be good at utilizing bandwidth. But I don't really get the point. You don't have to constantly hit all of RAM for it to be useful.


Look at a first-generation CD-ROM platform like the PCE-CD: the CD access speed was only 150KB/s, the seek times were absolutely horrible (over a second wasn't unusual), and you only had a pitiful amount of RAM to use. But this still allowed a lot to be done with games, enough that the CD-ROM attachments outlived the card-based system's early decline in the 16-bit generation, to the extent that the CD games outlived the 16-bit ones themselves. I know this isn't a perfect analogy, since we're talking raw storage and not RAM, and the Pandora already has access to big SD cards, but the basic point remains.


If you don't like that, consider the Arcade Card add-on instead. It was nothing more than 2MB of RAM, and I guarantee you that a PCE game could not come anywhere close to accessing all of it in one frame. More to the point, there's no way a game could even access anywhere close to the RAM the system itself had in one frame, and pretty much every game was 60Hz based. But the fighting games that require it couldn't have been made without it. They wouldn't have come anywhere close.


Having lots of RAM lets you have big levels with big textures. This works well precisely because it's both fast and random access. It also works well because it's fast to modify, allowing for big levels that you can create, like in Minecraft.
 
alright, I'll bite.

The theoretical maximum bandwidth is the same as Pandora: 166MHz * 2 * 4-bytes = 1.33GB/s. It gets about half of this.

so 1333 MB/s divided by two ... oh would you look at that: 667MB/sec, man, I was way off.


you can't draw a parallel with the memory-starved PCE (256KB of RAM in the Duo version) getting a 2MB memory upgrade to store 2D animation frames for a fighting game.


that's a 900% RAM upgrade.

Having lots of RAM lets you have big levels with big textures. This works well precisely because it's both fast and random access. It also works well because it's fast to modify, allowing for big levels that you can create, like in Minecraft.

big textures are something to be avoided, as they are the main performance killer on mobile systems; the working set has to fit within the memory bandwidth.


big levels can be streamed (they are in any decent game engine)


same for Minecraft-like games: you can (and should) stream compressed chunks in and out and use RAM as a cache/buffer for the real permanent level storage, the HDD (or SD card)
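
As a rough sketch of the "RAM as a chunk cache" idea (the chunk size, slot count, and eviction policy here are made up for illustration, not taken from any real engine):

/* Toy chunk cache: keeps a fixed number of chunks in RAM, keyed by
 * chunk coordinates, and evicts the least recently used one. Reading
 * and decompressing from the SD card is left as a stub. */
#include <stdint.h>
#include <string.h>

#define CACHE_SLOTS 64
#define CHUNK_BYTES (16 * 16 * 128)   /* made-up chunk size */

struct chunk_slot {
    int      used;
    int32_t  cx, cz;                  /* chunk coordinates */
    uint64_t last_use;                /* for LRU eviction */
    uint8_t  data[CHUNK_BYTES];
};

static struct chunk_slot cache[CACHE_SLOTS];
static uint64_t tick;

/* Stub: a real engine would read and decompress this from the SD card. */
static void load_chunk_from_storage(int32_t cx, int32_t cz, uint8_t *out)
{
    (void)cx; (void)cz;
    memset(out, 0, CHUNK_BYTES);
}

uint8_t *get_chunk(int32_t cx, int32_t cz)
{
    int victim = 0;
    tick++;
    for (int i = 0; i < CACHE_SLOTS; i++) {
        if (cache[i].used && cache[i].cx == cx && cache[i].cz == cz) {
            cache[i].last_use = tick;          /* cache hit */
            return cache[i].data;
        }
        if (!cache[i].used || cache[i].last_use < cache[victim].last_use)
            victim = i;                        /* remember eviction candidate */
    }
    /* miss: recycle the LRU slot and stream the chunk in */
    cache[victim].used = 1;
    cache[victim].cx = cx;
    cache[victim].cz = cz;
    cache[victim].last_use = tick;
    load_chunk_from_storage(cx, cz, cache[victim].data);
    return cache[victim].data;
}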


170KB of level storage:


http://rv6502.com/img/gen3d_cubeworld_13.png


not many games could really require over 256MB of RAM within the GPU/CPU/RAM bandwidth limitations of the OP hardware without there being simple enough solutions to make them fit.


especially not one made by an OP developer in his spare time; we're not talking Capcom- or Konami-sized time and money dev budgets here, anything else would fit.


even if someone somehow made a Mortal Kombat-style fighting game with >200MB of compressed animation for the selected characters (really o_O ??? think about how much video time that is in MPEG2 for a sprite!), you can stream the animation data off the SD card. All you need are the first few data blocks buffered and ready to display while you stream the rest, and an SD card has virtually no seek time, so you can easily stream 3 animations at the same time (characters 1 & 2 and the BG)
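
A minimal sketch of the "buffer the first blocks, stream the rest" approach using plain stdio reads; the block size, ring depth, and refill-on-next-call scheme are placeholders, and a real engine would refill from a background thread:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define BLOCK_SIZE  (64 * 1024)
#define RING_BLOCKS 4                 /* a few blocks of read-ahead */

struct anim_stream {
    FILE    *f;
    uint8_t  ring[RING_BLOCKS][BLOCK_SIZE];
    size_t   fill[RING_BLOCKS];       /* bytes valid in each slot */
    unsigned next_play;               /* slot the game consumes next */
    unsigned prev;                    /* slot consumed on the last call */
    int      have_prev;
};

/* Prime the stream: buffer the first few blocks before playback starts. */
int anim_stream_open(struct anim_stream *s, const char *path)
{
    memset(s, 0, sizeof(*s));
    s->f = fopen(path, "rb");
    if (!s->f)
        return -1;
    for (unsigned i = 0; i < RING_BLOCKS; i++)
        s->fill[i] = fread(s->ring[i], 1, BLOCK_SIZE, s->f);
    return 0;
}

/* Hand the game the next buffered block; the slot consumed last time is
 * refilled here (a real engine would refill from a background thread). */
const uint8_t *anim_stream_next(struct anim_stream *s, size_t *len)
{
    if (s->have_prev)
        s->fill[s->prev] = fread(s->ring[s->prev], 1, BLOCK_SIZE, s->f);

    unsigned slot = s->next_play;
    s->next_play  = (slot + 1) % RING_BLOCKS;
    s->prev       = slot;
    s->have_prev  = 1;

    *len = s->fill[slot];
    return s->ring[slot];
}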
 
so 1333 MB/s divided by two ... oh would you look at that: 667MB/sec, man, I was way off.

But going by the work you did to get that number, that's a complete coincidence. As I said, read + write (copy) bandwidth isn't really the right yardstick, because a program will tend to have more reads than writes. Never mind that it won't typically want to do nothing but read + modify + write all of memory constantly.

you can't draw a parallel with the memory-starved PCE (256KB of RAM in the Duo version) getting a 2MB memory upgrade to store 2D animation frames for a fighting game.
that's a 900% RAM upgrade.

Yes, you can, because your entire argument is that RAM capacity is bottlenecked by how much you can access in one frame. It doesn't make any sense that this argument only suddenly gains relevance when going from 256MB to 512MB.

big textures are something to be avoided, as they are the main performance killer on mobile systems; the working set has to fit within the memory bandwidth.

Your working set from a performance point of view for textures is the texture cache, not RAM. Of course, the texture cache is tiny and is much more useful for spatial locality than temporal locality. That said, loaded textures don't really define your working set. Larger textures mean more LODs, where detail is fetched from parts of the texture as you need it. Trying to stream in textures this way is a real pain in the ass and has definite overhead.


I would like to see what a company like say, Epic, thinks about 256MB vs 512MB of RAM using a bandwidth argument like the one you've presented. Epic Citadel owes a lot of its appeal to huge textures.

big levels can be streamed (they are in any decent game engine)


same for Minecraft-like games: you can (and should) stream compressed chunks in and out and use RAM as a cache/buffer for the real permanent level storage, the HDD (or SD card)

Not everything streams well, and a lot of game level design can be hampered by these restrictions. If you view RAM as a big cache, having more of it helps...


Which is great if you care about single-color voxel topographies, I guess.

not many games could really require over 256MB of RAM within the GPU/CPU/RAM bandwidth limitations of the OP hardware without there being simple enough solutions to make them fit.


especially not one made by an OP developer in his spare time; we're not talking Capcom- or Konami-sized time and money dev budgets here, anything else would fit.

Now we're getting somewhere. Maybe try saying this instead of the technical line of reasoning you tried in your post. Note that I've never been arguing that Pandora is suddenly going to see a bunch of new games that need the RAM.


But it will be able to run more bloated PC ports that won't necessarily be too slow for it (your argument).

even if someone somehow made a Mortal Kombat-style fighting game with >200MB of compressed animation for the selected characters (really o_O ??? think about how much video time that is in MPEG2 for a sprite!), you can stream the animation data off the SD card. All you need are the first few data blocks buffered and ready to display while you stream the rest, and an SD card has virtually no seek time, so you can easily stream 3 animations at the same time (characters 1 & 2 and the BG)

Yes, I agree that a 2D fighting game would be hard pressed to benefit from so much RAM. Just because I used it in an analogy doesn't mean I think it applies directly.
 
Guys/Gals,


All the technicality aside, if I were, for example, to start writing a python/pygame-based game where all I need to do is create the next frame and blit it to the screen a number of times per second, would I ever hit any of the technical limits mentioned in the above posts if I just adhered to the Pandora's resolution, number of colour bit-planes, etc.? Would I need such intimate knowledge of the underlying hardware and software to avoid hitting these limits, or can I just happily dump my graphical ideas into simpleton code and know that it would work each and every time?


I know that conversations like the above help dispel the notion that more mem = always better (at least in Pandora gaming graphics terms), but since I'm just your average high-level-language programming joe, do your explanations and counter-arguments need to be of any concern in MY low-level-ignoramus view of how to go about programming such a game? I'm not prepared to go into that much detail just to put something animated on the screen, as you can guess ;)
 
And one thing to keep in mind is your use case for the device.


It's always fun talking computers with non-technical people; ask them how much memory they need in their desktop or laptop, and it's always 'More! More!'. So you end up with people who 'need' 8GB of memory for web browsing. And writing essays in Microsoft Word. They'd easily get by with 4GB (or even 2GB). No one ever thinks of why they need more memory, they just want it (and when 4GB more of memory is so cheap these days, it is understandable).


Most (if not all) of the 400+ apps in the repo work quite well with 256MB of memory. All of the applications I've run on the Pandora work well in 256MB of memory. Heavy desktop use has been really the only case where more memory would be useful (having Abiword, a web browser and a music player all running at once). So just analyse your use to determine if the extra memory is beneficial or not to you.
 
Who is panicking? It hardly seems like it would end up limiting the device, and if anything it would have some benefit somewhere. Previously people have been loath to get into DSP-related stuff due to the RAM usage, but this will negate that. So inadvertently there may be some performance benefits if any of the coders go down that route.
 
Thanks to optimization we have programs which require only a few MB of storage, and the software itself should need just a few MB of RAM, certainly not all of it.


______________________________


>A wild application appears!


>>Developers use optimization.


>>>It's super effective!


______________________________


All in all, as people have already said - the major use for more RAM is in using the Pandora as a dedicated desktop.


Good for word processors, spreadsheets, web browsing, an audio DAW; not sure about GIMP-esque drawing or Blender, but those would certainly also benefit from more RAM. Mandelbrot fractals?
Which is more or less what I do, thus my interest in the upgrade...


I'm scratching at the RAM limit already and am forced at times to swap...


Getting rid of that to allow some proper GIMP use or some minimal letter writing, or simply to improve web browsing, would help.
 
Strange how specific MAME games don't load, yet when virtual memory is enabled they do load (Dragon Blaze, Strikers 1945 III), so I'd say it's a big thing. Today's handhelds should have at least 1GB of memory so users have peace of mind that future builds of other OSes and games will load. I'm pretty sure future Android and Linux builds will require said memory.
 
I'm pretty sure future Android and Linux builds will require said memory

Just speculating, but the kernel itself probably only needs a few MB to boot up, get modules ready, and maybe even start a shell session.


Anything beyond that is just for the user's GUI / WebKit / sound / lights and magic.


Android probably will "require"* GBs of memory, because they want to let people make huge multimedia applications with lots of bitmaps and HD video. I'm not sure how much the kernel / compositor / Dalvik / random background services use on their own, but it's probably not a whole lot.


* afaik the most important part of Android is the hardware specification that they hold the manufacturers to. The OS / GUI itself doesn't need all that RAM, but they have to demand it in case some user-installed application wants to use it.
 
Years ago, when RAM was much more expensive, it was not uncommon for lower- to middle-priced PCs to be sold with the minimum recommended amount or just above the requirement for Windows. Those people would come to someone like myself and ask why, once they loaded programs, antivirus etc., the computer began to crawl. They would be told they had too little RAM and that adding more would, and usually did, speed things up. So now when even budget PCs ship with 1-2GB of RAM, people who do not know a lot about computers but have had them before tend to want more RAM upfront, just to be sure they won't have to deal with what they considered to be a problem that was just a matter of time.


This is what I believe to be the reason someone would have 8GB for web browsing. It's overcompensation for sure, but they are happy anyway. For someone like myself who is somewhat knowledgeable about PCs, I find the above conversation of frame rates and RAM over my head as I don't write code. I just wanted to share my belief as to why the non-technical out there have the drive for excess RAM.


Heck, my neighbor across the street was wondering if he needed 4+GB of RAM for his PC that I am to build for him, and was shocked when I told him 1-2GB would be plenty. Then I had to explain why. He had just assumed that since everything else was bigger, a ton would be necessary just to play Windows' built-in games and web surf. Then again, I still frequently talk to people who think memory is the hard drive and say their new PC should have a larger memory because they want to put music on it... lol.
 
Heck, my neighbor across the street was wondering if he needed 4+GB of RAM for his PC that I am to build for him, and was shocked when I told him 1-2GB would be plenty.

Well, since Windows alone now takes up nearly a gig of RAM just sitting idle, add web browsing, some anti-virus software and perhaps some other useless programs that the guy thinks he needs running in the background, and the guy is most likely using well past one gig and closing in on or exceeding two gigs of system RAM, which causes the system to start swapping to the hard drive and slows performance. To me, four gigs of RAM is a safe minimum for system RAM in the current-day Windows environment.
 
I concur, with memory cost being so low, 4GB is what I'd go for a new box.


My primary computer at the moment is a three year old netbook with 2GB, and it's more than enough to run Debian + Gnome (or XP, though I had Win7 on it for a bit without issues (aside from some graphical issues with Diablo II and Starcraft, but I digress)). I'll usually take a GB to use for a memory filesystem for all sorts of web browser caching and the like.
 
Panic! I use the pandora as a full desktop, and as a mobile test platform for my electronics and microcontroller experiments. It usually has at least 3 programs running within 5 minutes after startup.


Really want that extra RAM, but can't really afford it atm (really can't is more accurate, probably)...


But then again, I was lucky enough to have received mine a year and a half ago already. That makes it easier to bear ;)
 
Currently I know exactly two programs of which I am 100% sure that they can make good use of the memory:


Firefox and Battle for Wesnoth. Yes, the memory is used basically for data structures, not for data required during each display refresh. There, the result of "not enough memory" is simply a crash due to "out of memory". Those should already happen less often once stuff like ZRam (a compressed area in RAM, resulting in effectively more RAM space being available for a slight impact on performance) is used (likely to happen in the .next firmware).
 
And NX client, which needs a bit more RAM than we have ;)


Okay, the discussion above is old, and someone grave-robbed it, but still..


The argument above is pretty odd .. the _size of RAM_ is not related to the _bandwidth_ at all, unless you're ripping through your entire RAM per frame to do something, which is really not the case; in a 2D title you're using a couple of framebuffers and updating them, so you're updating only fragments of the display at a time anyway (or at worst, the whole thing, which is not _that_ big.) In 3D, you're uploading lists mostly (textures are preloaded), so again, not that much.


Bandwidth is a limiting factor for sure, but it's not that harsh a limit (come on, as Exo was suggesting, we've been working with much less for most of history, by definition, right? :) ). It was not that long ago I was writing games where we couldn't do recursion (not enough stack, on a platform where we couldn't move the stack around very easily..); the Pandora is very easy to work with by comparison to most any mobile gaming device ever, and is relatively beefy (superior to most laptops from not that many years ago.) The full 800x480x2 screen is a lot of pixels to push in terms of full framebuffer writes, mind :)


But RAM size is really nothing to do with it.


Or am I missing something?


jeff
 
the bandwidth will limit the amount of memory you can access per frame, which in turn limits the size of textures, geometry data, and the data working set.


everything else is just data that is waiting to be used eventually (the rest of the level, sounds/music that aren't being played right now, etc) which with a proper caching system can be "swapped" out of memory and streamed back in by the game engine more efficiently than the OS can on its own.


you can also use mmap to load your game data files as read-only "swap" space; this lets the OS recycle the memory pages without first writing them to a swap file, since the data is already on disk, and it does not cause any wear on the flash medium.


this is much better than reading your data into regular read-write memory using ( malloc + read, or equivalent )
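
For example, a minimal sketch of mapping a data file read-only with mmap (the file name is just a placeholder):

/* Map a game data file read-only. Pages backed by the file can be
 * dropped by the kernel under memory pressure and re-read from the SD
 * card later, with no swap file and no flash writes. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    const char *path = "level01.dat";            /* placeholder file name */
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    const unsigned char *data =
        mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { perror("mmap"); return 1; }
    close(fd);                                    /* mapping stays valid */

    /* data[0 .. st.st_size-1] is now readable like a normal array */
    printf("mapped %lld bytes, first byte: %d\n",
           (long long)st.st_size, st.st_size > 0 ? (int)data[0] : 0);

    munmap((void *)data, st.st_size);
    return 0;
}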

In 3D, you're uploading lists mostly (textures are preloaded), so again, not that much.

those "preloaded" textures still have to be accessed by the GPU, and in the case of the OMAP3 this is done from main RAM, there is no real "VRAM" its just an area of regular memory that's reserved for GPU use.


the GPU reading those textures to draw the frame eats up the same bandwidth as the CPU reading data, so you have to limit the texture sizes for performance reasons = use less memory.


textures and geometry (along with audio samples) are usually what eats up the most RAM in a game, and in this case we're limited in what we can use per frame


the rest does not need to be held permanently in ram, it can, and should be streamed off the permanent storage as needed.


if games can do this off a DVD drive with horrible latency, we can certainly do that off an SD card with no seek delays and at least 12MB/s.


you have an active zone corresponding to the (2D screen area / 3D camera frustum) around the player/camera


a ready zone that's appropriately sized around the (2D screen area / 3D camera frustum) ready to show data if the player moves/turns around


and a preload zone around the ready zone that's being prepared by a background thread.


the rest is kept out of memory or cached compressed in RAM.


same for audio:


you have the first second or two of soon-to-be-used samples already decompressed


the next ~10 seconds of the samples held compressed in ram


the rest is left sitting on the media.


with proper programming you only need to keep in RAM whatever you cannot stream off the permanent storage in time to be used.
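
A minimal sketch of that zone classification, with made-up radii:

/* Classify a chunk by distance from the camera into the zones described
 * above. The radii are made-up numbers, not from any particular engine. */
#include <math.h>

enum zone { ZONE_ACTIVE, ZONE_READY, ZONE_PRELOAD, ZONE_OUT };

enum zone classify_chunk(float cam_x, float cam_y,
                         float chunk_x, float chunk_y)
{
    const float active_r  = 32.0f;   /* on screen / in the frustum      */
    const float ready_r   = 64.0f;   /* decompressed, ready to show     */
    const float preload_r = 96.0f;   /* background thread streams it in */

    float dx = chunk_x - cam_x;
    float dy = chunk_y - cam_y;
    float d  = sqrtf(dx * dx + dy * dy);

    if (d <= active_r)  return ZONE_ACTIVE;
    if (d <= ready_r)   return ZONE_READY;
    if (d <= preload_r) return ZONE_PRELOAD;
    return ZONE_OUT;    /* left on storage or kept compressed in RAM */
}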
 