Emulator Optimization


Prometheus said:
I'm only not using GPFCE, which is, I'm guessing, far more optimised, because I don't much like the non-toggleable 2xSaI-style graphical filter that it currently has. :p Once that's gone, I will drop NesEmu in a heartbeat, because it's given me a ton of hassle.
Just disable the filter before running with
Code:
sudo /usr/pandora/scripts/op_videofir.sh none
If you know how to edit .pnds you can add that in the startup script.
 
I thought that was for disabling the so-called "blurry" one, such as is used by GINGE by default? The one GPFCE is using isn't that. Or will that work for that one as well?
 
Yeah, the two filters have nothing to do with each other. GPFCE is forcing a soft 2xSai-like filter that I personally don't enjoy.

WizardStan said:
I don't see how the two are mutually exclusive. The NES emulator probably doesn't have a dynamic recompiler for the Pandora yet, which means it is doing emulation in the slowest possible way. Nothing magic about the hardware is going to change that, just someone taking the time to write an ARM recompiler.

No decent NES emulator has or needs a recompiler. Even on a GBA NES emulation doesn't need dynamic recompilation (not that there's really room for it), and that's just 16.7MHz ARM7. The time it takes to emulate video w/o skipping frames is probably a lot more than the time it takes to emulate a 6502 @ 1.8MHz with even a moderately optimized interpreter. Games probably typically push under 400,000 instructions per second. You'd be much better off optimizing video as much as possible first, but if you did manage to do that then chances are the difference between interpreter and recompiler would be the difference between skipping no frames a second and skipping a few frames a second. Not worth it.

Meanwhile you'd probably be trading in timing precision that a lot of NES games do need, since most recompilers execute entire blocks unconditionally until a branch is hit, and thus can't be interrupted at a finer grain. You could do an emulator that has more precise timing, but you'd lose the ability to perform a lot of optimization and you'd increase emitted code size a lot, quite possibly to the point where the damage to instruction cache exceeds what's gained by not interpreting. There are some possible hybrid approaches but I haven't seen anyone try them.
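To put the granularity point in concrete terms: a plain interpreter naturally stops between instructions, so running the CPU in scanline-sized slices is trivial. A minimal sketch of that kind of main loop (hypothetical names, nothing from FCEU/gpfce, the 6502 core reduced to a stub):

Code:
/* Minimal sketch of an interpreter-style main loop with scanline-granular
 * timing.  Purely illustrative: the CPU core is a stub, and all names
 * (cpu_step, render_scanline, etc.) are hypothetical, not from gpfce/FCEU. */
#include <stdint.h>
#include <stdio.h>

#define CYCLES_PER_SCANLINE 114   /* ~1.79 MHz / (262 lines * 60 Hz) */
#define SCANLINES_PER_FRAME 262

typedef struct { uint16_t pc; int32_t cycles; } cpu_t;

/* Stub: fetch one "opcode" and charge its cycle cost.  A real 6502
 * interpreter would switch on the opcode and update registers/flags. */
static int cpu_step(cpu_t *cpu)
{
    cpu->pc++;                    /* pretend we executed something */
    return 3;                     /* average-ish 6502 instruction cost */
}

static void render_scanline(int line) { (void)line; /* draw BG + sprites */ }

int main(void)
{
    cpu_t cpu = { 0x8000, 0 };

    for (int frame = 0; frame < 60; frame++) {
        for (int line = 0; line < SCANLINES_PER_FRAME; line++) {
            /* Run the CPU for exactly one scanline's worth of cycles.
             * Because we stop between instructions, mid-frame events
             * (sprite 0 hit, IRQs, bank switches) land on the right line. */
            cpu.cycles += CYCLES_PER_SCANLINE;
            while (cpu.cycles > 0)
                cpu.cycles -= cpu_step(&cpu);

            if (line < 240)
                render_scanline(line);
            /* line 241: a real emulator would raise the vblank NMI here */
        }
    }
    printf("simulated 60 frames, pc=%04x\n", cpu.pc);
    return 0;
}

A block-based recompiler can only check for events at block boundaries, which is exactly the precision being traded away.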
 
Prometheus said:
I thought that was for disabling the so-called "blurry" one, such as is used by GINGE by default? The one GPFCE is using isn't that. Or will that work for that one as well?
Ah, so you meant the native gpfce port; the GP2X version under GINGE shouldn't be forcing any filters.
 
Exophase said:
No decent NES emulator has or needs a recompiler.
Well, there's this one... https://tuxnes.sourceforge.net

x86 only though

Exophase said:
Meanwhile you'd probably be trading in timing precision that a lot of NES games do need, since most recompilers execute entire blocks unconditionally until a branch is hit, and thus can't be interrupted at a finer grain.
That's what it seems to be doing. Which games need exact timing precision?
 
Ari64 said:
Well, there's this one... https://tuxnes.sourceforge.net

x86 only though

Yeah I know of it, and it was probably harsh to call it sub-decent, but these days pretty bog standard emus coming off of the NESdev production line can easily surpass it, it's just at a big natural disadvantage for no real gain.

Ari64 said:
That's what it seems to be doing. Which games need exact timing precision?

I can't tell you, but it certainly wasn't unheard of for games to count cycles to hit scanlines. What I heard, I believe from Neko, is that the compatibility is around 90% if not including games with off by one bugs. I could be off on that though, if he's reading he can tell us better (or he can tell us if I have the wrong person altogether)
 
Exophase said:
I can't tell you, but it certainly wasn't unheard of for games to count cycles to hit scanlines. What I heard, I believe from Neko, is that the compatibility is around 90% if not including games with off by one bugs. I could be off on that though, if he's reading he can tell us better (or he can tell us if I have the wrong person altogether)
This emulator has a tile-based renderer which is fast but inaccurate, and a scanline renderer which is about the same speed as fceu. I don't remember exactly which games were glitchy with the tile-based renderer.

The reason it could run on <200MHz CPUs is because it could drop frames to keep up. Due to the dynarec, the 6502 emulation always ran full speed.
 
Neko said:
This emulator has a tile-based renderer which is fast but inaccurate, and a scanline renderer which is about the same speed as fceu. I don't remember exactly which games were glitchy with the tile-based renderer.

The reason it could run on <200MHz CPUs is because it could drop frames to keep up. Due to the dynarec, the 6502 emulation always ran full speed.

From what I recall you saying, the glitches were from the recompiler; if it's from the renderer then that changes things quite a bit.

No offense to the emulator and its creators but these things don't sound especially impressive, several other emulators could do that without recompilers. You could emulate 6502 with an x86 interpreter on a low-end 486.

You also shouldn't need > 200MHz x86 to render NES with a scanline renderer. Having a tile-based renderer is usually a good tip-off that the scanline renderer isn't that optimized, because in emulators that have both, the scanline renderer tends to be dramatically slower than the tile-based one. At the very least it should do some buffering to be able to support hybrid tile/scanline rendering.

Seems like the goal was more in making a 6502 recompiler because it was interesting, not so much in making an especially fast NES emulator. Which is fine, of course. Then again, it looks like it's much faster than the Pandora NesEmu port is..
 
Exophase said:
From what I recall you saying, the glitches were from the recompiler; if it's from the renderer then that changes things quite a bit.
This is something that I played with ten years ago. I remember a few games had problems, but I've lost track of exactly what they were.


Exophase said:
No offense to the emulator and its creators but these things don't sound especially impressive, several other emulators could do that without recompilers. You could emulate 6502 with an x86 interpreter on a low-end 486.
There were emulators that could run on a low-end 486. Although not recompilers, these were written in assembly code and not C. I don't think any of these old emulators are comparable to pure C code emulators like fce or nesemu.

People often say, "Why is this so slow, I used to run x on a 486!" but they forget just how buggy and limited that old software actually was. Also it didn't run at 60fps.
 
Neko said:
Exophase said:
No offense to the emulator and its creators but these things don't sound especially impressive, several other emulators could do that without recompilers. You could emulate 6502 with an x86 interpreter on a low-end 486.
There were emulators that could run on a low-end 486. Although not recompilers, these were written in assembly code and not C. I don't think any of these old emulators are comparable to pure C code emulators like fce or nesemu.
Now I'm going to be a bit rude and offensive, but this is exactly what I said to start with. Coders aren't spending the time to do it the right way, and we end up with hackjobs that get the work done in a crude and inefficient way. As grateful as we might be that we get software that runs, it is just a "good enough" project, and nowhere near what it has the potential to become. On the other hand, as I said, I doubt we will get to see that much software that takes advantage of the hardware platform and codes for it directly. Most will code for the software layer, and be pleased with the outcome of their efforts. However, that is why we won't see the performance that could be had... Sorry, I didn't intend to rub any developer the wrong way, but that's the way I see it.
B!
 
You're talking about an NES emulator that emits x86 instructions, I think using an x86 interpreter or even x86 rendering code is pretty fair game for comparison. Now, I did say "could emulate 6502." When it comes to rendering video without skipping frames you need something quite a bit more capable, but not surpassing 200MHz x86, not if you have an efficient design.

It's true emulators are slower because they're more accurate and more portable, and also because they're often saddled with slower interfaces to video and audio. But a lot of them are also just not as optimized; most aren't really designed with that level of performance in mind since it really isn't necessary. Contrast with gpfce on GP2X, which does fine at well under 200MHz for ARM9, and I don't think I need to tell you that the GP2X's CPU/memory isn't that impressive compared to a typical 200MHz x86 machine with a superscalar Pentium and on-motherboard L2 cache.

So yes, I think you can manage NES at a typical 200MHz x86 without skipping frames, with rendering code written in C. At 61440 pixels on the screen at 60Hz that's 3.69 million pixels per second. With 150MHz available for video that's a good 40 cycles per pixel, for a background that usually has attribute changes per 16 pixels and can be drawn several pixels at a time, sprites that take up at most 1/4th of the line, and video data that's pretty cache friendly. No, it's not trivial, but it can surely be done if you work at it.
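To illustrate where the cycles go, here's a rough sketch of a background scanline loop (hypothetical and heavily simplified: no fine scroll, a single name table, palette lookup folded into one small table). The name table, attribute and pattern fetches happen once per 8-pixel tile, so the per-pixel inner work is little more than a couple of shifts and a store.

Code:
/* Sketch of a background scanline renderer (hypothetical, simplified).
 * The point is the cost structure: name/attribute/pattern fetches happen
 * once per 8-pixel tile, so per-pixel work is a shift, a mask and a store. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

static uint8_t nametable[0x400];     /* 32x30 tile indices + attributes */
static uint8_t pattern[0x1000];      /* CHR pattern table, 16 bytes/tile */
static uint8_t palette[4][4];        /* BG palettes, final colour indices */

static void draw_bg_scanline(uint8_t *dst, int line)
{
    int row = line >> 3, fine_y = line & 7;

    for (int tile = 0; tile < 32; tile++) {
        uint8_t index = nametable[row * 32 + tile];
        /* one attribute byte covers a 32x32 area; 2 bits per 16x16 quadrant */
        uint8_t attr  = nametable[0x3C0 + (row >> 2) * 8 + (tile >> 2)];
        uint8_t pal   = (attr >> (((row & 2) << 1) | (tile & 2))) & 3;
        uint8_t lo    = pattern[index * 16 + fine_y];
        uint8_t hi    = pattern[index * 16 + fine_y + 8];

        for (int x = 0; x < 8; x++) {          /* 8 pixels per fetch */
            int bit = 7 - x;
            int px  = ((lo >> bit) & 1) | (((hi >> bit) & 1) << 1);
            dst[tile * 8 + x] = palette[pal][px];
        }
    }
}

int main(void)
{
    uint8_t fb[240][256];
    memset(pattern, 0xAA, sizeof(pattern));    /* dummy CHR data */
    for (int line = 0; line < 240; line++)
        draw_bg_scanline(fb[line], line);
    printf("first pixel: %u\n", fb[0][0]);
    return 0;
}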
 
Mr B said:
Neko said:
There were emulators that could run on a low-end 486. Although not recompilers, these were written in assembly code and not C. I don't think any of these old emulators are comparable to pure C code emulators like fce or nesemu.
Now I'm going to be a bit rude and offensive, but this is exactly what I said to start with. Coders aren't spending the time to do it the right way, and we end up with hackjobs that get the work done in a crude and inefficient way. As grateful as we might be that we get software that runs, it is just a "good enough" project, and nowhere near what it has the potential to become. On the other hand, as I said, I doubt we will get to see that much software that takes advantage of the hardware platform and codes for it directly. Most will code for the software layer, and be pleased with the outcome of their efforts. However, that is why we won't see the performance that could be had... Sorry, I didn't intend to rub any developer the wrong way, but that's the way I see it.
B!
Well, you could make a NES emulator with a dynarec, possibly at the cost of some compatibility. You could optimize the renderer by rendering 8bpp and doing palette-switching hacks (if the hardware supports that). You could remove the mmc2/punchout stuff from the rendering code so it doesn't slow down the other games. You could do a tile-based renderer that looks like PocketNES on GBA. You could remove Game Genie support to make memory accesses faster.

And then you would have a 1990s-style emulator that has glitches and missing features, but could run at 200MHz.
 
In fact, you can do an NES emulator with very high compatibility that runs on an interpreter and on a 200MHz x86 without frameskip. It doesn't need a tile-based renderer, if it has an intelligently written scanline renderer with sprite binning. It doesn't need to be 8bpp, in fact you don't gain an awful lot in doing it this way. MMC2 affects state between scanlines, it's hardly a big deal for a scanline renderer.
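"Sprite binning" here just means sorting the 64 OAM entries into per-scanline buckets once per frame, so each line only walks the sprites that actually cross it. A rough sketch of the idea (hypothetical code, 8x8 sprites only, ignoring OAM quirks):

Code:
/* Sketch of per-scanline sprite binning (hypothetical, 8x8 sprites only).
 * Done once per frame, it lets the scanline renderer walk just the sprites
 * that overlap each line instead of scanning all 64 OAM entries per line. */
#include <stdint.h>
#include <stdio.h>

#define NUM_SPRITES  64
#define LINES        240
#define MAX_PER_LINE 8            /* hardware limit per scanline */

typedef struct { uint8_t y, tile, attr, x; } oam_entry;

static oam_entry oam[NUM_SPRITES];
static uint8_t   bin[LINES][MAX_PER_LINE]; /* sprite indices per line */
static uint8_t   bin_count[LINES];

static void bin_sprites(void)
{
    for (int l = 0; l < LINES; l++)
        bin_count[l] = 0;

    for (int s = 0; s < NUM_SPRITES; s++) {
        for (int dy = 0; dy < 8; dy++) {      /* lines this sprite covers */
            int line = oam[s].y + dy;
            if (line < LINES && bin_count[line] < MAX_PER_LINE)
                bin[line][bin_count[line]++] = (uint8_t)s;
        }
    }
}

int main(void)
{
    for (int s = 0; s < NUM_SPRITES; s++)     /* dummy OAM data */
        oam[s].y = (uint8_t)(s * 3);

    bin_sprites();

    /* A scanline renderer would then do:
     *   for (i = 0; i < bin_count[line]; i++) draw_sprite(bin[line][i], line);
     * instead of testing all 64 sprites on every line. */
    printf("sprites on line 100: %d\n", bin_count[100]);
    return 0;
}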

Is gpfce really not good enough evidence for you?

BTW, PocketNES is fully capable of scanline-based effects, I'm not sure why you brought it up.

Also think about how the Game Genie works: a cartridge on a Nintendo system can't block RAM accesses; the RAM in the system is going to drive the bus whether the game likes it or not. The Game Genie just involves spoofing data on the cart, which normally means patching the ROM. This may include having to intercept accesses to on-cart WRAM and VRAM, but those aren't going to be hit like the internal RAM is, your zero page isn't going to be there, so it isn't going to be especially expensive to deal with.

This stuff about NES emulators with recompilers is over the top, I don't know why you keep repeating it, because I know you know better and you're perpetuating a misunderstanding WizardStan had. None of the fastest NES emulators ever employed recompilers!
 
notaz said:
Prometheus said:
I'm only not using GPFCE, which is, I'm guessing, far more optimised, because I don't much like the non-toggleable 2xSaI-style graphical filter that it currently has. :p Once that's gone, I will drop NesEmu in a heartbeat, because it's given me a ton of hassle.
Just disable the filter before running with
Code:
sudo /usr/pandora/scripts/op_videofir.sh none
If you know how to edit .pnds you can add that in the startup script.
How do you edit pnds?

I did it before to change category names but that was it.
 
DaveC said:
How do you edit pnds?

I did it before to change category names but that was it.
They're just disk images: mount, copy data, edit, make new image.

mksquashfs is pretty easy to use.
 
Exophase said:
It doesn't need to be 8bpp, in fact you don't gain an awful lot in doing it this way. MMC2 affects state between scanlines, it's hardly a big deal for a scanline renderer.

Is gpfce really not good enough evidence for you?
The 8bpp thing is so it could just change the palette without having to redraw all the tiles. Some emulators did this, and it didn't work for games that changed the palette on a per-scanline basis. MMC2 changed the mapping in the middle of a scanline, so every byte has to be tested. Maybe not huge, but slows things down.
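For reference, the trick being described looks roughly like this (hypothetical helper names, assuming an 8bpp indexed display mode): the framebuffer holds NES palette-RAM indices, so a palette write only reprograms the display's colour lookup table instead of redrawing tiles - which is exactly why per-scanline palette changes break it.

Code:
/* Sketch of the 8bpp "palette switching" trick (hypothetical helper names;
 * assumes an 8bpp indexed display mode).  The framebuffer stores NES
 * palette-RAM indices, so a palette write only updates the display's CLUT,
 * not the rendered tiles.  Breaks for mid-frame palette changes. */
#include <stdint.h>
#include <stdio.h>

static uint8_t framebuffer[240][256]; /* NES palette-RAM indices 0..31 */
static uint8_t nes_palette_ram[32];   /* values written to $3F00-$3F1F */

/* Hypothetical platform hook: program one entry of the hardware CLUT. */
static void hw_set_palette_entry(int slot, uint8_t nes_colour)
{
    printf("CLUT[%d] <- NES colour %02X\n", slot, nes_colour);
}

/* Called when the game writes PPU palette RAM: no tile redraw needed,
 * because the framebuffer only holds indices into this table. */
static void write_palette_ram(int addr, uint8_t value)
{
    nes_palette_ram[addr & 0x1F] = value;
    hw_set_palette_entry(addr & 0x1F, value);
}

int main(void)
{
    framebuffer[0][0] = 0x01;          /* pixel refers to palette slot 1 */
    write_palette_ram(0x01, 0x16);     /* recolours it with no redraw    */
    printf("pixel (0,0) refers to CLUT slot %u\n", framebuffer[0][0]);
    return 0;
}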

And by the way, have you actually tried running Mike Tyson's Punch-Out in gpfce? Guess why there is the "Accurate renderer (slow)" option.

Exophase said:
Also think about how the Game Genie works: a cartridge on a Nintendo system can't block RAM accesses; the RAM in the system is going to drive the bus whether the game likes it or not. The Game Genie just involves spoofing data on the cart, which normally means patching the ROM. This may include having to intercept accesses to on-cart WRAM and VRAM, but those aren't going to be hit like the internal RAM is, your zero page isn't going to be there, so it isn't going to be especially expensive to deal with.
The problem with the Game Genie is that you don't know in advance where parts of the ROM are going to be mapped. So you need to check every rom read, or patch those locations whenever the mapping changes. For many mappers, it's possible to patch every byte in the ROM that's likely to be mapped to that address, but this is not 100% accurate.

If I'm not mistaken, this is exactly why Game Genie doesn't work in gpfce.

Exophase said:
This stuff about NES emulators with recompilers is over the top, I don't know why you keep repeating it, because I know you know better and you're perpetuating a misunderstanding WizardStan had.
WizardStan implied that a dynamic recompiler would make nesemu much faster. I know that there are other factors involved, and just speeding up the 6502 emulation wouldn't achieve the results he was hoping for. I only brought this up as one of several possibilities which all have disadvantages.
 
Neko said:
The 8bpp thing is so it could just change the palette without having to redraw all the tiles. Some emulators did this, and it didn't work for games that changed the palette on a per-scanline basis.

Okay, I thought you meant going to 8bpp mapping the post-lookup palette. Saving a palette lookup per-pixel is pretty minor for something of this resolution/rate. It isn't even close to necessary to get decent performance at 200MHz x86.

Neko said:
MMC2 changed the mapping in the middle of a scanline, so every byte has to be tested. Maybe not huge, but slows things down.

And by the way, have you actually tried running Mike Tyson's Punch-Out in gpfce? Guess why there is the "Accurate renderer (slow)" option.

It's just the 2 tile bytes between tiles (each 8 pixels) and of course only when MMC2 is present - if it's really a huge deal you can cache attributes and track changes in a bitmask which I doubt the game will invalidate all the time (certainly not on a per-line basis), but I really doubt this would be necessary. This is just a slightly different means of attribute tile address decoding.

I haven't looked at gpfce's renderers but I wouldn't be surprised if it was like PicoDrive's, which only did scanline in the "accurate" renderer - but when I suggested notaz do sprite binning he found that the accurate renderer was no longer that much slower than the normal one. It might be labeled "slow" here but I wonder how slow it really is. Tile-based rendering is not necessary for fast NES emulation.

Neko said:
The problem with the Game Genie is that you don't know in advance where parts of the ROM are going to be mapped. So you need to check every rom read, or patch those locations whenever the mapping changes. For many mappers, it's possible to patch every byte in the ROM that's likely to be mapped to that address, but this is not 100% accurate.

Patching the locations whenever the mapping changes has got to be orders of magnitude faster than checking every ROM read in any sane use case with any sane game, and if the game goes insane you can cache ROM patchings....
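A rough sketch of that "re-patch on mapping change" approach (hypothetical structures, simple codes without compare values, not gpfce's actual code): each patch remembers the original byte of whatever ROM page is currently mapped at its CPU address, and a bank switch just restores and re-applies.

Code:
/* Sketch of applying Game Genie codes by patching the mapped ROM pages
 * whenever the mapper changes banks, instead of checking every ROM read.
 * All structures/names are hypothetical; real mappers have more pages. */
#include <stdint.h>
#include <stdio.h>

#define PRG_BANKS 8
#define BANK_SIZE 0x2000                   /* 8K pages at $8000/$A000/... */

typedef struct { uint16_t cpu_addr; uint8_t value, saved; } gg_patch;

static uint8_t  prg_rom[PRG_BANKS][BANK_SIZE];
static uint8_t *mapped[4];                 /* pages at $8000,$A000,$C000,$E000 */
static gg_patch patches[] = { { 0x9123, 0xEA, 0 } };  /* dummy code */
#define NPATCH (sizeof(patches) / sizeof(patches[0]))

static void apply_patches(void)
{
    for (unsigned i = 0; i < NPATCH; i++) {
        int page = (patches[i].cpu_addr - 0x8000) / BANK_SIZE;
        int off  = (patches[i].cpu_addr - 0x8000) % BANK_SIZE;
        patches[i].saved  = mapped[page][off];   /* remember original byte */
        mapped[page][off] = patches[i].value;    /* spoof the cart data    */
    }
}

static void remove_patches(void)
{
    for (unsigned i = 0; i < NPATCH; i++) {
        int page = (patches[i].cpu_addr - 0x8000) / BANK_SIZE;
        int off  = (patches[i].cpu_addr - 0x8000) % BANK_SIZE;
        mapped[page][off] = patches[i].saved;
    }
}

/* Mapper bank switch: un-patch, remap, re-patch.  Bank switches are rare
 * compared to ROM reads, so this is far cheaper than testing every fetch. */
static void switch_bank(int page, int bank)
{
    remove_patches();
    mapped[page] = prg_rom[bank];
    apply_patches();
}

int main(void)
{
    for (int p = 0; p < 4; p++)
        mapped[p] = prg_rom[p];
    apply_patches();
    switch_bank(0, 5);                      /* game swaps a new bank in */
    printf("byte at $9123 is now %02X\n", mapped[0][0x1123]);
    return 0;
}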

Neko said:
WizardStan implied that a dynamic recompiler would make nesemu much faster. I know that there are other factors involved, and just speeding up the 6502 emulation wouldn't achieve the results he was hoping for. I only brought this up as one of several possibilities which all have disadvantages.

None of which are necessary either to make NES emulation fast enough on a Pandora (or a 200MHz x86).
 
Exophase said:
I haven't looked at gpfce's renderers but I wouldn't be surprised if it was like PicoDrive's, which only did scanline in the "accurate" renderer - but when I suggested notaz do sprite binning he found that the accurate renderer was no longer that much slower than the normal one. It might be labeled "slow" here but I wonder how slow it really is. Tile-based rendering is not necessary for fast NES emulation.

It's not quite 60 fps on the Pandora at 600MHz. With frameskip enabled it will drop a few frames, it's not bad.

The problem is that people are whining that NES emulation on the Pandora isn't perfect, and making comparisons to emulators on GP2X, GBA, and old x86 CPUs. None of those emulators were perfect, and in many ways were much worse than what we have today. People have a very distorted view of history.
 
Neko said:
It's not quite 60 fps on the Pandora at 600MHz. With frameskip enabled it will drop a few frames, it's not bad.

I meant how fast on, say, GP2X and Wiz - looking at the Pandora version isn't that fair since it's a fixed port with mandatory 2xSaI-like scaling.

Neko said:
The problem is that people are whining that NES emulation on the Pandora isn't perfect, and making comparisons to emulators on GP2X, GBA, and old x86 CPUs. None of those emulators were perfect, and in many ways were much worse than what we have today. People have a very distorted view of history.

The truth is somewhere in between what they're saying and what you're saying, i.e. between better emulators and slower emulators. The NesEmu port seems to fall more under the latter category than the former. To really get great accuracy we'd be looking at different requirements, but we're not talking about that, just enough to get "most" games and features working correctly. If your limitations are "scanline renderer" and "per-opcode timing granularity" then you can definitely get full speed on a < 200MHz x86 without frameskipping and with compatibility otherwise as good as those two limitations allow. If you're willing to put extra work into detecting mid-scanline and mid-opcode conditions then you could probably get an emulator with those things pretty fast too, under normal conditions.

With all the people who have done NES emulators and worked out decent techniques over and over I'm sure the fastest being done now is a lot faster than you think.
 
Exophase said:
I haven't looked at gpfce's renderers but I wouldn't be surprised if it was like PicoDrive's, which only did scanline in the "accurate" renderer - but when I suggested notaz do sprite binning he found that the accurate renderer was no longer that much slower than the normal one. It might be labeled "slow" here but I wonder how slow it really is. Tile-based rendering is not necessary for fast NES emulation.
From what I remember (it has been many years) both renderers are line based, and the "slow" one is stock FCEU renderer that handles mid-line changes, like ROM bank switches and such. Punch-Out is buggered due to some unrelated hack, I think you just need to set "first/last visible line" (or something like that) somewhere in menus and it will work fine with "fast" renderer (or just use EU version).
 