"iGame" V10 Android Game Console


7) Two nubs, each with its own controller. I'm sure any Chinese handheld with analog inputs has them going straight to the SoC.

And both needing Atmels, which aren't cheap either.


yes, it's definitely overkill; this could be a cost-saving trick for P2 :)


analog nubs don't need a high sampling rate; anything above 20Hz is more than good enough, especially if it's pre-filtered in hardware to something close to (but above) that frequency.
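
For scale, here's what a single RC pole doing that kind of pre-filtering works out to; the component values are made up for illustration, this is just the standard first-order cutoff formula:

```c
#include <stdio.h>

/* First-order RC low-pass cutoff: f_c = 1 / (2 * pi * R * C).
   Example (made-up) values: R = 68k, C = 100nF gives roughly 23Hz. */
int main(void)
{
    const double R = 68e3;    /* ohms */
    const double C = 100e-9;  /* farads */
    printf("f_c = %.1f Hz\n", 1.0 / (2.0 * 3.14159265358979 * R * C));
    return 0;
}
```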


for speedy, twitchy games like 2D fighters you've got the digital D-pad (even though analog at >= 20Hz still works fine)

OPT doesn't have the source code; it's a proprietary blob from the nub company.

which we don't need.


reading analog joysticks isn't that hard; ready-made analog-only controls are sold everywhere, and all you need is some form of ADC and a small lookup table in the driver if they're not linear.
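
To illustrate, a minimal sketch of such a driver, assuming a platform-provided read_adc() (hypothetical name) that returns 10-bit samples; the table values are invented, standing in for real calibration data:

```c
#include <stdint.h>

/* Hypothetical platform call: returns a raw 10-bit sample (0..1023)
   for the given ADC channel. Swap in whatever your SoC/MCU provides. */
extern uint16_t read_adc(int channel);

/* Small linearization table: maps 9 evenly spaced raw readings to a
   corrected axis value. Real values would come from calibration. */
static const int16_t lin_table[9] = {
    -128, -90, -48, -16, 0, 16, 48, 90, 127
};

/* Convert a raw reading to a signed axis value (-128..127), with
   linear interpolation between table entries. */
int16_t nub_axis(int channel)
{
    uint16_t raw  = read_adc(channel);      /* 0..1023 */
    uint32_t pos  = (uint32_t)raw * 8;      /* scale into table space */
    uint32_t idx  = pos >> 10;              /* which segment (0..7) */
    uint32_t frac = pos & 0x3FF;            /* position within segment */
    int32_t  a    = lin_table[idx];
    int32_t  b    = lin_table[idx + 1];
    return (int16_t)(a + (((b - a) * (int32_t)frac) >> 10));
}
```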


the pandora nubs are over-engineered.


the cheapest solution is to use a 555 or similar timer, as was done on PCs, plus "horrible" software polling.
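
For reference, the PC gameport trick being alluded to: the stick's potentiometer sets the RC time constant of a one-shot timer, and software triggers it and busy-counts until the output drops. A rough sketch, assuming hypothetical inb()/outb() port accessors and the classic 0x201 gameport address:

```c
#include <stdint.h>

/* Hypothetical raw port I/O; on a PC this was in/out on port 0x201. */
extern uint8_t inb(uint16_t port);
extern void    outb(uint16_t port, uint8_t value);

#define GAMEPORT 0x201
#define TIMEOUT  10000

/* Read one axis (bits 0-3 are the four axes): any write fires the
   one-shot, whose pulse width tracks the pot's resistance; count
   polls until the bit drops. */
uint32_t gameport_read_axis(uint8_t axis_bit)
{
    uint32_t count = 0;
    outb(GAMEPORT, 0xFF);                    /* trigger the timer */
    while ((inb(GAMEPORT) & axis_bit) && count < TIMEOUT)
        count++;                             /* "horrible" software polling */
    return count;                            /* roughly proportional to position */
}
```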


on the pandora, you may even use the DSP for this mostly-software ADC solution and leave the main CPU free.
 

My rule of thumb: if you need to buy an additional multi-channel ADC for an apps processor and it doesn't have to be high resolution (>12-bit) or high speed (>1MHz), then you're best off just buying a microcontroller. It's a lot more flexible and often cheaper. They're really not expensive. Even Cortex-M3s and Cortex-M0s are available for less than a dollar now.


On Pandora the PMIC has extra ADC channels, since they're needed for the touchscreen controller. I don't know if it has enough available, though; for nubs you need at least four.


I think the external sampling rate should be at least 60Hz, which means the frequency response of this output must be completely flat up to 30Hz. The problem with using a really low sampling rate and relying on an analog anti-aliasing filter is that a cheap first-order filter has a very gradual roll-off that'll give you much less bandwidth than half the sampling rate (depending on the stop-band attenuation you need). Better filters need active components (op-amps), from one to several per channel depending on how good you want the filter to be. It's going to be really easy to once again raise the price and board complexity beyond what you'd get using a microcontroller that samples at a very high rate and performs digital filtering. I've personally filtered 8 channels sampled at 16kHz with a 255th-order FIR on a little 32MHz STM32F (a part that's four years old now). You don't need that much power to do a decent amount of filtering, but you do need a steady input stream at a high sampling rate. You may be able to rig something like the OMAP3530 to DMA at a high rate from something like its PMIC's ADC, I don't know, but it sounds like a pain.
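
To make the digital-filtering half concrete, here's the shape of a plain Q15 FIR convolution over a circular sample buffer, one instance per channel. The coefficients are placeholders a filter-design tool would fill in, and on a real Cortex-M you'd likely reach for the optimized routines in a library like CMSIS-DSP instead:

```c
#include <stdint.h>

#define NTAPS 256   /* a 255th-order FIR has 256 taps */

/* Q15 coefficients from a filter-design tool (e.g. a windowed-sinc
   lowpass). The values here are placeholders. */
static const int16_t coeff[NTAPS] = { /* ...from design tool... */ 0 };

/* Circular history of the last NTAPS raw samples for one channel. */
static int16_t history[NTAPS];
static unsigned head;

/* Push one new sample, return the filtered output: multiply-accumulate
   across the whole history, then shift back out of Q15. */
int16_t fir_step(int16_t sample)
{
    history[head] = sample;
    head = (head + 1) % NTAPS;   /* head now points at the oldest sample */

    int64_t acc = 0;             /* wide accumulator for 256 Q15 products */
    unsigned idx = head;
    for (unsigned i = 0; i < NTAPS; i++) {
        acc += (int32_t)coeff[i] * history[idx];
        idx = (idx + 1) % NTAPS;
    }
    return (int16_t)(acc >> 15);
}
```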


Personally I'd have a single microcontroller that handles all inputs, including the touchscreen, because Pandora already has problems with noise on its touchscreen lines (although I guess this will depend on exactly what hardware is available for whatever touchscreen controller is used). The microcontroller handles sampling and calibration and is of course fully program-modifiable. It sends single packets for the complete input state, and it has (programmable) thresholds such that it only sends these updates when the state has changed enough. It could of course also buffer events and only hand them over when the program running on the SoC requests input. Either way, when your Pandora is sitting there doing nothing but handling really low-frequency events, the CPU doesn't have to be woken up constantly to deal with interrupts.
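
A sketch of what that could look like on the microcontroller side; the packet layout, names, and threshold value are all invented for illustration:

```c
#include <stdint.h>
#include <stdlib.h>

/* One packet carrying the complete input state. */
struct input_report {
    int16_t  nub[4];           /* left X/Y, right X/Y */
    uint16_t touch_x, touch_y;
    uint32_t buttons;          /* one bit per key/button */
};

/* Programmable dead-band: how far an axis must move before we
   bother the SoC with a new packet. */
static int16_t nub_threshold = 8;

static struct input_report last_sent;

/* Hypothetical transport: push the packet over SPI/I2C, or queue it
   until the SoC asks for input. */
extern void send_report(const struct input_report *r);

/* Called after each sampling/filtering pass; only transmits (and
   wakes the SoC) when the state changed beyond the thresholds. */
void maybe_report(const struct input_report *now)
{
    int changed = now->buttons != last_sent.buttons ||
                  now->touch_x != last_sent.touch_x ||
                  now->touch_y != last_sent.touch_y;
    for (int i = 0; i < 4 && !changed; i++)
        changed = abs(now->nub[i] - last_sent.nub[i]) > nub_threshold;
    if (changed) {
        last_sent = *now;
        send_report(&last_sent);
    }
}
```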


The alternative of using the DSP for this has the problem of needing a decent preemptive OS on the DSP, because you don't want to waste the whole thing by only being able to use a small percentage of its capacity on analog sampling and filtering (this is assuming the DSP has full access to the same external peripherals; I haven't checked). But on this DSP architecture real-time interrupts are a bad idea, because interruptible code takes a performance hit.


The biggest problem with Pandora's design is using TWO microcontrollers that run immutable software blobs with questionable auto-calibration. But using a separate microcontroller isn't necessarily a bad idea.
 
I could have changed the code to have just one microcontroller drive both nubs, but the problem is board routing. That middle area is where the internal layers go nuts, and there are plenty of power and signal traces on the outside layers to further complicate things. On top of that is the possibility of signal noise getting injected on the long analog traces. It was something that would have needed to be tested on a much different board revision, which was expensive and time-consuming. It was easier for ED to just keep going with how things were designed years ago.


The source code for the nubs is not exactly a binary blob. It is an object file that I build into my code: I wrap the interrupts, timers, and hardware abstraction around this object file and build it into a project. That gives me some flexibility, but no control over the meat of the nub operations. It's the same with iCP.
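
In other words, the integration presumably looks something like this: the vendor object exposes a few entry points, and the surrounding firmware supplies the hardware glue. The function names here are invented just to show the shape of the wrapping; the real symbols are whatever the nub company's object file exports:

```c
#include <stdint.h>

/* Entry points resolved from the vendor's object file at link time.
   These names are hypothetical. */
extern void nub_init(void);
extern void nub_process_sample(uint16_t x_raw, uint16_t y_raw);
extern void nub_get_position(int8_t *x, int8_t *y);

/* Hardware abstraction supplied by the wrapper firmware. */
extern uint16_t adc_read(int channel);

/* Periodic timer interrupt: feed raw samples into the blob's
   processing; the "meat" of the algorithm stays opaque. */
void TIMER0_IRQHandler(void)
{
    nub_process_sample(adc_read(0), adc_read(1));
}
```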
 
Of course, the nice thing about doing high sampling rates and custom digital filters is that you can work around higher-frequency noise picked up from stuff like this. Not that I blame you for going with two controllers in this situation; it's not like you could have modified one ATmega yourself without work from the nub manufacturer, and at some point trying to do your own solution would have been a lot costlier.


Does it make any sense for a Pandora-like console to have the keyboard, nubs, and other I/O on a separate board, with lower-power components on the top/bottom between the two boards? Of course this would raise overall board area a lot, and probably costs, manufacturing, and testing as well; I'm not sure how much lowering the complexity of the individual boards would counteract that, or whether the increased modularity would reduce typical repair costs. I do wonder if the added board space would help facilitate DDR3L instead of LPDDR2, if such an option isn't too hard on power consumption. Naturally this would also add thickness to the unit, maybe offset by a thinner (longer?) battery.

Hm, okay. Partially a binary blob, then :D It's all kind of moot on Pandora when end users can't change it to begin with. This is not the case on iCP though, right?


One of these days I should reverse-engineer that object code ;p (I'd have to bone up on AVR ASM, I guess)
 