Please do not use the ARM Cortex-A15 for the Pandora 2!


That's the minimum runtime with a 10.1" FullHD LCD at full brightness AND wireless active. You are aware that this screen probably needs A LOT more power than the Tegra itself, don't you?
 
At max load I don't believe the Tegra 4 will use less power than the display; I believe that of the ~9W power draw, a good 3-5W belongs to the SoC...

That's 9 hours with normal usage (probably Wi-Fi switched on as well), so far from the 2 hours you mention...
 
The 2h34m runtime is at max load for that device.

You need to compare the SoCs, not the full devices.
That is true.

Maybe the Tegra 4i, with its quad A9s, would be a better fit for long battery life with a ~15Wh battery. Then again, I did read somewhere that NVIDIA wants a minimum order in the six digits...
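
As a rough sanity check on those numbers (a back-of-envelope sketch only; the ~9W draw and the 2h34m runtime come from the posts above, and everything else is derived from them, not from a spec sheet):

```python
# Back-of-envelope battery math using only figures quoted in this thread;
# none of these are official specs.

draw_max_load_w = 9.0             # ~9W total device draw, as estimated above
runtime_max_load_h = 2 + 34 / 60  # the 2h34m max-load runtime

# Usable battery capacity those two figures imply:
implied_capacity_wh = draw_max_load_w * runtime_max_load_h
print(f"implied battery: ~{implied_capacity_wh:.0f} Wh")  # ~23 Wh

# Max-load runtime a ~15Wh pack (mentioned above) would give at the same draw:
print(f"15 Wh at {draw_max_load_w:.0f} W: ~{15 / draw_max_load_w:.1f} h")  # ~1.7 h
```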
 

A GL benchmark doesn't have to be CPU-intensive.

Hardware accelerated video playback definitely isn't.

x86_64 is laughably bad at scaling down to the idle power levels of ARM; over time your problem is that idle drain, not what you do in excess of it.
A) The CPU's and GPU's power budgets aren't independent on a governed SoC that integrates both, like the Tegra 4 in the Shield almost assuredly is. You're making a distinction without a difference.

B) That was implied, yes. You'll note I was using it as a reference for the likely power consumption of the screen.

C) Your statement doesn't make any sense. So: clarification, citations, and qualifiers, please?
 
That's the minimum runtime with a 10.1" FullHD LCD at full brightness AND wireless active. You are aware that this screen probably needs A LOT more power than the Tegra itself, don't you?
At max load I don't believe the Tegra 4 will use less power than the display; I believe that of the ~9W power draw, a good 3-5W belongs to the SoC...
I wouldn't be sure of that.

Backlighting of an LCD needs quite a lot of power, especially under full brightness.

The larger the display is, the more power it needs.

The LCD of the Pandora at max brightness uses 831mW in total, 681mW of which goes to the backlight LEDs.

The LCD is 4,3". The 10" display is more than twice as big, so you can assume it's at least 1,5W just for the display.

But 3-5W for a quad-core A15 SoC under full load (CPU and GPU) is pretty good, I'd say.
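
A minimal sketch of that estimate, using the 681mW backlight figure and the two diagonals from this thread; the post above implicitly scales by the diagonal ratio, while scaling by panel area (which a reply below argues for) gives a much higher number:

```python
# Scale the Pandora's 681 mW backlight figure (from the post above) to a
# bigger panel. Two candidate models; which one is right is debated below.

backlight_w = 0.681
d_pandora, d_tablet = 4.3, 10.1  # screen diagonals in inches

ratio = d_tablet / d_pandora           # ~2.35x
linear_w = backlight_w * ratio         # ~1.6 W ("twice as big" -> 1.5W+)
area_w = backlight_w * ratio ** 2      # ~3.8 W if power tracks lit area

print(f"linear: ~{linear_w:.1f} W, area: ~{area_w:.1f} W")
```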
 
Just make sure to evaluate whatever SoC you choose carefully. Personally, I don't care so much about peak performance as about battery longevity.
 
I think it's a good idea for the P2 to have a provision for a second battery add-on, if users choose to, similar to the battery add-ons for iPhones, which double battery life...
 
I think it's a good idea for the P2 to have a provision for a second battery add-on, if users choose to, similar to the battery add-ons for iPhones, which double battery life...
Or... design the batteries to be hot-swappable (+ a small built-in battery that holds charge for a couple of minutes?)... that way you can buy as many as you need for your purposes.
 
Just shut down your Pandora for 30 seconds and put in another battery... good heavens!

I am glad that we can do that.

BTW: Can you do that with the GCW0 or Shield? I haven't got these.

Sadly, it's not possible with the 3DS, and no one complains there.
 
OK, I hadn't noticed that, since I am only on these boards and do not use the 3D function at all.

Being able to swap batteries is huge anyway, and was natural back in Game Boy times.
 
Or... design the batteries to be hot-swappable (+ a small built-in battery that holds charge for a couple of minutes?)... that way you can buy as many as you need for your purposes.
Hmmm...

On my laptop I can just freeze to disk, swap the battery, and restore from the frozen state.

Can't see why the Pandora couldn't do this with some changes to boot.

For example, freeze to SD card.

Not like it's terribly bothersome to just reboot instead... just a thought...
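
A minimal sketch of what that could look like (assumptions: a Linux kernel built with hibernation support, a swap partition on the SD card, and a resume= kernel parameter pointing at it; the /dev/mmcblk0p2 name is hypothetical, and this is not a tested Pandora recipe):

```python
# Hypothetical freeze-to-SD flow for a battery swap (run as root).
# Assumes CONFIG_HIBERNATION and that the kernel was booted with e.g.
# resume=/dev/mmcblk0p2 so it knows where to restore the image from.
import subprocess

SWAP_DEV = "/dev/mmcblk0p2"  # hypothetical swap partition on the SD card

subprocess.run(["swapon", SWAP_DEV], check=True)  # ensure swap is active

# Writing "disk" to /sys/power/state triggers suspend-to-disk; execution
# continues from here after the device is powered back on and resumes.
with open("/sys/power/state", "w") as f:
    f.write("disk")

print("Resumed - battery swapped while the system state sat on the SD card.")
```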
 
The LCD of the Pandora at max brightness uses 831mW in total, 681mW of which goes to the backlight LEDs.


The LCD is 4,3". The 10" display is more than twice as big, so you can assume it's at least 1,5W just for the display.


But 3-5W for a quad-core A15 SoC under full load (CPU and GPU) is pretty good, I'd say.
The area of a 10" screen is more than four times that of a 4.3" screen, since its twice as big in each dimension.

That puts the power use in the realm of 3W
 
The area of a 10" screen is more than four times that of a 4.3" screen, since its twice as big in each dimension.

That puts the power use in the realm of 3W
So surface area is 5.4x higher, but also resolution is 6x higher, so the pixel density is higher (which blocks more light per area). Average brightness is 328 cd/m^2, vs 280 cd/m^2 for the Pandora. The Pandora's panel is LTPS TFT; this tablet uses IPS, and apparently IPS demands higher power consumption (http://wiki.mobileread.com/wiki/LCD).

I looked at LCD panel power consumption specs a few years ago. When trying to find better response times than the Pandora's screen, I noticed that power consumption shoots up a lot. The Pandora's LCD has really low power consumption compared to most others I've seen. This display could be consuming over 4W at max brightness.
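
A quick sketch of those ratios, assuming 800×480 at 4.3" for the Pandora and 1920×1200 at 10.1" for the tablet (the resolutions are my assumptions, though they match the 6x figure; exact active-area specs would shift the area ratio slightly, landing near the 5.4x quoted above):

```python
import math

def panel(diag_in, w_px, h_px):
    """Return (area in square inches, pixel count) from diagonal + resolution."""
    aspect = w_px / h_px
    h_in = diag_in / math.sqrt(aspect ** 2 + 1)
    return aspect * h_in * h_in, w_px * h_px

area_p, px_p = panel(4.3, 800, 480)     # Pandora (assumed resolution)
area_t, px_t = panel(10.1, 1920, 1200)  # tablet (assumed resolution)

print(f"area ratio:    {area_t / area_p:.1f}x")                    # ~5.6x
print(f"pixel ratio:   {px_t / px_p:.1f}x")                        # 6.0x
print(f"density ratio: {(px_t / area_t) / (px_p / area_p):.2f}x")  # ~1.07x
```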

Their test is kind of meaningless though, except to try to get a max power consumption level... who knows how much the SoC is being throttled while running all four CPU cores with a demanding load plus running the GPU, which is what the stability test does. I doubt it ran everything at maximum clock speeds throughout that 2.5-hour duration.
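
One way to check for that kind of throttling on a Linux/Android device would be the kernel's cpufreq statistics (a sketch; it assumes CONFIG_CPU_FREQ_STAT is enabled and the path below exists, which varies by device):

```python
# Report how much time CPU0 spent at each frequency since boot.
# time_in_state lists "freq_khz ticks" pairs (ticks are usually 10ms).

STATS = "/sys/devices/system/cpu/cpu0/cpufreq/stats/time_in_state"

with open(STATS) as f:
    times = {int(freq): int(ticks) for freq, ticks in (ln.split() for ln in f)}

total = sum(times.values())
for freq_khz, ticks in sorted(times.items(), reverse=True):
    print(f"{freq_khz / 1000:6.0f} MHz: {100 * ticks / total:5.1f}%")

# If the top frequency's share is small, the SoC spent the run throttled.
print(f"time at max clock: {100 * times[max(times)] / total:.1f}%")
```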
 
There is always a price: CCFL backlighting uses power, but looks nice.

Not something you want on anything portable though ;)

A more complete LED matrix or OLED would be nice.

Faster response times normally mean the panel is just driven harder: you increase the voltage, and the rise time looks small.

What actually happens is it gets there quickly, but overshoots the mark. In turn you get inverse ghosting, often seen as a pinkish trail behind motion. And the panel doesn't last that long. Hitting the sweet spot on characteristics means it will cost; that's just how it is.
 
That's because of CCFL backlighting: uses power, but looks nice.

Not something you want on anything portable. A more complete matrix of backlighting LEDs uses more power too, as opposed to just the edge lighting.
Does anyone have any idea what you're responding to? I highly doubt the SlateBook x2 isn't using LED backlighting.
 
So after my initial comment about using a big.LITTLE SoC, I went looking into it a bit deeper. I have been looking at a MediaTek quad-core A7 (I forget the actual chipset number), but it's a 28nm process and it runs at 1GHz on all 4 cores. The benchmarks are actually quite impressive considering the low clock. Some benchmarks that were shown were even able to edge out a Galaxy S3.

While it may not be competitive with the Shield (that battle might not be worth fighting), it would have very adequate power and insane battery life (I didn't see battery life specs). The S3 handles most emulators extremely fluidly, and even DraStic under the weight of Android. It might be worth considering.

They are even offering the A7 in an octa-core variety later this year with either a Mali-400 or PowerVR 544 @ 300MHz.
 
So after my initial comment about using a big.LITTLE SoC, I went looking into it a bit deeper. I have been looking at a MediaTek quad-core A7 (I forget the actual chipset number), but it's a 28nm process and it runs at 1GHz on all 4 cores. The benchmarks are actually quite impressive considering the low clock. Some benchmarks that were shown were even able to edge out a Galaxy S3.

While it may not be competitive with the Shield (that battle might not be worth fighting), it would have very adequate power and insane battery life (I didn't see battery life specs). The S3 handles most emulators extremely fluidly, and even DraStic under the weight of Android. It might be worth considering.

They are even offering the A7 in an octa-core variety later this year with either a Mali-400 or PowerVR 544 @ 300MHz.
The Shield is absolutely a battle worth fighting. I didn't pick one up for the very reason that I would rather have an equally or more powerful P2. I like what the Shield is going for, but it would be even better if it were open source.

(What would be even better, IMO, is x86, so it could run SteamOS, which would be baller.)
 
... That doesn't seem like an option to me.

I was hoping the P2 would have at least a 2GHz processor, even if only dual-core.

And the Mali-400 isn't that great. It's rather dated as a GPU already.

My Note 8.0 is a 1.6GHz quad with a Mali-400... and I've already butted up against the limit of its graphical prowess.

I find the Note 8.0 even more portable than my Pandora, and *if* I finally get my ICP2... I wouldn't see any point in getting a P2 based on the SoC you mention.
 