5" 1080p screens becoming a reality just in time for P2?


Good article. Exactly what I thought. Depending on how much more CPU power and battery power it needs, it's most probably wasted.


Well, let's see what the market can offer for what price and with what kind of touchscreen for which resolution.
 
Ghosting vs. Resolution...


I think I'd lean more towards a faster screen than a densely pixelicious screen.


Especially if we had a mini HDMI out port or something similar, so I could use a massive external display when I need to see the potentially tiny text.


The nice thing about making the internal display a native 720p or 1080p, though, is that if we did have some form of HDMI out, it would hopefully be less work for developers: they could code to the native internal resolution and output the same thing to the external display at the same resolution.
 
Good article. Exactly what I thought. Depending on how much more CPU power and battery power it needs, it's most probably wasted.
Well, that's one single article. Since it's rather short-ish (and non-scientific), it would seem rather weird if it was THE FINAL ANSWER™.


For instance, read Why Retina Isn’t Enough, then re-read the previous article.
 
Obviously every extra pixel adds more detail, and if you want to be able to use magnifying glasses and still not see pixels (like the guy in that "Why Retina Isn't Enough" article), then yes, something insane like 950 ppi could be needed. I'm supposing that we are going to have a higher ppi than the Pandora 1 anyway. There are other things besides resolution that determine the quality of a screen: things like refresh rate, color depth, glare, contrast, etc. are probably already more of a limiting factor than resolution.


Any increase in resolution inevitably leads to more power consumption if all other things remain constant. Some other properties of screens are not directly proportional to power use; see e.g. Pixel Qi screens: because they are reflective, no backlight is needed, so you get lower power consumption while sunlight visibility improves. Others are proportional to power use, like refresh rate.


A higher refresh rate is always better, but somehow people aren't arguing for 90Hz or 120Hz screens for the Pandora 2: there seems to be a consensus that 60Hz is enough (if that refresh rate can be achieved without too much ghosting). Same with color depth: 16 bit per channel (48 bit color) is better than 8 bit per channel, but somehow there's a consensus that 24 bit color is enough.


Or with sound: we could use 352.8 kHz at 64 bit per sample instead of 44.1 kHz at 16 bit per sample, or 22.2 surround (24 speakers) instead of just stereo, and yes, it would be better, but it would use more resources and nobody would hear the difference when playing lossily compressed CD-quality music through headphones (which is probably the most common use case).
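Just to put the "uses more resources" bit in numbers, here's a trivial back-of-the-envelope comparison of the raw PCM data rates (purely illustrative arithmetic on my part, not anything from the article):

```c
/* Raw PCM data rates: CD-quality stereo vs. the extreme case above.
 * Purely illustrative arithmetic with assumed formats. */
#include <stdio.h>

int main(void) {
    double cd      = 44100.0  * 16.0 * 2.0;   /* 44.1 kHz, 16 bit, stereo        */
    double extreme = 352800.0 * 64.0 * 24.0;  /* 352.8 kHz, 64 bit, 22.2 (24 ch) */

    printf("CD-quality stereo        : %.2f Mbit/s\n", cd / 1e6);       /* ~1.41  */
    printf("352.8 kHz / 64 bit / 22.2: %.1f Mbit/s\n", extreme / 1e6);  /* ~541.9 */
    printf("ratio                    : %.0fx\n", extreme / cd);         /* 384x   */
    return 0;
}
```

Roughly 384 times the data for a difference nobody would hear through headphones - the same kind of trade-off applies to pixels nobody can see.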


So I don't really get why people keep insisting on ever higher ppi screens. Maybe it's because for things like color depth and sound quality the limits of human perception were reached earlier, so we have had time to get used to the idea that it's fine to stop increasing those numbers.
 
That article fails on a few points.

For most people, though, it won't matter. Photos are inherently fuzzy, so it won’t matter whether they’re viewed on a 1920×1080 or 1280×720 smartphone display; you’ll still see their imperfections. "Even the tiniest image detail in a photograph is always spread over more than one pixel," Dr. Soneira explained in a follow-up e-mail. "The image detail is never perfectly aligned with the pixel structure of the display."
IF the image you're looking at was poorly taken at the same or lower resolution than the screen, then he is correct. If, on the other hand, you happen to have a real camera with real optics and a real sensor capturing at 10+MP, you have to zoom in several times before his comment becomes relevant.

Where a 1080p smartphone display could really make an impact is with computer-generated content—that is, the user interface, buttons, and text.
That is true - however, the second third of the same paragraph is false and betrays a lack of understanding of how the image, eye, and brain work together.

"Only computer-generated images make full use of the pixel resolution of the display," says Dr. Soneira. "For graphics and text, maybe you want that kind of sharpness."

Then he makes this contrary statement:

"The image detail is never perfectly aligned with the pixel structure of the display."
This is a 100% great reason to go with a HIGHER resolution display - not a lower one. The more pixels you have to display with, the more faithful you can be to the source image data, up until the zoom level passes the resolution of the image. If you're using a high-quality image source zoomed out, those extra pixels will help keep straight lines appearing straight.


Have you ever seen vertical lines or stripes in a movie or TV show look like they're changing colors and/or pulsating as they get panned over the screen? This is a combined effect of ghosting (pixel fall off) and lack of display resolution.


That article was clearly written with an objective in the author's mind. Either their 'expert' was taken out of context repeatedly, or he didn't really know his stuff. I'm more suspicious of the author's bias than the expert's.
 
[...] if you want to be able to use magnifying glasses and still not see pixels (like the guy in that "Why Retina Isn't Enough" article), then yes, something insane like 950 ppi could be needed.
Obviously, it is just yet another article, so it can not give THE FINAL ANSWER™ either. Testing actual sample screens is the only thing that can.


That being said, the relevant thing about that article, when discussing the one before it, is that it points out that '20/20' (the 'standard' vision) may be worse than the actual vision of people below the age of ~60 or so. (As I'm sure you understand, this would affect calculations regarding screen readability, pixel usefulness, and so on.)
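For what it's worth, the arithmetic behind those readability calculations is simple enough to sketch. The numbers below (1 arcminute for '20/20' acuity, a 12-inch viewing distance) are just my assumptions, not figures from either article:

```c
/* Rough acuity arithmetic: the ppi at which individual pixels stop being
 * resolvable, for an assumed resolvable angle and viewing distance. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double PI      = 3.14159265358979323846;
    const double arcmin  = 1.0;   /* '20/20' vision resolves roughly 1 arcminute */
    const double dist_in = 12.0;  /* assumed handheld viewing distance, inches   */

    /* Smallest resolvable feature at that distance, in inches. */
    double feature = dist_in * tan(arcmin * PI / (180.0 * 60.0));
    printf("~%.0f ppi needed at %.0f inches for 1 arcmin acuity\n",
           1.0 / feature, dist_in);   /* ~286 ppi */

    /* Sharper-than-20/20 eyes scale the requirement proportionally:
     * 0.5 arcmin would already need ~570 ppi at the same distance. */
    return 0;
}
```

So whether "retina is enough" really hinges on the acuity and viewing distance you plug in, which is exactly why younger-than-average eyes push the numbers up.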


(On a sidenote: While all the other things that you wrote could be improved are important too, screen manufacturers currently are in a fight for high resolutions on small screens. This makes it seem likely that increased resolution will be more easily available and have fewer drawbacks than, say, a higher refresh rate, or a pocketable 22.2 sound system.)
 
^ yep, it's all pissing in the wind IMHO until someone actually reviews a 1080p screen and compares it to an IPS screen.


They also need to do a battery life test.


I noticed LG launched a 1080p model in Asia recently; unfortunately it has a small battery, but at least we should have some idea when they review it.
 
Have you ever seen vertical lines or stripes in a movie or TV show look like they're changing colors and/or pulsating as they get panned over the screen? This is a combined effect of ghosting (pixel fall off) and lack of display resolution.

Ummm... nope.


The reason for that (it's called cross-color) is that when color TV was invented, it had to be compatible with B/W TV.


B/W TV used a signal with a bandwidth of 5 MHz. The faster the contrast changes, the higher the frequency; the narrower the stripes, the higher their frequency.


Composite video is a B/W signal with a modulated color signal added on top. That combined signal was still compatible with old B/W TV sets.


The color signal was inserted at 4.43 MHz for PAL TVs and 3.58 MHz for NTSC TVs.


The cameras used back then didn't have the resolution to reach these high frequencies, so the picture content did not interfere with the colors.


Cameras got better and later easily reached higher frequencies.


Picture content that hit the exact modulation frequency (usually striped shirts or similar) triggered some weird colors, as the TV interpreted it as belonging to the color signal.


That's where you got those weird pulsating colors from in striped areas.


Using a signal where color and luminance are not carried together on one line (component, S-Video, RGB, etc.) does not have that effect.


It only occurs with composite TV signals, for the above-mentioned reason.
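To make it concrete, here is a toy numerical sketch (my own simplification with made-up sample rates, not a real PAL/NTSC decoder): a pure black-and-white stripe pattern whose frequency happens to land on the color subcarrier comes out of a naive chroma demodulator as a non-zero "color" value.

```c
/* Toy cross-color demo: luminance stripes at the subcarrier frequency leak
 * into the demodulated color channel of a naive composite decoder. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double PI   = 3.14159265358979323846;
    const double f_sc = 4.43e6;   /* PAL color subcarrier, Hz */
    const double f_s  = 40.0e6;   /* assumed sample rate, Hz  */
    const int    n    = 40000;    /* 1 ms worth of samples    */

    double chroma = 0.0;
    for (int i = 0; i < n; i++) {
        double t = i / f_s;
        /* Black/white stripes whose frequency equals the subcarrier: */
        double luma = (sin(2.0 * PI * f_sc * t) > 0.0) ? 1.0 : 0.0;
        /* Naive decoder: multiply by the subcarrier and average (low-pass). */
        chroma += luma * sin(2.0 * PI * f_sc * t);
    }
    /* A completely colorless input yields a clearly non-zero "color" value. */
    printf("false color amplitude: %.3f\n", chroma / n);   /* ~0.32 */
    return 0;
}
```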
 
Let's be honest, everyone would be OK with a 720p screen, no need for 1080p at extra cost.


You're probably aware of that, but please don't plan to make another 500€ Pandora. That is simply too much; the reasonable price of the Pandora (back in 2008) for the hardware was one of its main attractions - don't throw that away.
 
go for a cheaper screen if there's a significant price difference.


the current screen is just too good.


None of the other LCDs in my house (11 of them, not counting the DS/GameBoys) rival the OpenPandora's.


I love the OP's LCD, it's viewable at any angle, it's just amazing.


but I could easily live with something more average.


1080p is too many pixels, it would consume a lot of memory bandwidth just to feed the framebuffer if the SoC doesn't have a separate framebuffer memory.


and it takes a huge amount of GPU time to draw at this resolution.
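To put rough numbers on that (scan-out bandwidth only, ignoring the GPU's own reads and writes, purely as an illustration):

```c
/* Back-of-the-envelope memory bandwidth just to scan out the framebuffer,
 * assuming 32-bit pixels at 60 Hz. Illustrative numbers only. */
#include <stdio.h>

int main(void) {
    const double bpp = 4.0;    /* bytes per pixel (e.g. XRGB8888) */
    const double hz  = 60.0;   /* refresh rate                    */

    printf("800x480   : ~%.0f MB/s\n", 800.0  * 480.0  * bpp * hz / 1e6);  /* ~92  */
    printf("1280x720  : ~%.0f MB/s\n", 1280.0 * 720.0  * bpp * hz / 1e6);  /* ~221 */
    printf("1920x1080 : ~%.0f MB/s\n", 1920.0 * 1080.0 * bpp * hz / 1e6);  /* ~498 */
    return 0;
}
```

Whether ~500 MB/s of scan-out is a problem obviously depends on the total memory bandwidth of whatever SoC gets picked, which we don't know yet.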
 
^ In fact, there were fears at the time that the screen would be too hi-res for the hardware to handle. I guess in a way those "fears" turned out to be right.
 
1080p is too many pixels, it would consume a lot of memory bandwidth just to feed the framebuffer if the SoC doesn't have a separate framebuffer memory. and it takes a huge amount of GPU time to draw at this resolution.

There is no doubt that future chips will handle 1080p


Having said that, they will probably perform better when driving a smaller screen
 
There is no doubt that future chips will handle 1080p


Having said that, they will probably perform better when driving a smaller screen

The Pandora's OMAP SoC has a hardware graphics scaler. Future SoCs will I suppose also have a hardware scaler.


It can scale things up or down to fit the screen. So, these chips won't "perform better" when driving a low-res screen. You can choose to use an 800x480 mode with the 1080p screen, and scale it up with anti-aliasing or without, at no extra cost. Many of our emulators are using Notaz SDL to do this now.
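For anyone curious what that looks like in code, here's a minimal sketch using plain SDL2's logical-size scaling - an assumption on my part, not the actual Notaz SDL interface, which may work differently. The app draws in 800x480 coordinates and the output gets scaled to whatever the panel's native mode is:

```c
/* Minimal SDL2 sketch: draw in 800x480, let the backend scale the output
 * to the panel's native resolution. Not the actual Notaz SDL API. */
#include <SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    /* "linear" = filtered (smooth) scaling, "nearest" = unfiltered. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");

    /* Fullscreen window at the panel's native mode (could be 1080p). */
    SDL_Window *win = SDL_CreateWindow("scaler demo",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        0, 0, SDL_WINDOW_FULLSCREEN_DESKTOP);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    /* All drawing below uses 800x480 coordinates; SDL scales it up. */
    SDL_RenderSetLogicalSize(ren, 800, 480);

    SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
    SDL_RenderClear(ren);
    SDL_SetRenderDrawColor(ren, 255, 128, 0, 255);
    SDL_Rect box = { 300, 180, 200, 120 };   /* in 800x480 space */
    SDL_RenderFillRect(ren, &box);
    SDL_RenderPresent(ren);

    SDL_Delay(2000);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

Whether the scaling actually runs on the GPU or on a dedicated scaler in the display pipeline depends on the driver, but either way the application only ever touches 800x480 worth of pixels.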


The hardware scaler might use a tiny amount of power, but I don't think it will be significant compared to the CPU usage, GPU usage or actual LCD usage.


Or, you can use a 1080p mode, which may perform slower and use more power. I hope the desktop resolution will be adjustable.


So, having a 1080p screen should not necessarily degrade performance or battery life (unless the screen itself needs more power).
 
It can scale things up or down to fit the screen. So, these chips won't "perform better" when driving a low-res screen. You can choose to use an 800x480 mode with the 1080p screen, and scale it up with anti-aliasing or without, at no extra cost.

Why doesn't rendering fewer pixels result in a higher framerate?


I imagine that 1080p rendering is harder work than 480p rendering


You could render at 480p and scale to 1080p, but then you're wasting the high-res screen, and scaling tends to make things fuzzy
 
Why doesn't rendering fewer pixels result in a higher framerate?

It would.

I imagine that 1080p rendering is harder work than 480p rendering

Yes.

You could render at 480p and scale to 1080p, but then you're wasting the high-res screen, and scaling tends to make things fuzzy

Nothing is going to look fuzzy at 1080p. Also, you could choose to render at 540p, which is exactly half of 1080p.


My point is, we don't lose anything by using 1080p.


If performance is an issue for some app, then we can run it at a lower resolution and scale up.


With a 480p screen, we always have the lower resolution and no choice about it. With 1080p, you can choose high-res with lower performance, or low-res with higher performance. There is not really a downside in terms of performance.
 