Hacker News
Nexus One display and subpixel pattern (javia.org)
115 points by jonknee on Jan 20, 2010 | hide | past | favorite | 27 comments


I consumed this article with much enthusiasm. It was filled with information and still very easy to read.

My immediate thought, though: I wonder how much battery life you would gain by making the device monochrome (black/green) when you only intend to use it to, say, make a call, or perhaps read something.

Maybe a mode like this already exists in Android? I don't have a Nexus One, so... :) Heck, if it doesn't exist, then I guess this might be an app for the Android app store. Send me a cookie if you write one!


If (and this is totally hypothetical, I really don't know) the pixels respond non-linearly to power input, it might actually use less power to turn on all the subpixels to a low brightness than to make one subpixel really bright. It might take more power to go from 50% to 100% than from 0% to 50%.
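A toy sketch of that hypothesis in Python. The exponent `alpha` here is purely hypothetical, not a measured OLED characteristic: if per-subpixel power grows super-linearly with luminance, producing the same total luminance across three dim subpixels costs less than one bright one.

```python
# Hypothetical model: per-subpixel power grows as luminance ** alpha.
# alpha > 1 means going from 50% to 100% costs more than 0% to 50%.
def power(luminance, alpha=1.5):
    """Hypothetical per-subpixel power, in arbitrary units."""
    return luminance ** alpha

target = 1.0  # total luminance we want from the pixel

one_bright = power(target)            # one subpixel at 100%
three_dim = 3 * power(target / 3)     # three subpixels at ~33% each

print(f"one bright subpixel: {one_bright:.3f}")
print(f"three dim subpixels: {three_dim:.3f}")
# With alpha > 1 the split drive uses less power;
# with alpha = 1 (linear response) the two are identical.
```

With `alpha = 1.5` the split drive comes out at about 0.58 units versus 1.0, so under this (entirely assumed) model the commenter's intuition holds.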


I still want to see some real numbers on energy consumption between a 100% white and 100% black (but still powered) AMOLED. As the first two letters stand for active matrix, there's still a control voltage going through each pixel when it's at RGB000, so the difference may not be as great as it's often made out to be.


I put an ammeter on my Nexus One and measured 160 mA with a black screen, and about 450 mA with a white screen (using a "flashlight" application).

When on but with the screen powered down (normal "waiting for a call" mode) I measured 2 mA on average, with occasional blips much higher (up to 400 mA, measured with a Fluke 87 in MIN/MAX mode).

When powered "off" the phone draws 60 µA - I attribute this to keeping an internal RTC active, and possibly some sort of pseudo-static RAM - but those are just guesses.

The battery voltage measured 4.1 V.

I did these current measurements by wedging in two thin pieces of copper tape with an insulator in between, placed between the + battery terminal and the phone's + input.

The phone indicated "4 bars" and Edge service (not 3G) when I made these measurements.
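For rough scale, the currents measured above translate into runtimes like this. The 1400 mAh figure is an assumption (the stock Nexus One battery rating), and this ignores self-discharge and varying radio load:

```python
# Rough runtime estimates from the measured currents above,
# assuming a 1400 mAh battery (the stock Nexus One cell).
CAPACITY_MAH = 1400

measurements_ma = {
    "black screen": 160,
    "white screen": 450,
    "standby (screen off)": 2,
    "powered off (RTC only)": 0.060,  # 60 uA
}

for mode, current_ma in measurements_ma.items():
    hours = CAPACITY_MAH / current_ma  # idealized: capacity / draw
    print(f"{mode:24s} ~{hours:8.1f} h")
```

So a white screen drains the battery nearly three times faster than a black one, which is the practical upshot of the measurements.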


Kudos for actually making the effort to take the measurements. Thank you.


Active matrix just means every pixel has its own switching transistor, as opposed to just every row and column. There may be drive voltage at every pixel, but not current.


Here's a good presentation from mid-2008 by 4D Systems, a company that makes integrated display modules http://data.4dsystems.com.au/downloads/micro-OLED/Docs/4D_AM... . I don't think they manufacture the panels themselves, but they still probably have a good idea what they're talking about. Slides 15-18 are most pertinent.

Basically, it looks like AMOLED uses 3x the power of an LCD for an all-white screen, and about 1/10 the power of an LCD for an almost all-black screen. The LCD's power stays roughly constant, while the AMOLED's power varies by 20-30x.
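A minimal model of that comparison, using the commenter's 3x and 1/10x figures rather than anything taken directly from the slides:

```python
# Toy model: LCD backlight power is roughly constant regardless of
# content, while AMOLED power scales with how much of the screen is lit.
LCD_POWER = 1.0  # normalize LCD to 1 unit

def amoled_power(avg_picture_level):
    """avg_picture_level: 0.0 (all black) .. 1.0 (all white)."""
    floor = 0.1  # ~1/10 of LCD on a nearly black screen
    peak = 3.0   # ~3x LCD on an all-white screen
    return floor + (peak - floor) * avg_picture_level

for apl in (0.0, 0.3, 0.5, 1.0):
    print(f"APL {apl:.1f}: AMOLED draws {amoled_power(apl):.2f}x LCD")
```

Under this linear sketch the break-even point is around 30-35% average picture level, which is why dark UI themes pay off on AMOLED but do nothing on LCD.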


> It was filled with information and still very easy to read.

I completely agree--I knew nothing about how these types of displays work, and yet I found the article to be very understandable and enriching.


I was actually happy with the display in my 2nd generation iPod touch. Note the past tense. I shouldn't have clicked on this link...


That's actually pretty interesting. I wonder if it's hard to do a subpixel rendering system for fonts (like ClearType) on this layout. I don't have a Nexus One, so maybe the pixel density is high enough that they don't need to render fonts like that.


Actually, it does make it hard. To do subpixel smoothing correctly, you need to be able to keep the total amount of each color the same -- so if you want to shift part of a glyph right by one subpixel, you'd light up the red of the next pixel to the right and turn off the blue on the leftmost lit pixel of your line. This becomes both much harder and less useful when not every pixel has all three colors.

But you are right on the other point -- 800x480 on a 3.7 inch screen works out to about 250 ppi, so normal font rendering looks as smooth as subpixel rendering does on a computer screen. I, for one, would be hard pressed to distinguish subpixel rendering from normal rendering when looking at size-10 text on that screen -- just because I'd have such a hard time seeing the pixels at all.

I wish we could get proper OLED computer screens.
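The roughly 250 ppi figure from the comment above checks out from the diagonal pixel count over the diagonal size:

```python
import math

# Pixel density: diagonal pixel count divided by diagonal inches.
width_px, height_px = 800, 480
diagonal_in = 3.7  # Nexus One screen diagonal

diagonal_px = math.hypot(width_px, height_px)  # sqrt(800^2 + 480^2)
ppi = diagonal_px / diagonal_in
print(f"{ppi:.0f} ppi")
```

This comes out to about 252 ppi, consistent with the "250" in the comment.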


From what I understand, the iPhone doesn't use sub-pixel font rendering either, because of the high pixel density (which is not even as high as the Nexus One).

Since the display rotates, normal desktop sub-pixel rendering algorithms won't work anyway.


I have heard that OLED screens are extremely difficult to read in daylight/direct sunlight; can anyone confirm or deny this? I think this would be a problem on a phone in some places.


I have, to date, had zero trouble using my N1 anywhere. Two key points:

1) It's winter in New York City. Even outside there's not a lot of really bright sunlight.

2) The N1 has a feature (which I like) whereby it uses a light sensor to adjust the screen's brightness automatically to compensate for ambient light. In short: readable in daylight, not blinding at night.


Interestingly, many camera sensors also have a dithered sensor pattern: http://en.wikipedia.org/wiki/Bayer_filter - but a different arrangement of colors.


Yah. That's almost exactly the same pattern, just with the red and blue swapped.

This phone should be great for displaying raw images, since you can show the subpixels with a one to one relationship.
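Purely illustrative 2x2 tiles for the comparison being made here -- the exact orientation of a real Bayer mosaic or the panel's actual subpixel geometry varies, but the red/blue swap the parent describes is the point:

```python
# Two illustrative 2x2 color tiles: a Bayer sensor mosaic and the
# swap described in the comment. Green appears twice in both; only
# the positions of red and blue differ.
bayer = [["G", "R"],
         ["B", "G"]]

swapped = [["G", "B"],
           ["R", "G"]]  # red and blue exchanged relative to Bayer

for name, tile in (("Bayer", bayer), ("R/B swapped", swapped)):
    print(name)
    for row in tile:
        print(" ".join(row))
```

Both tiles devote half their sites to green, which ties into the luminance argument made further down the thread.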


Very interesting! This may explain the un-Googley-ness of the Android interface. I was racking my brain to figure out why the interface didn't have a blue, Gmail mobile/web-app look and feel.


It's an interesting connection to make, but I don't think OLED subpixel operating life estimates played a part in the UI design. Remember, wherever there's a white pixel, there's a blue subpixel going at full strength, and there's no shortage of white in Android.


It never has, even back when all the Android devices had LCD screens. I'd mostly chalk it up to not having the UI bits that normally were blue in (e.g.) Gmail, so the blue just didn't show up. Text is black on white, buttons are black on grey, and there's not a lot of space left.


Interestingly enough, the color scheme and theme for Android icons and widgets changed in 2.0 (for the Verizon Droid) from colorful blues to greys. I wonder if that change in color schemes was specifically intended for the Nexus One, or just a coincidence...


800 horizontal pixels, but half of them can contain no red and the other half can contain no blue. Someone should cue the lawyers from the dithered LCD lawsuits. They will have a field day with this.


That's not accurate. The human eye derives most of its sense of resolution (fine detail) from green; red and blue contribute mainly color information.

I wish I could find it, but there was an online example, where someone took a picture, divided it into the three channels, then pixelated (mosaic) one channel at a time and recombined it.

When he did green you saw it right away. With red it took some pixelation before you could see it. And with blue it took a LOT of pixelation before you noticed anything.
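One rough way to see why green degradation is the most visible: the standard ITU-R BT.601 luma weights, which quantify how much each channel contributes to perceived brightness.

```python
# BT.601 luma weights: Y = 0.299 R + 0.587 G + 0.114 B.
# Green carries the majority of perceived brightness, blue the least,
# matching the pixelation experiment described above.
LUMA_WEIGHTS = {"R": 0.299, "G": 0.587, "B": 0.114}

for channel, weight in LUMA_WEIGHTS.items():
    print(f"pixelating {channel} perturbs ~{weight:.0%} of perceived brightness")
```

Green alone accounts for nearly 60% of luma, which is why a screen layout that preserves green resolution can get away with less red and blue.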



Thank you! I've been looking for that page for so long.


Still, it's in the same category as the LCD panel dithering suit referenced. Both are measures that technically reduce fidelity but are effectively invisible to the naked eye.


All designs intended to be used by humans can profit greatly from knowledge of the abilities and constraints of human senses. Lossy audio compression works so well because of our intimate knowledge of how human hearing works.

The trick this screen uses isn't even exactly new. JPEG compression has done something very similar for ages (http://en.wikipedia.org/wiki/YCbCr and http://en.wikipedia.org/wiki/JPEG#JPEG_codec_example). Digital cameras also use that trick.
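The JPEG trick referenced above can be sketched briefly: convert RGB to a luma channel (Y) plus two chroma channels (Cb, Cr), then store the chroma channels at reduced resolution. The coefficients are the standard BT.601 ones used by JPEG; the tiny 2x2 "image" is just a placeholder.

```python
# RGB -> YCbCr (BT.601, as used by JPEG), then 4:2:0-style chroma
# subsampling: luma kept at full resolution, chroma at quarter the
# sample count.
def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def subsample_2x(channel):
    """Keep every other sample in each direction (2x2 -> 1x1)."""
    return [row[::2] for row in channel[::2]]

# A tiny 2x2 test image: red, green / blue, white.
pixels = [[(255, 0, 0), (0, 255, 0)],
          [(0, 0, 255), (255, 255, 255)]]
ycc = [[rgb_to_ycbcr(*p) for p in row] for row in pixels]
cb = [[px[1] for px in row] for row in ycc]
print(subsample_2x(cb))  # chroma retained at 1/4 the samples
```

The screen's layout and the Bayer mosaic exploit the same asymmetry: full-resolution brightness, reduced-resolution color.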


The OLPC uses a similar technique; apparently the effective resolution is not totally obvious: http://pixelqi.com/blog1/2008/05/27/higher-resolution-than-w...



