10W LCD screen part 2: The ideal display

By mux on Monday, 4 June 2012 19:38 - Comments (3)
Category: 10W LCD screen, Views: 7,718

Looking for the Dutch version?

In this blog series on display power consumption:
Part 1 (Anatomy) - Part 2 (You are here)


In the last installment of this blog series, we looked at the components inside a computer display. The purpose of dissecting the monitor was to find out where and how electrical energy is used. But instead of looking at existing displays, why not work from the ground up: what would happen if you were to design a computer display yourself? How do you design a display that uses as little power as possible?



This blog series accompanies the blog series about my newest computer build, an Ivy Bridge desktop quad-core all-in-one with IPS display consuming only 20W when idle. If you like this series or have found the information in my blogs useful, please consider donating to me (donation link at the bottom of each blog) to help make my newest build as awesome as possible.

We can use the knowledge from the previous blog post and some other electrical notions to calculate the extremes of display power consumption. This is true science: (1) knowing exactly what the limits of physical reality are and (2) knowing what is stopping us from getting there. We are going to construct an ideal monitor in this chapter - that is, one with the lowest possible power consumption. I will also explain how we can verify in practice how close we are to these limits. You will see that in the next blog post on this subject, I will indeed be using these methods.
0. Index for part 2
- Notion of zero-work elements
- The ideal white light source and LEDs
- The attenuating effect of polarizers
- The end result: the 1.2W 24" monitor
- Ideal wide viewing angle active pixel monitor
- Perfect narrow-angle display
- Is a 10W display possible in practice?
Notion of zero-work elements
There are two main branches of electronics: power electronics and what we colloquially call microelectronics. The first is concerned only with processing power itself: converting voltages and currents. The second is concerned only with the manipulation of information. More importantly, the first necessarily deals with power, energy and efficiency. The second does not. Information has no inherent energy content.

So by this logic, microelectronics should not need any power to work. However, information by itself is entropy, so the creation and destruction of information requires some minimal amount of energy. In displays, though, the information entropy is just shifted from the video input to the panel; there should not be much in the way of entropy generation. The bottom line, in case you couldn't follow, is: the microelectronics inside a computer monitor should not fundamentally need any energy. In other words, they are zero-work elements (or zero total energy elements). This is analogous to potential fields in physics.

Display receiver and controller
A modern LCD display controller (from a Samsung PX2370)

Even in practice, which is never ideal, the power consumption of these elements should be very, very low. We can verify this by directly measuring the power consumption of these chips. In practice, these chips consume (much) less than 1W.

This was actually a rather convoluted way to identify zero-work elements. Another way is to look at an element's initial and final states. The LCD panel, for instance, should be a zero-work element: it starts white and it ends white, only shifting between shades in the meantime. It is therefore sad to see that, in practice, these panels account for quite a big portion of LCD power consumption.

A part of displays that really is not zero-work is the backlight of LCDs or the active pixels in OLEDs. It should be obvious that the initial and final states are very different: at the end of operation, many photons have left this element of the display. It has definitely converted electrical energy into light and therefore needs energy (work).

Classical approach with backlight
LED backlight

The ideal white light source and LEDs
Alright, so we have identified that the only thing that should consume power in a monitor is the backlight. That is a start. But we also know that there are some obstacles in the way between the backlight and the end user. Let us start with the backlight itself.

Theoretically, an ideal source of white light puts out about 250 lm/W. What does that mean? It is the amount of visible light, in a unit called lumens (lm), that a light source produces per watt of electrical power. Some people say an ideal source puts out 300 lm/W. Or 200. Or something near that number. Actually, 'white light' is not always the same thing. You would think such a fundamental part of optical theory would be settled, but especially with the advent of LEDs this has become a point of vigorous contention.

http://www.spaceflight.esa.int/impress/text/education/Images/Glossary/GlossaryImage%20049.png
The radiation produced by a blackbody radiator at different temperatures

In the past, the ideal source of white light was considered a blackbody radiator at 5800K, truncated to 400-700nm. That means: the light that is generated by something that has a temperature of 5800 Kelvin and does not reabsorb the radiation it produces. Such a 'hot' object does not just generate visible light, it also produces infrared and ultraviolet, stuff we cannot see. Truncating it to 400-700nm means that we cut out all of that invisible light. That is how you get to 250 lm/W.

However, such a light source puts out a bit more green than it does red and blue. You could also for instance make a light source that produces some more blue and a bit less green. This would make something that also looks very white, but has a different luminous efficacy. What I am trying to say here is that there is no one set number that defines the 'ideal' white source. It is somewhere between 200 and 300 lm/W, and that is all I can tell you. There are even ways to get more than that, by tinkering with the light spectrum.

http://www.nanocotechnologies.com/Resources/Images/ac0c38f3-6c36-4d78-9e5d-4cf066b02cb2.jpg
Different LED spectra that all produce light we perceive as 'white'

Now here is the kicker: LEDs can actually get very close to that number, theoretically and even today under lab conditions! No other light source can do this: all other light sources have hard theoretical limits on their luminous efficacy, but LEDs do not. This makes them definitely the light source of the future. Let me reiterate this, because it is really important in understanding why LEDs are such important devices: there are no insurmountable physical problems preventing them from being an ideal light source! So let us pretend we live in the future and outfit our theoretical ideal LCD monitor with a backlight made of these ultra-good 250 lm/W LEDs.

The attenuating effect of polarizers
Polarization of light
This light 'wave' is 'vibrating', or polarized in the Y-direction

Now this light from the backlight passes through ideal polarizers. An ideal polarizer is an element that is perfectly transparent to light of a single polarization angle. How much of the light is left behind the polarizers if you shine that super-sexy LED backlight onto it? To calculate this, we decompose light at polarization angle θ into two perpendicular components, X and Y. Only one of the components, say Y, can pass through the polarizer; the other cannot. The transmitted intensity is therefore proportional to sin²(θ) (Malus's law). We also assume that all polarization angles occur equally in the light generated by the backlight, so the distribution of polarization angles is uniform:

Distribution of polarization angles

Now we can see what proportion of incident light is passed through the polarizer: it is the average of sin²(θ) over all polarization angles, which is exactly 50%.

Excellent. There is an ideal polarizer for you: it passes on exactly 50% of the incident light if that light is unpolarized. The other part of the LCD panel that stops light is the color filter. If three colors are used, and the spectra of these colors do not overlap while also not leaving holes in the spectrum, then irrespective of the relative size of the color filters it will pass on exactly 1/3rd of the light*. The rest of our ideal panel is, well, ideal. The liquid crystals do not stop any light, nor do the various material interfaces, and there is no diffusion. This leaves us with a grand total of 16.7% of the backlight that eventually passes through towards the user.
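For the skeptics, this 16.7% figure is easy to check numerically. A quick sketch in Python (my own sanity check, nothing more): average the Malus's-law transmission sin²(θ) over uniformly distributed polarization angles, then multiply by the 1/3 of an ideal color filter.

```python
import math

# Malus's law: an ideal polarizer passes a fraction sin^2(theta) of the
# intensity of light polarized at angle theta. For an unpolarized backlight,
# all angles are equally likely, so we average over the full circle.
N = 100_000
polarizer = sum(math.sin(2 * math.pi * i / N) ** 2 for i in range(N)) / N

color_filter = 1 / 3            # ideal non-overlapping RGB color filters
total = polarizer * color_filter

print(f"polarizer passes:    {polarizer:.3f}")   # 0.500
print(f"total transmittance: {total:.1%}")       # 16.7%
```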

This is actually not dramatically different from practical results. See for instance this topic here. LCD transmittance here is tested as somewhere in the neighborhood of 5-9%. Now, this may seem very different from my theoretical 16.7%, but as you may remember from the first chapter, the transistor and wiring (especially on these old panels) really block a lot of light. The rest of the difference can be explained by the LCD itself not being 100% transmissive. In practice, the best current LCDs have a total transmittance of somewhere in the neighborhood of 9%.

The end result: the 1.2W 24" monitor
Alright, now we arrive at the pièce de résistance of our calculations. How much light does a monitor emit and, calculating back, how much power should that use? To answer the first question we need to know the surface brightness that we want. For a typical office or home environment, about 100 cd/m2 is a good number. This number means that the brightness of the monitor, when displaying a completely white image, causes a light flux of 100 lumens per steradian per square meter. As a monitor is flat, it emits over a solid angle of pi steradians. How large is our ideal monitor? We will tabulate that later on. For now, we assume a ridiculously big 1 square meter monitor (because I'm metric - I agree that a square foot would be more realistic in this case). With this rationale, this monitor, at 100 cd/m2, emits 314 lumens of light out the front. This in turn is just 16.7% of the backlight flux. That means the backlight needs to emit about 1880 lumens. At a luminous efficacy of 250 lm/W, the backlight should consume no more than 7.5W.

Yes, you heard it here first. A 1-square meter monitor should theoretically use no more than 7.5W, assuming an ideal build-up and no power conversion losses. For smaller monitors, this should be their power consumption (assuming 16:9 aspect ratio):

Screen size | Area [m2] | Ideal power consumption @ 100 cd/m2 [W]
20-inch | 0.11 | 0.84
24-inch | 0.16 | 1.2
27-inch | 0.2 | 1.5
30-inch | 0.25 | 1.9
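The whole calculation fits in a few lines. A sketch in Python, under the assumptions above (Lambertian emission over π steradians, 1/6 panel transmittance, 250 lm/W backlight):

```python
import math

LM_PER_W = 250         # ideal white backlight efficacy [lm/W]
TRANSMITTANCE = 1 / 6  # ideal panel: 50% polarizer x 1/3 color filter
BRIGHTNESS = 100       # target surface brightness [cd/m2]

def ideal_lcd_power(area_m2):
    """Backlight power [W] needed by an ideal LCD of the given screen area."""
    flux_out = BRIGHTNESS * math.pi * area_m2   # lumens leaving the front
    return flux_out / TRANSMITTANCE / LM_PER_W  # backlight lumens -> watts

for name, area in [("1 m2", 1.0), ("20-inch", 0.11), ("24-inch", 0.16),
                   ("27-inch", 0.2), ("30-inch", 0.25)]:
    print(f"{name}: {ideal_lcd_power(area):.2f} W")
```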


Now as a small sidestep, why do actual monitors use so much more than this ideal value? Well, the losses multiply! First off, let us assume a panel with 10% actual transmittance - this multiplies the power usage by 1.6. Also, even the best LED backlights nowadays don't do more than 100 lm/W, a factor of 2.5x in power consumption. The backlight dc-dc converter and power adapter also have only about 50% combined efficiency. This means that a practical 24" monitor backlight and panel need about 8 times more power than my ideal system at 1.2W - that's 9.6W. Slap on the power use of a panel at 5W, a controller at another 1W and we have arrived at about 15.6W!
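To make the loss stacking explicit, here is a small sketch in Python using the rounded factors from the paragraph above:

```python
ideal_24in = 1.2  # W, the ideal 24-inch monitor calculated earlier

# Each nonideality multiplies the required power
loss_factors = {
    "panel: 16.7% -> ~10% transmittance": 1.6,
    "backlight: 250 -> ~100 lm/W LEDs":   2.5,
    "dc-dc converter + adapter (~50%)":   2.0,
}

practical = ideal_24in
for reason, factor in loss_factors.items():
    practical *= factor

print(f"backlight + panel: {practical:.1f} W")  # 9.6 W
total = practical + 5 + 1  # + panel electronics (~5 W) + controller (~1 W)
print(f"complete monitor:  {total:.1f} W")
```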

Amazingly enough, there are actually monitors that use less power than I predict in the last paragraph. Really. Apparently, they have done a really good job somewhere along the line improving the nonideal qualities of monitors. For instance, take the Iiyama E2472HDD, which has been tested to consume 13W at 140 cd/m2! This means that this monitor must use either a much more efficient backlight inverter (about 70% total efficiency is feasible) or much more economical microelectronics. I am really excited about this kind of find, to see that there are monitors that exceed my expectations on power consumption!

Now, one very important thing to note here is that LCD monitors need their backlight to run at full power even when displaying mostly black. A mostly black screen will thus be much more inefficient - the amount of light going out is reduced dramatically, but the light produced by the backlight is the same. If we want to improve on this, then we want an active-pixel monitor. How would such a monitor stack up?

Ideal wide viewing angle active pixel monitor
Even a theoretically ideal LCD monitor has inherent losses: of the light you put in, at most 16.7% (with a white screen) actually makes it out. If we use active pixels, there is no light blocking anywhere, so all our power should translate into light. Now, OLEDs being produced today have fairly poor luminous efficacy: between 10 and 20 lm/W is common. This is not a fundamental limit: in the ideal case - and this will eventually become reality - OLEDs should be able to rival 'traditional' white LEDs in efficacy. So they should be able to produce 250 lumens per watt - if they were white. We are using RGB OLEDs, though, which reduces the luminous efficacy to 1/3rd of that value. That means an ideal active-pixel monitor should only use this much power:

Screen size | Area [m2] | Ideal power consumption @ 100 cd/m2 [W]
20-inch | 0.11 | 0.42
24-inch | 0.16 | 0.60
27-inch | 0.2 | 0.75
30-inch | 0.25 | 0.96


But wait, there is more. This is only when displaying white at 100 cd/m2. In practice, when doing office work your monitor only emits about 33-50% of the light that a fully white screen emits. During gaming, watching video or editing media this figure is even lower: between 5 and 20% of white. On average, the numbers above will thus be even lower:

Screen size | Power consumption (office work) [W] | Power consumption (video/gaming) [W]
20-inch | 0.21 | 0.08
24-inch | 0.30 | 0.12
27-inch | 0.38 | 0.15
30-inch | 0.48 | 0.19
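Both tables above come from a single formula. A sketch in Python (the 'content' factor is the average picture level; the ~0.5 for office work and ~0.2 for video are my assumptions from the ranges quoted above):

```python
import math

OLED_EFFICACY = 250 / 3  # lm/W: ideal white efficacy, cut to 1/3 by RGB subpixels
BRIGHTNESS = 100         # cd/m2, full white

def ideal_oled_power(area_m2, content=1.0):
    """Power [W] of an ideal active-pixel (OLED) monitor.

    content: average picture level (1.0 = full white, ~0.5 office, ~0.2 video).
    """
    return BRIGHTNESS * math.pi * area_m2 * content / OLED_EFFICACY

print(f"24-inch, white:  {ideal_oled_power(0.16):.2f} W")       # 0.60
print(f"24-inch, office: {ideal_oled_power(0.16, 0.5):.2f} W")  # 0.30
print(f"24-inch, video:  {ideal_oled_power(0.16, 0.2):.2f} W")  # 0.12
```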


Perfect narrow-angle display
But there is more wiggle room. You thought sub-0.1W was great for a 30" monitor? Well, we can optimize screens much further than that by using for instance so-called lenticular lenses, as they are used today for 3D technology. In the above two models I assumed the screen emits light in all directions. However, only a really, really small portion of that light actually hits your eyes. The rest is lost in the space around you. If you outfit a screen with movable lenticular lenses, which are lenses that focus the light emanating from the screen into one specific direction, you can theoretically serve only the solid angle taken up by your eye. Let us calculate how much power that requires.

For video viewing, the maximum viewing distance (i.e. the distance from your eye to the screen) should be around three times the screen diagonal. In a computer monitor setup, this figure tends to get close to one time the screen diagonal. Let us assume the latter, because it is a worst-case scenario.

The human eyeball is (according to standardized metrics) 25mm in diameter. The only part that actually lets light through is the pupil, but the rest of the eyeball does need some incident light to be able to respond to changes in light intensity. This makes for an eye surface area of about 500 mm^2. Let's take a 20-inch monitor at a viewing distance of 20 inches. Light coming from the screen in all directions would be evenly spread over a half sphere, which at 20 inches distance has a surface area of about 1,621,464 square millimeters. Of that, only 500 mm^2 is incident on one eyeball, so 1000 mm^2 on two eyeballs. Now we focus all light onto just that bit of surface area. This means we only need 1/1621st of the light output we calculated earlier! Let me tabulate these results:
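The geometry fits in a few lines of Python (a sketch; the ~500 mm2 eyeball area and one-diagonal viewing distance are the same assumptions as in the text, and I only reproduce the LCD column here):

```python
import math

INCH = 0.0254
diag = 20 * INCH                       # 20-inch screen viewed at one diagonal
eye_area = 2 * 500e-6                  # m2: two eyeballs of ~500 mm2 each
half_sphere = 2 * math.pi * diag ** 2  # m2: area the light would spread over

fraction = eye_area / half_sphere
print(f"fraction of the light we actually need: 1/{1 / fraction:.0f}")  # 1/1621

lcd_20in = 0.84  # W, ideal 20-inch LCD from the first table
print(f"narrow-angle LCD power: {lcd_20in * fraction * 1e6:.0f} uW")    # 518 uW
```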

Screen size | LCD power consumption [µW] | APS power consumption [µW]
20-inch | 518 | 86
24-inch | 514 | 86
27-inch | 508 | 85
30-inch | 504 | 85


And finally, we have arrived at Valhalla. Power use is now practically independent of screen size and only dependent on viewing angle - and in this table, the viewing angle is held constant (all monitors are viewed at a distance equal to their diagonal). Also, just in case you did not notice: that table is in microwatts. That is how little light energy is actually needed to show you the information on the screen; all the other light is simply lost. And even now a significant portion of light is lost around the pupil, so we could optimize even further if we wanted to. Let us just stop here.

Even if you make the light source and matrix much less efficient, a display that directs its light beams directly at your eyes only needs a few milliwatts to function. I am really convinced that this is the future of display technology, and it is theoretical proof that, for instance, heads-up displays (the Google Glasses) and visors are not at all something that belongs only in science fiction movies. I said lenticular lenses are one way of making this happen - but that only applies to traditional stationary displays. In, for instance, a heads-up display, one might instead employ lasers to direct the light directly into the pupil. Not only because that would actually work, but also because lasers are really cool. Star Wars has lasers.

I hope this chapter has given some insight into the theoretical limits of display technology. I've made quite a few assumptions and estimates, so please don't quote me on exact numbers. When talking about lumens, I already touched upon the problem that lumens are defined for a certain spectral density distribution (CIE, PAR, etc.). Lumens can be defined differently, and as such the power figures may change quite a bit. So far, this is all theoretical, and it will take quite some time to even get close to the kinds of power consumption figures I am stating here. But at least it shows that even though people consider LCD screens to be 'power-efficient', in reality they are still throwing away many orders of magnitude more power than is theoretically necessary.

Is a 10W display possible in practice?
This is the question I actually wanted to answer. And the answer is: probably not! Which means I can hopefully falsify that hypothesis by actually doing the handiwork on a monitor.

I will not be modifying an active-pixel or narrow-angle display, as those are mostly theoretical beasts at the time of writing. So it is going to be a conventional LCD with LED-backlight and an IPS panel. The screen size I am going for is 23 or 24". This means the theoretical power consumption with an ideal panel and backlight would be between 0.84 and 1.2W. That is assuming a panel with 16.7% transmittance and a 250 lm/W backlight. A practical panel will have about 10% transmittance and likewise a practical, very efficient, backlight will be able to maybe squeeze out 125 lumens per watt. This means that for the backlight alone, I will need between 2.8 and 4W. I will need to design my own backlight converter, which will probably work at 90% efficiency, pushing the power consumption to 3.1 and 4.4W respectively. That leaves just a couple of watts for the electronics. It really depends on the efficiency of the microelectronics. It will be a challenge!
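My plan in numbers: a sketch in Python with the practical assumptions from this paragraph (10% transmittance, 125 lm/W backlight, 90% converter efficiency) for the 24-inch case; small rounding differences from the figures in the text are expected.

```python
import math

area = 0.16          # m2, a 24-inch 16:9 panel
brightness = 100     # cd/m2 target
transmittance = 0.10 # realistic IPS panel transmittance
efficacy = 125       # lm/W, a very good practical LED backlight
converter = 0.90     # efficiency target for my own backlight converter design

flux = brightness * math.pi * area             # lumens out the front (~50 lm)
backlight_w = flux / transmittance / efficacy  # backlight power alone
total_w = backlight_w / converter              # including converter losses

print(f"backlight alone: {backlight_w:.1f} W")
print(f"with converter:  {total_w:.1f} W")
```

That leaves the rest of the 10W budget for the panel electronics and controller, as discussed above.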

*Obviously, if you really want to go nitpicking here: what does 'exactly one-third of the light' mean? Light power? Number of photons? Luminous flux? There is no true answer, and usually the pigments in color filters are chosen such that (1) their area can be chosen to be similar or identical and (2) the luminous flux (i.e. lumens) through each subpixel is the same. Still, there are other approaches: RGBG PenTile displays for instance have green subpixels which by themselves put out much less luminous flux than the red and blue subpixels, due to the 'green gap' effect - the difficulty of producing green OLEDs with high luminous efficacy. Other panels use very different aperture sizes for each subpixel, although this is undesirable from many standpoints.

Disclaimer: There are more effects at play than what I have talked about in this blog post. For instance, most screens do not radiate evenly onto an imaginary half-sphere, but radiate less to the side and more perpendicular to the screen. That is generally a good thing for power use: it means that in the position where my head will be, the apparent brightness will be higher for the same power consumption. Unfortunately, the reason this is the case is that towards the side, the LCD matrix transmits light very badly (due to reflections and other optical effects). This is actually one of the main causes of light loss in the matrix: non-perpendicular losses. And there is more: there are some losses due to the dichroic effect of the polarizers, there is some additional light focusing due to the panel build-up in LCD matrices (most obviously spotted when TN panels go very dim when looked at from below), and so on. This by itself is enough to throw off my calculations by tens of percent in either direction. I am aware of these effects and if I find them to be significant, I will report on them in my build logs.

Thanks to my girlfriend, my mother, Devilly, Infant, pientertje, sebastius, Snowmiss and TheMOD for proof-reading



https://www.paypalobjects.com/en_US/i/btn/btn_donateCC_LG.gif No PayPal? Message me or e-mail me


Comments


By Tweakers user Snuffel, Monday 4 June 2012 20:36

Mighty interesting blog :)

By Tweakers user wheez50, Monday 4 June 2012 21:17

Mesmerised! Well, almost :)

What I thought about your big screens focused on only your retinas: make 'em smaller. In-ears sound massive, but really are quite small. Small lenses give a huge screen, even while being minute. So why calculate from a big screen? And why not miniaturize the screen itself as well?

By Tweakers user mux, Monday 4 June 2012 21:23

wheez50 wrote on Monday, 4 June 2012 @ 21:17:
Mesmerised! Well, almost :)

What I thought about your big screens focussed on only your retina's, make 'em smaller. In-ears sound massive, but really are quite small. Small lenses give a huge screen, even while being minute. So why calculate from a big screen? And why not miniaturize the screen itself as well?
Well, if you were to make a narrow-angle display very small, it would also appear small. You wouldn't be able to view the content on the screen. It necessarily has to be big if the screen is attached to a wall or anything else that isn't your eye or head.

The only way to also miniaturize the screen would be to do a whole different kind of projection, namely what the Google glasses do. Such methods produce a virtual image, but they require the 'display' to be mounted to your eyes or head.

Obviously, the miniature version will be much, much cheaper and easier to make. I don't think the former solution will ever be more than a theoretical device, whereas the latter is already reality with the various HUDs and visors we have today. Google glasses are the most recent and, in my opinion, sexiest solution.
