When I first moved to the United States from England some fourteen years ago, my TV and movie viewing experience consisted of a 21" standard definition TV (with cable!), a VCR and, if I could be bothered to turn it on, a stereo amplifier that was connected to the TV.
How times have changed. Today, my wife and I enjoy High Definition (HD) video on a 52" LCD TV, with video sources from a Comcast dual-tuner HD DVR, an XBOX 360 Media Center Extender and a Blu-ray player. Oh, and don't forget the AV receiver that is capable of blasting 7.1 channels at 140 watts each in Dolby TrueHD.
Sounds pretty awesome, right? Compared to my old setup it certainly is. However, while many of the components are Energy Star-rated, the dark side is that it's consuming quite a bit of power as well. But maybe not as much as you think.
Let's start with the TV. The good news is the TV I'm using (a Samsung LN52A850) is an Energy Star-rated LCD model that only uses about 170 watts of power in "eco power" mode. Equivalently sized plasma TVs can consume twice that or more. Fortunately, most new TVs consume virtually no power (less than a watt) in standby mode, although I've seen some rather egregious cases of TVs consuming more than 70W in standby. CNET's reviews, with their handy 'juice box' feature, are a useful resource when evaluating TV power. At CES 2009, manufacturers were touting their green credentials and showing off TVs that used even less power, including a unit from Sony that turns itself off if no one is in the room.
Now when I say that my TV 'only' uses 170W, it's worth comparing to where TVs used to be. To take just one data point, my 10+ year old 27" CRT TV pulls about 110W when powered on, which might sound better, but it also uses about 20W in standby. This can really add up. If both TVs are on 5 hours a day and in standby for the rest, my LCD would use about 850 watt-hours a day, whereas my old TV would use about 930 watt-hours! Over a year at 10 cents a kWh, that equates to about $31 and $34 respectively. While the savings aren't compelling enough to make a case to your spouse for a new TV, you could save 380 watt-hours a day by using a power strip to cut power to the CRT TV when you're not watching it. However, this isn't always convenient (that's what remote controls are for, right?), it takes a while to get into the habit of turning off the power strip, and many late-generation CRT TVs go through their full startup sequence every time power is restored.
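For the curious, here is a quick sanity check of those numbers (a sketch; the wattages and the 10-cents-per-kWh rate come from my measurements above, and the 5-hour viewing day is an assumption):

```python
PRICE_PER_KWH = 0.10  # dollars, from the estimate above

def daily_wh(active_w, standby_w, hours_on=5):
    """Watt-hours consumed over one 24-hour day."""
    return active_w * hours_on + standby_w * (24 - hours_on)

lcd = daily_wh(170, 0)    # new LCD: negligible standby draw
crt = daily_wh(110, 20)   # old CRT: 20 W even when 'off'
print(lcd, crt)           # 850 930 (watt-hours per day)
print(round(lcd * 365 / 1000 * PRICE_PER_KWH),
      round(crt * 365 / 1000 * PRICE_PER_KWH))  # 31 34 (dollars per year)
```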
Cable box vs Media Center vs Media Center Extenders
Now onto cable. Back in the old days, you used to just hook the cable directly to the TV, which didn't consume any additional power. That all changed with the advent of 'digital' cable, and more recently with digital video recorders (DVRs). The Comcast HD DVR (Motorola 6412) apparently uses over 40W all of the time, whether it's on or in 'standby'. This equates to 350 kWh in one year, about half the average home's monthly power usage (and about $35). A Windows Vista Media Center PC, on the other hand, will use a little more - say about 75W when it's on - but its standby mode only consumes about 3W. Assuming it's on for 10 hours a day (5 hours watching and 5 hours recording), that's about 290 kWh per year, 60 kWh less than the cable box.
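The annual comparison can be sketched the same way, using the wattages and duty cycles quoted above:

```python
def annual_kwh(active_w, standby_w, hours_on_per_day):
    """Approximate kWh per year for a device with the given duty cycle."""
    daily_wh = active_w * hours_on_per_day + standby_w * (24 - hours_on_per_day)
    return daily_wh * 365 / 1000

dvr = annual_kwh(40, 40, 24)  # Comcast DVR: ~40 W around the clock
pc = annual_kwh(75, 3, 10)    # Media Center PC: 10 h on, ~3 W asleep
print(round(dvr), round(pc))  # 350 289 (kWh per year)
```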
If you're using a Media Center Extender such as an XBOX 360, then the math can get a little tricky. The worst-case scenario is when your Media Center PC records a program and you watch it later: the PC is on to record the program, and then both the PC and the XBOX 360 are on to watch it. The current generation of XBOX uses about 119 watts in Media Center mode, so a one-hour show recorded earlier can add up to over 269 watt-hours (about 75 Wh for the PC while recording, plus 194 Wh for the PC and XBOX together during playback), and that's not including the TV or the amplifier power. In my setup, watching an episode of Heroes via the XBOX extender with audio through my AV receiver uses over 700 watts, vs about 400W using the Comcast DVR. But you can have a lot more storage for HD content on the Media Center PC, plus you have access to your shows on multiple TVs throughout the house (with additional extenders) and can make the shows portable (very handy when you are on vacation with young children).
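One way to make that worst-case figure concrete is to tally the watt-hours for a single one-hour show (my reading of the numbers above; the split between recording and playback is an assumption):

```python
PC_W, XBOX_W = 75, 119  # wattages measured earlier

record_wh = PC_W * 1               # PC alone while recording for 1 hour
playback_wh = (PC_W + XBOX_W) * 1  # PC plus XBOX during 1 hour of playback
print(record_wh, playback_wh, record_wh + playback_wh)  # 75 194 269
```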
The silver lining of using an XBOX 360 as a media center extender is that the 360 can wake your sleeping PC when you want to watch media via the extender functionality. You enable this by turning on wake-on-LAN "magic packet" support on the PC's network card (instructions on how to do this are available here). When the media center extender software is started on the XBOX, it automatically sends a magic packet to wake the sleeping PC, so your PC only has to be on when you are using it (locally or remotely) or when it is recording a program. Even if the XBOX displays a message that it cannot connect, it will connect without any further user intervention once the PC is ready. It's fairly family-proof - my wife uses it all the time.
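For the technically inclined, the magic packet itself is simple: a UDP broadcast containing 6 bytes of 0xFF followed by the target machine's MAC address repeated 16 times. A minimal sketch (the MAC address below is a made-up placeholder; use your PC's actual NIC address):

```python
import socket

def magic_packet(mac):
    """Build the wake-on-LAN payload: 6 x 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac, port=9):
    """Broadcast the magic packet on the local network (UDP port 9)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), ("255.255.255.255", port))

# wake("01:23:45:67:89:ab")  # placeholder MAC address
```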
So what about the AV receiver? I just replaced my 7-year-old Sony receiver with a new Onkyo TX-SR876 (also Energy Star rated), principally so I could use HDMI for my components (the old receiver is now downstairs). The new receiver is very efficient in standby (less than 1W) and doesn't come on automatically after a power outage like my old Sony receiver did!
The rub with AV receivers is that if you connect your components via HDMI to the receiver (and connect the TV to the receiver), you are going to be forced to watch your components with the receiver turned on. Given that the receiver and subwoofer use a fair bit of energy while on, I wanted to have the option of saving energy while watching TV shows that didn't justify the added energy cost of surround sound. As a compromise I have connected the cable box and the XBOX 360 directly to the TV (utilizing all three of the rear HDMI inputs), and connected digital audio (coax or optical) from these components to the receiver. Only the Blu-ray player is connected directly to the receiver. While this works pretty well with an advanced remote control (see below), this isn't particularly easy and kind of undermines one of the main value propositions of HDMI - i.e. only needing one cable.
To keep things easy to use, I have different 'activities' programmed on my Harmony remote to watch TV and use the XBOX - three with the receiver on and three "green" activities that use only the TV speakers (and leave the receiver off). I didn't have this option with my old TV and receiver, so ironically this new equipment allows me to use it less and save energy. (For the record, while I generally like my remote, next time I would choose something like the Harmony 890 for its RF capabilities - getting "line of sight" to components can be difficult, and frankly the touchscreen on the Harmony One isn't ideal - I think hard buttons are probably better.)
As it happens, both my TV and AV receiver support a new standard called HDMI-CEC. This feature allows you to control the receiver via the TV, so you could just use the TV remote if you wished. I noticed a short disclaimer in the manual that using this feature would significantly increase the receiver's standby power. This piqued my interest, and it turns out they are right - with the feature enabled, the receiver uses about 80W in standby, or about 700 kWh a year, nearly a month's worth of electricity usage. Kudos to Onkyo for pointing that out in the manual, but suffice it to say, that feature will not be enabled in my house!
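The arithmetic behind that standby penalty is sobering (using my measured 80W figure and the 10-cents-per-kWh rate from earlier):

```python
cec_standby_w = 80  # measured standby draw with HDMI-CEC enabled

kwh_per_year = cec_standby_w * 24 * 365 / 1000
print(round(kwh_per_year))         # 701 kWh per year, just in standby
print(round(kwh_per_year * 0.10))  # 70 dollars per year at 10 cents/kWh
```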
Also, the Sony Blu-ray player has an option to start up faster, but at the cost of using more energy in standby (again, the player tells you that it is going to use more energy). I haven't measured the energy use of this fast-startup mode, but given my previous measurements, it's not something I'll be enabling in a hurry.
So what's the upshot of all of this?
Needless to say, this is a mixed bag. It's been clear for quite a long time that our penchant for higher-fidelity, personalized entertainment has ramifications for our energy bills. However, if configured optimally, these newer devices can use little to no energy when turned off and may, in aggregate, actually consume less power than older devices. Cable and satellite companies are hopefully taking note and planning set-top boxes that consume near-zero watts in standby.
Also, while I applaud device makers that are reducing both operational and standby power, there are worrying signs that these efforts could be quickly undermined by brand-new 'user friendly' features, such as the aforementioned HDMI-CEC, that wipe out any savings. The industry could do more - in particular, "green" AV receivers that provide a passive HDMI pass-through mode consuming little to no power whether active or in standby. This would make it much easier to install and configure systems that let families enjoy high-fidelity audio when it makes sense, rather than having the receiver on every time the TV is on, saving lots of energy as a result.
So, how green is your AV experience?