Quote:
Originally Posted by IndySam
But your calculations make my point. Instead of using average costs for the country, you picked a rate that's almost 4 times what I would pay, a light bulb cost that's way too high, and a usage rate above what my hardest-used light bulb would generate. Things like that make my skeptic alarm go off, big time.
That's what salespeople do: exaggerate things to prove their point.
I did the calculations for both Connecticut (as an extreme case) and the US national average. The US average is $0.1153/kWh, and the average residential rate in Indiana for August 2010 was $0.091/kWh. Redoing the math from my first post with the Indiana rate yields annual energy savings of $14.43. Running that through the annual worth equation (which spreads the bulb's purchase price over its service life at interest) gives a net annual worth of $11.77. At those usage levels it would take just over two years to break even; at lower usage (four hours per day), roughly 4.5 years. Over the lifespan of the bulb (neglecting inflation and electricity price changes), that still works out to a net profit of $128.
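For anyone who wants to plug in their own numbers, here is a minimal sketch of that break-even arithmetic in Python. The wattages, bulb price, service life, and interest rate below are assumed placeholders rather than the figures from my first post; only the $0.091/kWh Indiana rate comes from this thread.

[CODE]
# Minimal sketch of the break-even arithmetic above. Inputs marked
# "assumed" are hypothetical placeholders, not the figures from my
# first post; only the Indiana rate comes from this thread.

def annual_energy_savings(watts_old, watts_new, hours_per_day, price_per_kwh):
    """Dollars saved per year from running the lower-wattage bulb."""
    kwh_saved = (watts_old - watts_new) / 1000.0 * hours_per_day * 365
    return kwh_saved * price_per_kwh

def capital_recovery_factor(i, n):
    """A/P factor: spreads a present cost over n years at interest rate i."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

rate = 0.091          # $/kWh, Indiana residential average cited above
savings = annual_energy_savings(60, 14, 8, rate)  # assumed 60 W -> 14 W bulb, 8 h/day
bulb_premium = 20.00  # assumed extra purchase cost of the efficient bulb
life_years = 10       # assumed service life
annual_worth = savings - bulb_premium * capital_recovery_factor(0.05, life_years)

print("Raw annual savings: $%.2f" % savings)
print("Net annual worth:   $%.2f" % annual_worth)
print("Simple payback:     %.1f years" % (bulb_premium / savings))
[/CODE]

Swap in your own rate, wattages, and daily hours to see how sensitive the payback period is to usage.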
Either way, these calculations won't matter much after 2012, when new energy efficiency standards take effect requiring standard light bulbs producing between 310 and 2600 lumens to be roughly 30% more efficient than baseline incandescent bulbs, or after 2020, when a tighter backstop will require bulbs to produce at least 45 lumens per watt. While this has spurred more innovation in lighting technology in the past five years than in the previous fifty, the payoff from the improved technologies has so far come mostly as long-term operating savings rather than lower purchase prices. As the technology matures and production ramps up, the upfront prices of these bulbs (whether CFL, LED, or high-efficiency incandescent) will likely approach parity with plain incandescent bulbs.
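To put that 45 lumens-per-watt backstop in perspective, here's a quick comparison of luminous efficacies for common bulb types; the lumen and wattage figures are typical published values, used only for illustration:

[CODE]
# Luminous efficacy (lumens per watt) for common bulb types, checked
# against the 45 lm/W backstop. Lumen/watt figures are typical published
# values, used only for illustration.
bulbs = {
    "60 W incandescent":         (800, 60),   # (lumens, watts)
    "43 W halogen incandescent": (750, 43),
    "14 W CFL":                  (800, 14),
    "10 W LED":                  (800, 10),
}
for name, (lumens, watts) in bulbs.items():
    efficacy = lumens / float(watts)
    verdict = "meets" if efficacy >= 45 else "falls short of"
    print("%s: %.0f lm/W -> %s the 45 lm/W backstop" % (name, efficacy, verdict))
[/CODE]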