I recently had an energy audit done of my home. During the audit they tested my gas boiler (Burnham Holiday, 172,000 BTU/hr input, 140,000 BTU/hr output, 106,000 IBR - cast iron, approx. 1960 vintage) and found it operating at 81% combustion efficiency. This was a disappointment to me because I had hoped to replace it and reap significant efficiency savings. If it's already about 80% efficient, a new one won't be much better.

The boiler was actually consuming only 1.5 CF of gas per minute, or about 90,000 BTU/hr, because I have turned the shut-off valve down some to reduce flow. The boiler is grossly oversized: on the coldest day of winter the home requires only 12 CCF total to keep it warm, or about 50,000 BTU/hr.
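For anyone checking my arithmetic, here's how those two figures work out. This is just a sanity-check sketch; the 1,000 BTU/CF heating value for natural gas is an assumption (a typical round number - your utility bill lists the actual value for your area).

```python
# Sanity check of the gas-input and design-day load figures above.
BTU_PER_CF = 1000          # assumed heating value of natural gas (typical)

# Boiler input: 1.5 CF/min clocked at the meter
input_btu_hr = 1.5 * 60 * BTU_PER_CF
print(input_btu_hr)        # 90000.0 BTU/hr

# Design-day load: 12 CCF (1 CCF = 100 CF) burned over 24 hours
design_load_btu_hr = 12 * 100 * BTU_PER_CF / 24
print(design_load_btu_hr)  # 50000.0 BTU/hr
```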

Here's the riddle: when I turned up the aquastat to test efficiencies at different water temperatures, the boiler topped out at 160F (even though the aquastat was set at 180F). This means the hydro-air blower was pulling heat off as quickly as the boiler could produce it, so the boiler was in equilibrium and no longer heating up.

I looked up the rated BTU output of the air handler and found that for a water temp of 160F and a water flow of 6 gpm (generous, considering it's a Taco 007 pumping through maybe 100' of 3/4" copper piping round trip from the boiler in the basement to the air handler in the attic), the rated output was approx. 50,000 BTU/hr. The boiler should have been putting out 72,000 BTU/hr at 80% efficiency (0.8 x 90,000 BTU/hr). What happened to the extra 22,000 BTU/hr? Could it possibly be lost from the boiler jacket (which isn't noticeably warm) or in the 100' of mostly insulated copper pipe? It seems unlikely.
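Here's the energy balance laid out, along with one more number that makes the pipe-loss explanation look implausible: spread over the 100' round-trip run, the missing heat would have to be about 220 BTU/hr per foot of pipe, far more than insulated 3/4" copper should shed. (The per-foot framing is my own back-of-the-envelope reasoning, not a measured figure.)

```python
# Energy balance for the "missing BTU" puzzle, using the numbers above.
INPUT_BTU_HR = 90_000        # clocked gas input
COMBUSTION_EFF = 0.80        # measured combustion efficiency (rounded)
HANDLER_BTU_HR = 50_000      # rated air-handler output at 160F / 6 gpm
PIPE_FEET = 100              # round-trip supply + return length

boiler_output = INPUT_BTU_HR * COMBUSTION_EFF
missing = boiler_output - HANDLER_BTU_HR
print(missing)                      # 22000.0 BTU/hr unaccounted for

# If all of it were pipe loss, each foot of pipe would have to shed:
print(missing / PIPE_FEET)          # 220.0 BTU/hr per foot
```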

Is it possible that the combustion testing didn't give an accurate read?