When a new hospital in New England opened, it expected about 5,000 heating degree-days (HDD) per winter, based on the weather bureau's 30-year average. But its first winter was exceptionally cold, registering 5,500 HDD over the eight-month winter period. The facility manager noted that the hospital burned 267,480 CCF of natural gas during those winter months, so he figures his usage rate was 48.63 CCF/HDD (i.e., 267,480 CCF ÷ 5,500 HDD). Based on this factor, he calculates that a normal 5,000-HDD winter would require 48.63 CCF/HDD x 5,000 HDD = 243,150 CCF.
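For concreteness, here is the manager's arithmetic as a minimal Python sketch. The figures come from the passage; the variable names are illustrative. Note that the code carries full precision through, so its projection (about 243,164 CCF) differs slightly from the 243,150 CCF above, which rounds the rate to 48.63 before multiplying.

```python
# The manager's degree-day proration, using the figures from the passage.
# His implicit assumption: every CCF of gas scales directly with HDD.

winter_gas_ccf = 267_480   # gas burned over the eight-month winter (CCF)
actual_hdd = 5_500         # observed heating degree-days that winter
normal_hdd = 5_000         # weather bureau's 30-year average HDD

rate_ccf_per_hdd = winter_gas_ccf / actual_hdd      # ~48.63 CCF/HDD
projected_gas_ccf = rate_ccf_per_hdd * normal_hdd   # ~243,164 CCF

print(f"Usage rate: {rate_ccf_per_hdd:.2f} CCF/HDD")
print(f"Projected normal-winter use: {projected_gas_ccf:,.0f} CCF")
```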
What's wrong with this method?