The very definition of 1 therm is 100,000 BTU; I can't figure out where you came up with a bizarre number such as 1 therm = 1000043.059.
The number of BTU per cubic foot varies with the quality of the gas going into the local gas grid, but typically it's pretty close to 1 therm/ccf. Pure methane is considerably below that; the national average is between 1.02 & 1.03 therms/ccf. If your gas supplier is delivering 0.999661 therms/ccf it's somewhat below the national average, but still way higher than pure methane. Given the accuracy of your calculation it's pointless to take it to even three significant digits. The granularity of degree-day data isn't that precise, you probably didn't read the meter at exactly midnight either, and your house probably isn't at the airport right next to their weather monitoring instruments (or is it?). You might as well call it 100,000 BTU/ccf.
And yes, there are a few steps missing. The number you are probably looking for is in units of BTU per hour, but at the 99% outside design temp, not some random late-fall daily average temp. The information you get to start with is in units of therms (or ccf) per degree-day, with "degree" defined as how far the binned hourly mean temperature falls below a presumptive base heating/cooling balance temp of 65F (which will be close, for most houses).
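For anyone fuzzy on how those degree-days get tallied, here's a minimal sketch. The base-65F convention is as described above; the week of daily mean temps is made-up sample data, not numbers from this thread:

```python
# Sketch: how base-65F heating degree-days (HDD) are tallied from daily means.
BASE_TEMP_F = 65.0

def heating_degree_days(daily_mean_temps_f):
    """Sum of (base - daily mean) over days below the balance point."""
    return sum(max(0.0, BASE_TEMP_F - t) for t in daily_mean_temps_f)

# Hypothetical late-fall week of daily mean temps (made-up illustration data):
week = [44.0, 38.5, 41.0, 35.0, 47.5, 50.0, 39.0]
print(heating_degree_days(week))  # 160.0 HDD for that week
```

(Weather stations actually bin it hourly rather than daily, which shifts the totals a little, but the idea is the same.)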
The way to get from therms/HDD to BTU/hr at the 99% outside design temp goes something like this:
30 therms is 3,000,000 BTU, so 3,000,000 BTU/178 HDD works out to ~16,854 BTU per HDD.
Assuming you're burning gas in a condensing boiler with output temps at or under 125F you'd be getting about 95% efficiency out of it, so the heat going into your house (as opposed to out the exhaust venting) is 0.95 x 16,854 = ~16,000 BTU/HDD.
With 24 hours in a day, per degree-hour that's 16,000/24 = 667 BTU per degree-hour.
In Roanoke the 99% outside design temp is +17F. Heating degree-days are measured at a presumed heating/cooling balance point of 65F, so your heating-degrees are 65F-17F= 48F.
That implies your heat load at the 99% outside design temp is 667 BTU/degree-hour x 48F degrees = ~32,000 BTU/hr, give or take.
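The whole chain above fits in a few lines if you'd rather let a script do the arithmetic. The numbers are the ones from this thread (30 therms over 178 HDD, 95% condensing efficiency, +17F design temp, 65F balance point); swap in your own:

```python
# Fuel use per heating degree-day -> heat load at the 99% outside design temp.
THERM_BTU = 100_000          # by definition

therms = 30.0                # gas burned over the measurement period
hdd = 178.0                  # base-65F heating degree-days over that period
efficiency = 0.95            # condensing boiler at <=125F output temps
design_temp_f = 17.0         # Roanoke's 99% outside design temp
balance_temp_f = 65.0        # presumed heating/cooling balance point

btu_per_hdd = therms * THERM_BTU / hdd              # ~16,854 fuel BTU per HDD
heat_per_hdd = efficiency * btu_per_hdd             # ~16,000 BTU into the house
btu_per_degree_hour = heat_per_hdd / 24.0           # ~667 BTU/degree-hour
design_load = btu_per_degree_hour * (balance_temp_f - design_temp_f)

print(round(design_load))  # 32022, i.e. ~32,000 BTU/hr
```

Don't read anything into the trailing digits; as noted above, the inputs aren't good to anywhere near that precision.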
If your radiation square-feet equivalent is 580 sq.ft., at the 99% outside design condition that's 32,000/580 = ~55 BTU per square foot. Taking a peek at the graph on page 2 of this simple guide, it crosses the 55 BTU per square foot line at about 122F average water temp.
If your system is pumping radiation water at a rate that delivers a 20F delta-T at about those temps, you would need 132F output and be getting a 112F return. If it's more like a 10F delta-T you'd be looking at ~127F out, ~117F return.
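Same simple arithmetic there: supply and return straddle the average water temp by half the delta-T. A quick sketch, using the 122F average from the chart lookup above (the delta-Ts are just the two example flow assumptions):

```python
# Supply/return temps for a target average water temp (AWT) and delta-T.
def supply_return(awt_f, delta_t_f):
    """Supply runs delta_t/2 above the AWT; return runs delta_t/2 below."""
    return awt_f + delta_t_f / 2.0, awt_f - delta_t_f / 2.0

print(supply_return(122.0, 20.0))  # (132.0, 112.0) - the 20F delta-T case
print(supply_return(122.0, 10.0))  # (127.0, 117.0) - the 10F delta-T case
```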
Since you probably used some gas for hot water etc (or did you not bathe or wash dishes/clothes that week? :-) ), start out setting up the curve for 125F output @ +17F and see if it still keeps up when it gets really cold out. If you go through a night where it drops to the low 20s and it doesn't lose ground, back it off to 120F out @ +17F. If it doesn't quite keep up when the temps are in the low 20s bump it up to 130F @ +17F. But that's going to be about the right range, if you measured up the radiators correctly.
There are some other factors that will skew it: a 40 mph wind on a cold winter night would increase the heat load enough to matter, and if your fuel-use measurement week was calm & sunny, and the house has a lot of south-facing windows and high R-values (not likely, given the ~32K estimated heat load), the passive gains might be enough to more than offset your domestic hot water gas use or even a good fraction of your space heating use. But the calculation is good enough to get you to a reasonable starting point for tweaking the outdoor reset curve.