[time-nuts] Logging the grid frequency....

Tom Van Baak tvb at LeapSecond.com
Fri Mar 1 06:26:58 EST 2013


> Ok, it's a relative measurement... now I understand your data. Thank you.

Not sure it's a relative measurement. It's probably best to call it a time error measurement, or a phase measurement (in the timekeeping sense, not the 2*pi sense). Just consider a 60 Hz wall clock; the data is how far wall clock time differs from true time, measured at every zero-crossing. Subtracting successive data points then gives you the period (e.g., 0.0166667 s).
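As a quick sketch of that last point (using the sample values from format #6 below, scaled to integer 100 ns ticks so the arithmetic is exact):

```python
# Zero-crossing times of a 60 Hz signal, in integer 100 ns ticks
# (same sample values as format #6 below; integers avoid float rounding).
crossings = [850802279906, 850802446570, 850802613322,
             850802779954, 850802946650]

# Subtracting successive readings gives the per-cycle period,
# nominally 166666.7 ticks (1/60 s).
periods = [b - a for a, b in zip(crossings, crossings[1:])]
print(periods)
```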

This brings up a very interesting question.

Many of us continuously log data from counters and such. Although CPU speed and disk capacity are high and costs are low, it is still worth considering the format of the data we collect -- especially data collected over months or years.

For example, if I measure 60 Hz zero-crossings for an hour with 100 ns resolution, what is the most compact way to format the data? That's 216,000 samples per hour; 5.184 million per day. Consider these methods:

1) ctime-like time-stamp of zero-crossing
Fri Feb 22 23:38:00.2279906 2013
Fri Feb 22 23:38:00.2446570 2013
Fri Feb 22 23:38:00.2613322 2013
Fri Feb 22 23:38:00.2779954 2013
Fri Feb 22 23:38:00.2946650 2013

2) fractional unix time_t of zero-crossing
1361605080.2279906
1361605080.2446570
1361605080.2613322
1361605080.2779954
1361605080.2946650

3) ISO 8601-style (see xkcd.com/1179)
2013-02-22 23:38:00.2279906
2013-02-22 23:38:00.2446570
2013-02-22 23:38:00.2613322
2013-02-22 23:38:00.2779954
2013-02-22 23:38:00.2946650

4) Modified Julian Day (bad idea)
56345.9847248610
56345.9847250539
56345.9847252469
56345.9847254398
56345.9847256327

5) UTC time, one file per day (date in filename)
23:38:00.2279906
23:38:00.2446570
23:38:00.2613322
23:38:00.2779954
23:38:00.2946650

6) STOD (seconds time of day), one file per day
85080.2279906
85080.2446570
85080.2613322
85080.2779954
85080.2946650

7) per-cycle period measurement, date/time implied
0.0166664
0.0166752
0.0166632
0.0166696

8) per-cycle frequency measurement (bad idea)
60.00096002
59.96929572
60.01248260
59.98944186

9) period minus some nominal value (0.0166666)
-0.0000002
0.0000086
-0.0000034
0.0000030

10) period - nominal (0.0166666) and scaled to integer 100 ns units
-2
86
-34
30

11) successive difference of period in 100 ns units
166664
88
-120
64
-24

12) time/phase difference between DUT and REF
0.0000003
-0.0000083
-0.0000048
-0.0000077

Most of us with TICs (time interval counters) use data like #12 because it's natural and convenient, and each datum stands on its own. This time series format can be fed directly into time/frequency/stability calculations.
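For instance, here is a minimal sketch (not a production stability suite) of the standard Allan variance estimate applied directly to a #12-style phase/time-error series, with tau0 = 1/60 s for per-cycle samples:

```python
import math

# Allan deviation from phase/time-error samples x[i] taken every tau0
# seconds, via the standard estimator:
#   AVAR(tau0) = sum((x[i+2] - 2*x[i+1] + x[i])**2) / (2 * tau0**2 * (N-2))
def adev(x, tau0):
    d2 = [x[i + 2] - 2 * x[i + 1] + x[i] for i in range(len(x) - 2)]
    return math.sqrt(sum(d * d for d in d2) / (2 * tau0 ** 2 * len(d2)))

# Format #12 style data: DUT-minus-REF time difference each cycle.
phase = [0.0000003, -0.0000083, -0.0000048, -0.0000077]
print(adev(phase, 1 / 60))
```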

The most compact formats for 60 Hz appear to be #10 or #11, even if they are awkward to use (but a program can easily convert among any of these formats).
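To illustrate how easy the conversion is, here is a sketch that turns the format #6 samples above into format #10 (period minus nominal, in 100 ns units) and format #11 (successive differences of period), working in integer ticks to sidestep floating-point rounding:

```python
# Format #6 samples (seconds time of day) from the examples above.
stod = [85080.2279906, 85080.2446570, 85080.2613322,
        85080.2779954, 85080.2946650]

# Convert to integer 100 ns ticks so all later arithmetic is exact.
ticks = [round(t * 1e7) for t in stod]
NOMINAL = 166666  # 0.0166666 s in 100 ns units

periods = [b - a for a, b in zip(ticks, ticks[1:])]

# Format #10: period minus nominal.
fmt10 = [p - NOMINAL for p in periods]

# Format #11: first period absolute, then successive differences.
fmt11 = [periods[0]] + [b - a for a, b in zip(periods, periods[1:])]

print(fmt10)
print(fmt11)
```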

Aside from cheap tricks like using binary instead of ASCII, or running the files through a compression utility (gzip, zip, etc.), does anyone have additional ideas for a compact streaming decimal data format?
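For what the cheap trick is worth, here is a quick sketch comparing raw and gzipped sizes of an hour of data in formats #10 and #11. The data is synthetic (nominal periods plus a few hundred ns of random jitter), purely to get a feel for the numbers:

```python
import gzip
import random

# One synthetic hour of 60 Hz periods in 100 ns ticks, with fake jitter.
random.seed(1)
periods = [166666 + random.randint(-100, 100) for _ in range(216000)]

# Format #10: period minus nominal, one value per line.
fmt10 = "\n".join(str(p - 166666) for p in periods).encode()

# Format #11: first period absolute, then successive differences.
deltas = [periods[0]] + [b - a for a, b in zip(periods, periods[1:])]
fmt11 = "\n".join(str(d) for d in deltas).encode()

for name, raw in (("#10", fmt10), ("#11", fmt11)):
    print(name, len(raw), "raw ->", len(gzip.compress(raw)), "bytes gzipped")
```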

Thanks,
/tvb



