DX time signals on 60 kHz: JJY and WWVB in MSF gaps

The British time signal MSF on 60 kHz switches its 15kW carrier completely off for 500ms at the beginning of each minute, and for 100 to 300ms at the beginning of each second, depending on the clock data. During these gaps, it is possible to observe good signals from the DX time stations JJY (Saga, Japan) and WWVB (Fort Collins, Colorado).

WWVB reduces its carrier by 10dB (50kW to 5kW), also at the beginning of each second, for 800, 500 or 200ms. Therefore, one can normally observe only its low-level signal in the MSF gaps. There are a few rare exceptions: if WWVB transmits a zero while MSF's bit B is set (i.e. MSF is blanked at 200-300ms), WWVB can be seen at full power for 100ms. This currently occurs three times per minute, at seconds 10 and 11 (two of the four double dips) and at second 54.
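
The timing coincidence can be made concrete with a small Python sketch, assuming simplified keying models taken from the description above (function names are illustrative, not any published code):

  # Simplified keying models (assumed from the description above, not the full specs):
  # MSF switches off at the start of each second, with off-times set by its bits A and B;
  # WWVB drops 10dB at the start of each second for 0.2s (zero), 0.5s (one) or 0.8s (marker).

  def msf_off_intervals(bit_a, bit_b):
      """Intervals (s) within the second during which the MSF carrier is off."""
      if (bit_a, bit_b) == (0, 0):
          return [(0.0, 0.1)]
      if (bit_a, bit_b) == (1, 0):
          return [(0.0, 0.2)]
      if (bit_a, bit_b) == (0, 1):
          return [(0.0, 0.1), (0.2, 0.3)]   # bit B keyed in the 200-300ms slot
      return [(0.0, 0.3)]                    # A = 1, B = 1

  def wwvb_full_power_from(symbol):
      """Time (s) at which WWVB returns to full power."""
      return {0: 0.2, 1: 0.5, 'M': 0.8}[symbol]

  def full_power_overlap(bit_a, bit_b, wwvb_symbol):
      """Intervals where MSF is off while WWVB is already back at 50kW."""
      t = wwvb_full_power_from(wwvb_symbol)
      return [(max(lo, t), hi) for lo, hi in msf_off_intervals(bit_a, bit_b) if hi > t]

  # MSF bit B set, WWVB sending a zero: full-power WWVB visible from 200 to 300ms
  print(full_power_overlap(0, 1, 0))   # [(0.2, 0.3)]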

JJY on 60kHz is a second Japanese LF station broadcasting the same time code as the first one on 40kHz, which has been on air for several years. Interestingly, it uses an inverted format with the carrier level rising at the full second, so it is visible at its full power of 50kW when MSF is down. There is a characteristic 200ms minute marker which sits neatly in the 500ms gap. On a fast spectrogram with 43Hz bandwidth or more, you can even see the path delay (about 24ms) between MSF's keyup and JJY's keydown.
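
As a rough plausibility check, the differential free-space propagation time for the distances quoted in the text (JJY about 9000 km, MSF about 900 km) comes out in the same range:

  # Back-of-the-envelope differential path delay (straight-line propagation assumed;
  # distances are the approximate great-circle values quoted in the text).
  C_KM_PER_S = 299_792.458

  def delay_ms(distance_km):
      return 1000.0 * distance_km / C_KM_PER_S

  print(f"JJY:  {delay_ms(9000):.0f} ms")                  # ~30 ms
  print(f"MSF:  {delay_ms(900):.0f} ms")                   # ~3 ms
  print(f"diff: {delay_ms(9000) - delay_ms(900):.0f} ms")  # ~27 ms, same order as the ~24 ms seen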


Fast spectrograms: MSF minute marker with JJY; both DX signals; WWVB; double dip with WWVB low and high
(center frequency 60kHz, 43Hz/pix, 5.8ms/pix; phase is displayed as hue - the 20ms ticks are from SMPS interference)

For long-term monitoring over 24 hours, I took inverse-power weighted averages of 650 samples every 15 seconds, which effectively blanks out the strong MSF signal and restricts the measurement to the gaps. Naturally, this method cannot distinguish between the two DX signals; it simply produces the complex sum of the JJY high-power and WWVB low-power components. In December 2003, WWVB was visible here from about 22 UT to 12 UT at levels of up to -40dB relative to MSF, while JJY was dominant between 14 and 23 UT at up to -35dB, with some overlap between them.
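
One plausible reading of this averaging scheme, sketched in Python (weights taken as 1/|z|^2; variable names and block structure are illustrative, not the actual code used here):

  import numpy as np

  def inverse_power_average(z, eps=1e-12):
      """Inverse-power weighted average of complex baseband samples: weighting
      each sample by 1/|z|^2 suppresses the strong MSF bursts, so the result is
      dominated by the weak signals present in the keying gaps."""
      z = np.asarray(z, dtype=complex)
      w = 1.0 / (np.abs(z)**2 + eps)
      return np.sum(w * z) / np.sum(w)

  # One output value per 15s block of 650 complex samples:
  # block_value = inverse_power_average(samples_15s)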

Comparing these signal levels, one can try to estimate skywave propagation losses. With WWVB roughly 40dB down on MSF, the power difference (5kW vs. 15kW) accounts for about 5dB, and the 1/r field-strength law for the distance (8200 km vs. 900 km) for another 19dB. Assuming a single hop for MSF and three hops for WWVB, the remaining 16dB correspond to approximately 8dB loss per reflection. The same calculation for JJY at 9000 km yields about 10dB per hop.
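
The same budget as a few lines of Python (numbers as quoted above; the 1/r law applies to field strength, hence the 20*log10 term):

  import math

  # Level budget for WWVB vs. MSF (numbers as quoted in the text).
  observed_db    = 40.0                         # WWVB observed ~40dB below MSF
  power_db       = 10 * math.log10(15 / 5)      # 15kW vs. 5kW        -> ~4.8dB
  distance_db    = 20 * math.log10(8200 / 900)  # 1/r field strength  -> ~19.2dB
  extra_hops     = 3 - 1                        # three hops vs. a single hop
  per_reflection = (observed_db - power_db - distance_db) / extra_hops
  print(f"{per_reflection:.1f} dB per reflection")   # ~8 dB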

As the transmitted phase of time signal stations is usually kept tightly locked to an absolute UTC reference, they are an ideal source for path delay change measurements. Thus I have tried to monitor carrier phase as well. My reference was a 60kHz oscillator, locked to the line sync of the German ZDF TV signal and fed to the second receiver channel. The ZDF frame rate is controlled by the same atomic clock that steers DCF77 and is long-term traceable to UTC. - Experimenters within groundwave range of MSF could probably use a simpler setup without the second channel and use MSF itself as a phase reference.
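
In terms of processing, the phase comparison between the two receiver channels could look like the following sketch (Python; the channel names and block-wise coherent averaging are assumptions, not the actual code used here):

  import numpy as np

  def relative_phase_deg(ch_signal, ch_ref):
      """Carrier phase of the off-air 60kHz signal relative to the locally
      generated reference, averaged coherently over one block of samples."""
      ch_signal = np.asarray(ch_signal, dtype=complex)
      ch_ref = np.asarray(ch_ref, dtype=complex)
      # sum of ch_signal * conj(ch_ref): its argument is the mean phase difference
      return np.degrees(np.angle(np.vdot(ch_ref, ch_signal)))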


Long-term amplitude and phase plots of the 60kHz signals MSF and JJY + WWVB, observed in Nuernberg in December 2003

On all three of the 60kHz signals (including MSF), the increasing amplitude during the hours of darkness was accompanied by an additional phase delay of roughly one wavelength, which seems to agree intuitively with the notion of an increased D-layer height at night. Hints of the WWVB identifier (a 45-degree phase advance at minutes 10 to 15) were sometimes found, but the identifier could not be seen reliably. The somewhat periodic jitter is an artifact of the phase reference, as will be explained below.
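
For scale, one wavelength of additional delay at 60kHz corresponds to the following free-space values (the actual hop geometry is of course more involved):

  # What "one wavelength" of extra delay means at 60kHz (free-space values).
  c = 299_792_458.0      # m/s
  f = 60_000.0           # Hz
  print(f"{c / f / 1000:.1f} km of extra path length")   # ~5.0 km
  print(f"{1e6 / f:.1f} microseconds of extra delay")    # ~16.7 us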

There is a very peculiar jitter (about ±1µs over approximately 25 minutes) in the ZDF reference, which is equally present in their colour subcarrier. It had previously been attributed to the UHF terrestrial distribution network (see the project descriptions by DL7MAJ and DK7NT et al., in German). But an in-depth comparison against the analog satellite signal from the Astra cluster shows that the jitter is exactly the same on the sat downlink; the time difference between the two video routes consists solely of the 12- and 24-hour orbital and tidal shifts in the sat position, without jitter. So I have come to the conclusion that the jitter is actually an inaccuracy in the central ZDF timebase, which limits the short-term frequency accuracy of simple ZDF-derived frequency standards to about 1 part in 10^8. This is somewhat annoying, as the transfer accuracy of a TV signal could potentially be better than 1 in 10^12 with one-minute averaging. - Any comments on the possible origin of the ZDF jitter would be most welcome.
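
The 10^-8 figure follows roughly from the phase wander divided by the averaging time; taking the one-minute averaging mentioned above as an illustrative assumption:

  # Fractional frequency error ~ phase wander / averaging time (illustrative numbers).
  phase_wander_s = 1e-6    # ~1 µs jitter seen on the ZDF reference
  tau_s = 60.0             # one-minute averaging, as mentioned above
  print(f"{phase_wander_s / tau_s:.1e}")   # ~1.7e-08, i.e. about 1 part in 10^8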

Homepage of DF6NM