The Allan variance, σ²[τ], or its square root (the Allan deviation, σ[τ]), is a quantity (a function of the averaging time τ) which is said to be a measure of (or related to) the "stability of clocks".
For a recent example cf. "First accuracy evaluation of NIST-F2" (T. P. Heavner et al.), especially
"Figure 1. The Total deviation (TOTDEV) of NIST-F2".
Clearly, this quantity refers to a single clock by itself; e.g. "the NIST-F2" in the article.
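As a sketch of what such a single-clock stability figure means numerically (my own illustration, not taken from the article), the overlapping Allan deviation can be computed from a clock's time-error samples x[n], taken against an ideal reference at a fixed sampling interval τ0:

```python
import numpy as np

def allan_deviation(x, tau0, m):
    """Overlapping Allan deviation at averaging time tau = m * tau0.

    x    -- time-error samples of one clock vs. an ideal reference, in seconds
    tau0 -- sampling interval, in seconds
    m    -- averaging factor (integer, 1 <= m < len(x) // 2)
    """
    x = np.asarray(x, dtype=float)
    # Second differences of the time error over spans of m samples.
    d = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    avar = np.sum(d ** 2) / (2.0 * (len(x) - 2 * m) * (m * tau0) ** 2)
    return np.sqrt(avar)

# Sanity check: a clock with a pure constant frequency offset has x linear
# in time, so its second differences (and Allan deviation) are ~0.
x = 1e-9 * np.arange(1000)  # 1 ns/s constant rate offset
print(allan_deviation(x, 1.0, 10))  # prints a value ~0
```

The second-difference form is why a constant frequency offset does not contribute: only changes of frequency (instability) survive the differencing.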
However, the Wikipedia page points out that for "practical measurements":
Likewise, the article on the NIST-F2 states that
"The measurement was made using a commercial hydrogen maser as a reference."
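In such a two-clock comparison the directly measurable quantity is, roughly, the phase difference between the two signals. As a hedged sketch in my own notation (not the article's, and assuming for simplicity that both clocks share a common nominal angular frequency ω0), the relative time error between device under test and reference would be:

```python
import numpy as np

def relative_time_error(phi_a, phi_b, omega0):
    """Relative time error x[n] between two clocks, from their total phases.

    phi_a  -- total phase samples of the device under test, in radians
    phi_b  -- total phase samples of the reference, in radians
    omega0 -- common nominal angular frequency, rad/s (an assumption here;
              the two clocks' nominal frequencies may in general differ)
    """
    return (np.asarray(phi_a) - np.asarray(phi_b)) / omega0

# Sanity check: identical phase records give zero relative time error.
t = np.linspace(0.0, 1.0, 100)
omega0 = 2 * np.pi * 10.0e6   # hypothetical 10 MHz nominal frequency
phi = omega0 * t
print(relative_time_error(phi, phi, omega0).max())  # -> 0.0
```

The Allan deviation of such a difference series then reflects the combined instability of both clocks, which is exactly why the choice of reference matters.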
With two clocks involved, there is necessarily some concern about clock drift; indeed:
My question:
Can you please give an expression (as explicitly as is reasonably achievable here)
for this mentioned "(raw) output result before post-processing",
in terms of
the total phase Φa[t] of the "device under test",
the total phase Φb[t] of the "reference",
and other parameters as needed, such as perhaps the nominal angular frequencies ωa and ωb of the device under test and of the reference, respectively?