Hi,
In the data collected to date for an NTP measurement (#80845985), I find a
single strange outlier in the calculated RTT value:
{
  "fw": 5090,
  "mver": "2.6.4",
  "lts": 281,
  "dst_name": "time2.cloud.tencent.com",
  "ttr": 0.413039,
  "dst_addr": "119.28.229.79",
  "src_addr": "45.11.104.146",
  "proto": "UDP",
  "af": 4,
  "li": "no",
  "version": 4,
  "mode": "server",
  "stratum": 2,
  "poll": 8,
  "precision": 1.19209E-7,
  "root-delay": 0.0000915527,
  "root-dispersion": 0.0249329,
  "ref-id": "647a24c4",
  "ref-ts": 3939345482.4404373,
  "result": [
    {
      "origin-ts": 3939346062.989854,
      "receive-ts": 3939346062.999974,
      "transmit-ts": 3939346063.0000167,
      "final-ts": 3939346062.995442,
      "rtt": 4294967296.005545,
      "offset": -0.007347
    }
  ],
  "msm_id": 80845985,
  "prb_id": 7030,
  "timestamp": 1730357262,
  "msm_name": "Ntp",
  "from": "45.11.104.146",
  "type": "ntp",
  "group_id": 80845985,
  "stored_timestamp": 1730357293
}
When I do the math by hand, the "rtt" value I get is 0.0055453, nicely
matching the fractional part of the value found in the measurement data.
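
In Python form, assuming the usual NTP formula
rtt = (final - origin) - (transmit - receive), my calculation looks like this:

# Hand calculation of the RTT from the four timestamps in the result above.
origin_ts   = 3939346062.989854   # client: request sent
receive_ts  = 3939346062.999974   # server: request received
transmit_ts = 3939346063.0000167  # server: reply sent
final_ts    = 3939346062.995442   # client: reply received

rtt = (final_ts - origin_ts) - (transmit_ts - receive_ts)
print(rtt)  # ~0.0055453, i.e. the fractional part of the reported value

Note that final-ts is actually earlier than transmit-ts, which is consistent
with the reported offset of -0.007347 (my probe's clock running slightly
behind the server's).
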
Does anyone have an explanation as to where the weird integer part might
come from? It is 4294967296, i.e. exactly 2^32, which makes me suspect some
overflow/underflow or other issue with doing decimal math on a binary
machine. (I am not sufficiently versed in computer science to fully
understand that topic, only that there are pitfalls in that area.)
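
To illustrate the kind of overflow I have in mind (purely a hypothetical
sketch, not the actual probe code): NTP timestamps carry their seconds in an
unsigned 32-bit field, so if the RTT were computed via the algebraically
equivalent form (receive - origin) + (final - transmit), and the negative
second term wrapped around modulo 2^32, the result would come out too large
by exactly 2^32:

origin_ts, receive_ts = 3939346062.989854, 3939346062.999974
transmit_ts, final_ts = 3939346063.0000167, 3939346062.995442

# final_ts < transmit_ts, so this difference is negative; treated as an
# unsigned 32-bit quantity it wraps around to just under 2**32.
wrapped = (final_ts - transmit_ts) % 2**32
rtt_buggy = (receive_ts - origin_ts) + wrapped
print(rtt_buggy)  # ~4294967296.00554 -- matching the reported outlier

But again, that is only a guess on my part.
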
Thanks!
Kind regards,
R.