I have a fairly tenuous grasp on this GPS thing, but I think it's enough to get a grip on the basic long-term relativity effects. The short-term stuff (changes in signal over times of less than a day) is over my head.
So, here's my grasp:
Special Relativity says that, due to the velocity of the satellites, clocks on the satellites lose 7.2 microseconds per day compared to ground clocks.
This is easy to calculate for a ground clock at a pole, but trickier (too tricky for me) for clocks off the pole, because the ground clock's speed in the ECI frame is anything up to 460 m/s depending on latitude, and the relative velocity of the ground clock and the satellite clock changes through each orbit and each day. This is where those transient effects come in and confuse me.
So, a ground clock at the South Pole has zero velocity in the ECI (Earth-Centered Inertial) frame, while the satellite's speed in the ECI frame is a pretty constant 3870 m/s.
That gives √(1 - v²/c²) = 0.9999999999168
Multiply that by the 86400 seconds in a day, and we find that SR predicts the satellite clock ticks 86399.9999928 seconds for each 86400-second day on the ground clock, i.e. 7.2 microseconds short.
This is not frame dependent. SR says that it's true in all frames.
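For anyone who wants to check that figure, here's a quick Python sketch using the round numbers above (the value of c here is my own approximation, not something official):

```python
# Velocity (SR) time dilation for the satellite clock, using the round
# numbers from this post; c is approximate.
c = 2.998e8     # speed of light, m/s
v = 3870.0      # satellite speed in the ECI frame, m/s

sr_factor = (1 - v**2 / c**2) ** 0.5    # √(1 - v²/c²)
print(sr_factor)                        # ≈ 0.9999999999168
print((1 - sr_factor) * 86400 * 1e6)    # ≈ 7.2 μs lost per ground-clock day
```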
The mind-picture I've constructed incorporates GR only as far as thinking about how deep each clock sits in Earth's gravity well. I wouldn't have thought the direction of the field would make a difference if the potential stayed the same, but that's not much more than a wild-ass guess... BUT... hang on a minute. This is difficult, because if I think about the problem from a GR perspective, the "gravitational field" experienced by the satellite constantly changes direction relative to the satellite's clock, and I'm not sure if or how that affects the timing.
So anyway, the easy calculation I make is to use the simple gravitational time dilation formula I found at HyperPhysics...
T = T₀/√(1 - 2GM/(Rc²))
...with the ground clock at R = 6.4×10⁶ m, and the satellite clock at R = 2.7×10⁷ m
I don't know how that formula is derived, so I can't be sure that it's appropriate, but I do know that it gives a result consistent with what is apparently measured in reality once the velocity dilation is included.
The gravitational time dilation formula says the satellite clock runs 46 microseconds per day faster than the ground clock, i.e. the satellite clock accumulates 86400.000046 seconds for each 86400 seconds passing on the ground clock. (I didn't actually just calculate that; I looked the figure up. But I have calculated it in the past.)
Again, this is not frame dependent.
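Here's the same sort of sketch for the gravitational side. GM is my rough figure for Earth's gravitational parameter, and with my round radii it comes out at about 45.7 rather than exactly 46, which is close enough for me:

```python
# Gravitational time dilation at the two radii, using the HyperPhysics
# formula above; GM is a rough value, not an exact constant.
GM = 3.986e14     # Earth's gravitational parameter, m³/s²
c = 2.998e8       # speed of light, m/s
R_ground = 6.4e6  # m
R_sat = 2.7e7     # m

def rate(R):
    """Proper-time rate factor √(1 - 2GM/(Rc²)) at radius R."""
    return (1 - 2 * GM / (R * c**2)) ** 0.5

gain = (rate(R_sat) - rate(R_ground)) * 86400 * 1e6
print(gain)   # ≈ 45.7 μs/day fast with these round numbers, i.e. the ~46 above
```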
Adding velocity time dilation and gravitational time dilation together, we find that the satellite clock should run 39 microseconds a day faster than the ground clock... a result that I'm led to believe matches the rate that GPS clocks are actually set to run in practice.
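Rather than adding the two daily offsets by hand, you can also multiply the two rate factors together; for factors this close to 1 it comes to the same thing. Another sketch with the same rough constants as above:

```python
# Net daily offset: multiply the velocity and gravitational rate factors
# instead of adding the two offsets. Same rough constants as above.
GM = 3.986e14     # m³/s²
c = 2.998e8       # m/s
v = 3870.0        # m/s
R_ground, R_sat = 6.4e6, 2.7e7   # m

def grav_rate(R):
    return (1 - 2 * GM / (R * c**2)) ** 0.5

net = (1 - v**2 / c**2) ** 0.5 * grav_rate(R_sat) / grav_rate(R_ground)
print((net - 1) * 86400 * 1e6)   # ≈ +38.5 μs/day, i.e. the ~39 above
```

(The figure I've seen quoted for the offset actually applied to GPS clocks is about 38.6 microseconds per day, so these round numbers land pretty close.)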
Like I said, it's fairly tenuous and ignores transient effects... but it's all I've got.
Edit:
I've just plugged in the numbers and confirmed that the gravitational time dilation formula does indeed predict that the satellite clock accumulates 86400.000046 seconds for each 86400 seconds passing on the ground clock.