James R said:
I think it is fundamentally important.
How? The only important thing is the conversion of the "A" clock's tick
rate to its real value locally at "B". This system does that. If you think
it doesn't, then let me suggest that while you may know Relativity, you
clearly do not understand electronics.
That is not a put-down, but a simple fact based on the evidence.
I dispute that it produces the same tick rate between
clocks.
I'm afraid you would have to show that not to be the case. It is
clearly an easy thing to do and to prove without Relativity being involved.
I have two clocks, neither in motion, but in addition to what I have
done in this system, after modulating the 1 MHz carrier at "A" I pass the
signal through a frequency converter, reduce its frequency to 229 kHz, and
transmit. I receive it at clock "B", I do the division, and I apply the
number 10 to the digital-to-frequency converter, which is running on "B's"
1 MHz carrier frequency.
Result? 1 tick per second.
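For concreteness, here is a rough sketch of the arithmetic in that loop as I
read it. The 100 kHz modulation tone, and reading "do the division" as
recovering the carrier-to-modulation ratio, are my assumptions; the post does
not spell them out. The sketch only shows that the recovered number is the
same whether the common shift comes from the frequency converter or from a
Doppler shift, and that the converter at "B" runs on "B's" own 1 MHz standard.

CARRIER_A_HZ  = 1_000_000   # A's local 1 MHz frequency standard
MODULATION_HZ = 100_000     # assumed modulation tone (carrier/modulation ratio = 10)
CARRIER_B_HZ  = 1_000_000   # B's local 1 MHz frequency standard

def transmit(shift_factor):
    # Carrier and modulation are scaled by the same factor, whether that
    # factor comes from the frequency converter (0.229 in the example)
    # or from a relativistic Doppler shift.
    return CARRIER_A_HZ * shift_factor, MODULATION_HZ * shift_factor

def recover_number(rx_carrier, rx_modulation):
    # B's divider: a ratio of two received frequencies survives any common scaling.
    return rx_carrier / rx_modulation

for shift in (1.0, 0.229):                  # clocks at rest vs. shifted signal
    rx_c, rx_m = transmit(shift)
    number = recover_number(rx_c, rx_m)     # 10 in both cases
    # The digital-to-frequency converter that turns this number into
    # 1 tick per second is clocked by CARRIER_B_HZ, i.e. by B's own
    # 1 MHz standard, not by anything in the received signal.
    print(f"shift {shift}: recovered number = {number:.0f}")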
It does not matter whether the frequency is shifted by electronics or by
Relativity; the process loop regenerates the actual signal as it is in
reality.
If you think the carrier frequency is ultimately unimportant,
then I think I can reproduce the information effect of your setup in all
respects with the following alternate setup:
1. Clock A transmits a message to B containing the digital number ten.
2. B sets his monitor of clock A to tick at 1 tick per second upon
receiving the number ten.
There. That is the entire content of your "calibration", as far as I
can tell.
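A minimal sketch of that two-step setup, just to make the information content
explicit: the only thing that crosses from A to B is the digital number ten,
and "one second" on the monitor is measured by B's own clock. The names and
the three-tick run are illustrative, not part of the original description.

import time

def clock_a_message():
    # Step 1: clock A transmits a message containing the digital number ten.
    return 10

def run_monitor_at_b(number, seconds=3):
    # Step 2: on receiving the number ten, B sets his monitor of clock A
    # to tick once per second. Note that "one second" here is measured
    # entirely by B's own clock: time.sleep() counts seconds on the
    # local (B-side) time base.
    assert number == 10
    for tick in range(1, seconds + 1):
        time.sleep(1.0)                    # one second of B's time
        print(f"monitor of A: tick {tick}")

run_monitor_at_b(clock_a_message())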
Great, you got it. Conclusion: Doppler shift and simultaneity shift by
Relativity are null and void. Clock "A" is shown to be "actually"
running at 1 tick per second and not at 0.229 ticks per second according to
Relativity.
While "B" may think "A" has slowed, its encoded data about its operation
clarifies the fact that it is actually unaffected in reality by "B's"
view.
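For reference, the 0.229 figure matches the relativistic Doppler factor for a
source receding at roughly 0.9c; I am assuming that is the velocity used
upthread.

from math import sqrt

beta = 0.9                               # assumed recession velocity, in units of c
doppler = sqrt((1 - beta) / (1 + beta))  # relativistic Doppler factor for recession
print(round(doppler, 3))                 # 0.229: a 1 MHz carrier is received as
                                         # ~229 kHz, and 1 tick/s as ~0.229 ticks/s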
But notice that B can only set his clock by referencing his own
clock. He knows that he has to set his monitor to 1 tick per second,
but how does he measure a second? He has no information from A on how long a second is. He only has his own clock to determine how long a second
is.
Absolutely, unless you are now going to argue that "B's" proper time
has changed? Of course not; he is still at 1 MHz, the common standard
between clocks. Relativity does not allow changes in the local proper time or
any other component of its physics. That is a hallmark of Relativity.
All physics in "B's" inertial frame is identical to the physics of "A's"
inertial frame, although there is relativistic velocity between them.
Let's reduce this argument to something easier to see. Let's forget all
about light signals, timing, frequency changes, ratios, codes, etc.
Let's simply calibrate two clocks and install on each a digital display which
shows its tick rate. Now, from deep space, moving at any velocity, I view
clock "A" with a high-power scope. What does "B" now see "A's" clock rate as?
If it still reads 1 tick/second (which it will), then its accumulated time
must be the same as it was at rest.
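To keep the two quantities in this thought experiment separate: the figure
shown on "A's" display, and the rate at which the scope at "B" receives
successive ticks of that display's image. A short sketch, again assuming
recession at 0.9c; which of the two deserves to be called "A's clock rate"
is exactly what is in dispute here.

from math import sqrt

beta = 0.9                                        # assumed recession velocity
doppler = sqrt((1 - beta) / (1 + beta))           # ~0.229

display_reading = 1.0                             # digits on A's display: 1 tick/s,
                                                  # A's locally measured rate
image_tick_rate_at_b = display_reading * doppler  # rate at which ticks of that image
                                                  # arrive at B's scope, per standard
                                                  # relativity (Doppler plus dilation)
print(display_reading, round(image_tick_rate_at_b, 3))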
So, using this setup, B has not really duplicated any feature of A's clock. He has made an assumption and produced a result based on B's own
clock.
Nice try, but no cigar. A and B have calibrated standards to each other.
Relativity does not allow changing that within each one's own inertial system.
All physics is identical in both frames. There are no assumptions, only
transfer of real information about the status of operations in reality.
Everything else is perception, since the physics didn't change and can't
change, even by the standards set by Relativity itself. That fact is indeed
a conflict within Relativity: it requires all inertial systems to remain equal
in terms of physics, but then demands that the physics of inertial systems
have changed.
The only rational conclusion is that such change is perception; otherwise it
violates the equality of physics in inertial systems. You just can't have
it both ways as reality. Since the equality of physics in any inertial system
can be and has been tested and found true, it is time dilation that is mere
perception. Both cannot be reality. It violates Relativity.
Your description with the modulated beam is no different from
this, in essence. It does not capture anything real about the rate at which
A's clock ticks relative to B's clock. It only captures information about how
fast A's clock ticks, as determined by A himself.
In other words, your setup does not compare the two clocks, as far as I can
see. You haven't provided any means of synchronisation which guarantees that
the clocks tick at the same rate, or that the monitors are calibrated to actual tick rates.
There are therefore two options open to you. Either:
(1) Adopt my method for calibrating the clocks, given above; or
(2) Come up with some alternative method which actually transmits
information about the clock rates from one clock to the other and which
can be used to measure their relative rates.
As I just explained, according to Relativity their rates MUST be the
same.
Physics is the same in all inertial frames. The physics cannot change
based on another frame's view. Your view of his physics, certainly, I
will accept; but not a change in the reality of the physics.
And no, you can't use your system, because it imposes relativistic data
upon the monitor that is in disagreement with "A's" actual tick rate and
serves no purpose other than to claim "artificially" that time has dilated.
Transfer of true information is in disagreement with that and is IN agreement
with the requirement that the physics remain unchanged.