MacM:
Your latest post is the same old waffle all over again. The crux of the matter is this:
You have shown no such failure of the procedure. It assumes B's local standard and applies the coded 10/1 ratio to that standard. Hence, upon transmitting 100,000 pulses to A's monitor counter, it displays 1 second of accumulated time.
...as measured by B's clock, which need have no relationship to A's tick rate.
Like I said.
For your view to be valid there would have to be no synchronization between the clocks.
Bingo! You've got it (I don't think).
Now, do you have anything new to add, or are you just going to keep repeating your incorrect assertion?
There's really no simpler way I can explain your failing to you.
In order to determine a clock's tick rate, it is necessary to observe that tick rate and measure it against a local clock. Your procedure completely fails to do that. What you do is simply set B's monitor of A's clock to some number at random, and pretend that your number is A's tick rate as compared with B's clock. Why, in your wildest dreams, you would think that is beyond me.
Ignore clocks in motion. Ignore relativity. Suppose I have two clocks. Suppose clock A is broken, so that it is ticking at 100 ticks per second instead of the correct 1 tick per second. A transmits what it thinks is a 1 MHz signal to clock B, sitting on the bench beside A. In other words, the signal A transmits has 1 million waves per tick of A's clock. Since A is running at 100 ticks per second, the actual frequency of A's wave is 100 MHz, but A thinks it is 1 MHz, because A measures this using his own incorrect clock.
A uses your method to encode the number "10" on his wave, as the ratio of the carrier frequency to the side-band frequency. His side-band modulation is therefore at 10 MHz actual frequency, but he thinks it is at 0.1 MHz.
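To make that arithmetic concrete, here is a minimal sketch in Python. The variable names are mine, purely for illustration, and the numbers are just the ones assumed in this scenario:

```python
# Minimal sketch of the arithmetic above. All names are made up for
# illustration; the numbers are the ones assumed in this scenario.

A_TICKS_PER_SECOND = 100       # A's faulty clock: 100 ticks per real second
WAVES_PER_TICK = 1_000_000     # A emits one million carrier waves per tick
ENCODED_NUMBER = 10            # the value A wants to send

# What A believes he is transmitting, measured against his own faulty clock
# (he treats each of his ticks as one second):
believed_carrier_hz = WAVES_PER_TICK * 1                      # 1 MHz
believed_sideband_hz = believed_carrier_hz / ENCODED_NUMBER   # 0.1 MHz

# What A actually transmits, measured against a correct clock:
actual_carrier_hz = WAVES_PER_TICK * A_TICKS_PER_SECOND       # 100 MHz
actual_sideband_hz = actual_carrier_hz / ENCODED_NUMBER       # 10 MHz

print(believed_carrier_hz, believed_sideband_hz)   # 1000000 100000.0
print(actual_carrier_hz, actual_sideband_hz)       # 100000000 10000000.0
```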
Now, B, sitting on the bench next to A, is assumed to have a local clock running at the correct rate of 1 tick per second, as measured by B. B receives A's signal and says "Aha! A has sent a signal at 100 MHz, modulated at 10 MHz. According to MacM's procedure, I must now divide 100 by 10 to get 10. Therefore, A has sent the message '10' to me! And I know '10' is MacM's secret signal which means I have to set my monitor of A's clock to 1 tick per second."
B then sets his monitor of A's clock to 1 tick per second, using B's own local clock, of course, as the only available standard for the setting.
B's monitor of A's clock now ticks at 1 tick per second. B's local clock now ticks at 1 tick per second.
According to MacM, B's monitor of A's clock now ticks at the same rate as A's local clock.
Wrong!
Because, as we know, A is actually ticking at 100 ticks per second, but using MacM's synchronisation procedure, B's monitor is only ticking at 1 tick per second.
It is obvious to anybody that B's monitor is in no way synchronised with A's local clock.
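The whole failure fits in a few lines. Here is a self-contained sketch of the exchange just described; again, the names and the "decoded 10 means set the monitor to 1 tick per second" rule are simply stand-ins for the procedure as I have described it above:

```python
# Self-contained sketch of the whole exchange described above.
# Names are illustrative only; the numbers are the ones used in this scenario.

A_TICKS_PER_SECOND = 100     # A's actual (faulty) rate
WAVES_PER_TICK = 1_000_000   # carrier waves A emits per tick of his own clock
ENCODED_NUMBER = 10          # the value A encodes as a carrier/side-band ratio

# The signal as it actually leaves A and arrives at B:
carrier_hz = WAVES_PER_TICK * A_TICKS_PER_SECOND      # 100 MHz
sideband_hz = carrier_hz / ENCODED_NUMBER             # 10 MHz

# B's decoding step: divide carrier frequency by side-band frequency.
decoded = carrier_hz / sideband_hz                    # 10.0

# "10" is taken as the instruction to set the monitor of A to 1 tick per
# second, measured against B's own (correct) local clock.
monitor_rate = 1 if decoded == 10 else None

print("B's monitor of A ticks at:", monitor_rate, "tick(s) per second")
print("A actually ticks at:      ", A_TICKS_PER_SECOND, "ticks per second")
# The monitor is wrong by a factor of 100, and nothing in the procedure can
# detect that, because A's true rate never enters the calculation: the
# carrier/side-band ratio is 10 no matter how fast or slow A is running.
```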
This doesn't depend on relativity.
This has nothing to do with relative motion.
This has everything to do with the fact that MacM has invented a stupid, non-working "synchronisation" procedure.
Now, take a slightly different situation. Assume that, when A and B are both at rest, they do, in fact, run at exactly the same rate of 1 tick per second. (This is the original scenario.) Now, this next part will be very difficult for you, MacM, since you have problems dealing with new information and alternative scenarios. You will argue and whinge about how what I am about to say can't be true, but it will be an argument based on nothing but your gut feeling that you just don't like it and can't imagine such a thing. Anyway...
Just suppose that, for some unknown reason, even though A and B are synchronised at rest, they in fact run at different rates once B is moving relative to A. Suppose, for the sake of argument, that A in fact runs 100 times faster than B when the clocks are in motion.
Does your method now produce a correct synchronisation? Does B's monitor display A's true local time? Of course not. Why? For exactly the same reason I explained above, when clock A was running at the wrong rate because it was faulty.
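If it helps, the same sketch as before makes the point: whatever factor you put in for A's rate relative to B, the decoded ratio is still 10, so B still sets his monitor to 1 tick per second against his own clock. (The factor of 100 below is only the illustrative value assumed in this scenario, not a claim about what relative motion actually does.)

```python
# Same arithmetic as the earlier sketch, with the rate difference now caused
# by motion rather than a fault. The value 100 is purely illustrative.

RATE_OF_A_RELATIVE_TO_B = 100      # hypothetical: A runs 100x faster than B
WAVES_PER_TICK = 1_000_000
ENCODED_NUMBER = 10

carrier_hz = WAVES_PER_TICK * RATE_OF_A_RELATIVE_TO_B
sideband_hz = carrier_hz / ENCODED_NUMBER
decoded = carrier_hz / sideband_hz     # still exactly 10.0

# Try RATE_OF_A_RELATIVE_TO_B = 2, 0.5, or 1000: 'decoded' is 10 every time.
# B learns nothing about A's actual tick rate from the signal, so the monitor
# he sets can only ever reflect B's own clock, never A's.
print(decoded)
```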
Whichever way you look at it, your method of setting up the monitors is hopelessly flawed.
Now, anything new to add?
Cue MacM (in whingy whiney voice):
"But my method uses relativity, and it shows that relativity is wrong."
"But clock A isn't broken, so clock B's monitor is synchronised."
"But I don't like it if my method is wrong, so it must be right."
"You haven't shown my method is wrong. I'm going to repeat my false claims again, and pretend I haven't read any of your posts."
"But relativity is wrong, so everything I say is right automatically."
"I can't understand your simple argument, so I will ignore it and assume I am right."
blah blah blah