BenTheMan,
Can you predict, from your model, how much the speed of light should change? How do you know what "not much" means if you don't actually have any numbers to back it up?
Here is what I posted earlier:
"Using just Newton's equation for gravity, I've found that the gravitational force on an object at the surface of the Earth is almost 2000 times stronger than the gravitational force of the Sun on that same object. Using this number, I found that the Sun's gravitational field should only cause the speed of light at the surface of the Earth to change by less than 20 m/s. This is much less than the 30 km/s (Earth's orbital speed) that was expected in the aether-detecting experiments."
From my above statements, I'm sure you can tell how I came up with the numbers that I did. Here is the data:
There are two gravitational fields influencing light at the surface of the Earth: the Earth's and the Sun's.
The gravitational force between an object on the surface of the Earth and the Earth itself is:
F1 = G * m(object) * m(Earth) / d(Earth)^2
where d(Earth) is the distance from the Earth's center (its radius).
The gravitational force between that same object and the Sun is:
F2 = G * m(object) * m(Sun) / d(Sun)^2
where d(Sun) is the distance from the Sun.
Now, to find how much stronger the Earth's pull on the object is compared to the Sun's, take the ratio (G and the object's mass cancel):
F1/F2 = (m(Earth) * d(Sun)^2) / (m(Sun) * d(Earth)^2)
If you do the calculation, you'll find that the Earth's gravitational effect on the object is 1655.15 times stronger than the Sun's effect on that same object. So at the surface of the Earth, two forces are influencing light:
- the Sun's field, relative strength 1, moving at 60,000 m/s relative to the object
- the Earth's field, relative strength 1655, moving at 0 m/s relative to the object
Hypothetically, if both of these forces were equal in strength, a photon at the surface of the Earth would be traveling at c + 30,000 m/s (as a vector) relative to both the Sun and the Earth. But since the Earth's effect on the light is 1655 times stronger than the Sun's, the difference in the speed of light in this case would be:
(v1 + v2)/2 * 1/1655 = (60,000 + 0)/2 / 1655 = 18.12 m/s
In summary, the speed of light measured at the surface of the Earth should be c(E) ± 18.12 m/s, where c(E) is the speed of light in the Earth's atmosphere.
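The summary arithmetic can be sketched the same way. The 60,000 m/s figure and the averaging step are taken from the argument above as-is; the masses and distances are standard values I've assumed:

```python
# Predicted variation in the measured speed of light, per the model above.
m_earth = 5.972e24   # kg
m_sun   = 1.989e30   # kg
d_earth = 6.371e6    # m (Earth's radius)
d_sun   = 1.496e11   # m (Earth-Sun distance)

ratio = (m_earth * d_sun**2) / (m_sun * d_earth**2)   # ~1655, as computed above

v1 = 60_000.0   # m/s, stated speed of the Sun's field relative to the object
v2 = 0.0        # m/s, speed of the Earth's field relative to the object
delta_c = (v1 + v2) / 2 / ratio
print(f"delta_c = {delta_c:.2f} m/s")   # ~18 m/s, far below the 30 km/s aether-drift scale
```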
Let me also point out that your request was a little unfair. You essentially asked me to calculate the variance in the speed of light while the light is being influenced by two gravitational fields of different strengths, moving at different speeds relative to each other. It's like requiring a person to mathematically describe the motion of an electron in a lithium atom in order to prove that electrons repel other electrons.
Here's a deal: you show me, mathematically, that your model isn't ruled out by the above experiments, and I'll make sure this thread goes back into Physics and Math.
I already showed you in the first experiment that the entire device, which in that case consists of two satellites and a ground station, is actually not moving relative to the Earth's gravitational field. As a result, according to my idea, since there is no movement of the device, the speed of light measured should be c (as it was). According to my model, the speed of light in a device would only change if the device was moving through, and moving relative to, a gravitational field.
As for the other experiment, my model does not cover all the effects of gravity on light. Besides gravity accelerating light to c, I believe it can also force photons to rotate, changing their paths. The larger distance covered by the photons along a curved path could, in turn, result in slower reactions. I really don't have it all worked out. After all, my theory is not complete. But then again, no theory is.