Observers

According to Wikipedia, the spooky action at a distance thing can't be an action at all:
The distance and timing of the measurements can be chosen so as to make the interval between the two measurements spacelike, hence, any causal effect connecting the events would have to travel faster than light.

According to the principles of special relativity, it is not possible for any information to travel between two such measuring events. It is not even possible to say which of the measurements came first. For two spacelike separated events x1 and x2 there are inertial frames in which x1 is first and others in which x2 is first. Therefore, the correlation between the two measurements cannot be explained as one measurement determining the other: different observers would disagree about the role of cause and effect.
If entanglement depends on simultaneous measurement (as in "speed of entanglement" experiments), then either SR is in trouble or simultaneity is.
 
exchemist said:
What I mean is that it seems obvious that the energy required to erase information will depend on the medium by which it is communicated.
Yes, I agree it seems obvious.
Are you familiar with Landauer's principle?
 
I wasn't but I've looked it up and it makes obvious thermodynamic sense.

First, it concerns the entropy change, not in the first instance the energy change. Second, it concerns only the theoretical minimum change; in other words, it applies only to an ideal system, not to a real one, just as the Carnot cycle is ideal and not real. My examples of a tombstone, radio waves and DNA were of course real ones.

In the Wiki article it states that a certain amount of energy is "required to erase" a bit of information, but that is not how I interpret it. What is being said, it seems to me, is that a certain amount of entropy increase occurs when erasure takes place. This does not mean any energy input is required to erase it, but that there is an entropy output - which will of course involve a variable amount of energy, depending on the temperature at which it is released.
 
I've thought some more about the last bit and I think I see what they mean, now.

If one takes the analogy of a melting crystal, in which order is lost and entropy increases, latent heat is taken in from the environment to bring about the phase change, partly to break molecule-molecule bonds (enthalpy increase), but also partly for the increase in entropy of the substance, as the molecules become able to explore more degrees of freedom in their new liquid state.
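As a rough worked example (my numbers, taking handbook values for ice, roughly 6.0 kJ/mol of latent heat at the 273 K melting point): at the melting point the two phases are in equilibrium, so the entropy of fusion is just

$$ \Delta S_{fus} = \frac{\Delta H_{fus}}{T_m} \approx \frac{6010\ \text{J/mol}}{273\ \text{K}} \approx 22\ \text{J/(mol·K)} $$

which shows directly why the heat drawn in for a given entropy increase depends on the temperature at which it happens.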

So what they mean is that the entropy increase will draw energy in from outside. However the amount of energy is not fixed, as it goes into an internal entropy increase. So it depends on the temperature. Does that sound right?
 
exchemist said:
So what they mean is that the entropy increase will draw energy in from outside. However the amount of energy is not fixed, as it goes into an internal entropy increase. So it depends on the temperature. Does that sound right?
Yeah, that sounds something like it.

From a paper by Charles H Bennett:
"
1. Landauer’s principle

In his classic paper, Rolf Landauer (1961) attempted to apply thermodynamic reasoning to digital computers. Paralleling the fruitful distinction in statistical physics between macroscopic and microscopic degrees of freedom, he noted that some of a computer’s degrees of freedom are used to encode the logical state of the computation, and these information bearing degrees of freedom (IBDF) are by design sufficiently robust that, within limits, the computer’s logical (i.e., digital) state evolves deterministically as a function of its initial value, regardless of small fluctuations or variations in the environment or in the computer’s other non-information bearing degrees of freedom (NIBDF).

While a computer as a whole (including its power supply and other parts of its environment), may be viewed as a closed system obeying reversible laws of motion (Hamiltonian or, more properly for a quantum system, unitary dynamics), Landauer noted that the logical state often evolves irreversibly, with two or more distinct logical states having a single logical successor. Therefore, because Hamiltonian/unitary dynamics conserves (fine-grained) entropy, the entropy decrease of the IBDF during a logically irreversible operation must be compensated by an equal or greater entropy increase in the NIBDF and environment.

This is Landauer’s principle. Typically the entropy increase takes the form of energy imported into the computer, converted to heat, and dissipated into the environment, but it need not be, since entropy can be exported in other ways, for example by randomizing configurational degrees of freedom in the environment. "
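For reference, the quantitative statement usually attached to this is a minimum entropy export of k_B ln 2 per erased bit, which, if it is dumped as heat at temperature T, costs at least

$$ Q_{min} = k_B T \ln 2 $$

per bit - consistent with Bennett's point that the export need not take the form of heat at all.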
 
Can we create energy by making matter travel @ c^2
Energy / mass can neither be created nor destroyed. It only changes in form. Einstein's equation demonstrates one way energy can change form, and it has been verified to work in either direction.

As for what the c^2 means:

http://www.pitt.edu/~jdnorton/teaching/HPS_0410/chapters/E=mcsquared/proof.html

"We now see where the two c's in c^2 = c x c come from. One comes from the equation relating energy to distance; the second comes from the equation relating momentum to time."

Or, to take the point further than John Norton's abbreviated relativistic derivation of E=mc^2:

Momentum is inertia (starting from rest) for energy that is bound or associated with mass.
Energy propagating at c is the equivalent of inertia for unbound energy without mass.

Inertia does not exist in either form of energy without time, and the conservation of mass/energy is all about the "temporal persistence" = "conservation" of both forms of energy. Hence, Einstein's formula is simply a restatement of the conservation of mass/energy.

Notice that no actual mechanism is provided by the formula to indicate how this transformation of energy gets from one form to another. To accomplish that, you need more than Newton or Einstein. You need hansda's instantaneous force. And you also need time itself not to be made proportional to, or otherwise based on the invariant speed of light or any other velocity.

Notice also that what you don't need in order for time itself to be "faster" / "slower" than light is an observer, unless it is to verify that the entangled photons have arrived. Not everything that is entangled is going to be something that is observable directly, relativistically, or otherwise, so you might as well get used to forgetting about observers and simultaneity altogether. But you can see particles that have mass. All the time. They are real, and they are persistent. This is chiefly what makes me believe that c^2 has another level of meaning here. Entanglement provides the mechanism for energy in any form to persist over time. The two forms of energy are "mixed" (interact with each other), and "mixing" in telecommunications means a product, exactly analogous to multiplication, hence the term: c x c.
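(In the telecommunications sense, at least, "mixing" really is just multiplication: a mixer fed with two tones produces

$$ \cos(\omega_1 t)\,\cos(\omega_2 t) = \tfrac{1}{2}\big[\cos((\omega_1-\omega_2)t) + \cos((\omega_1+\omega_2)t)\big] $$

the sum and difference frequencies, which is the "product" being invoked in the c x c analogy above.)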

What a breath of fresh, unadulterated air it is to do physics with only Newton and Einstein. No Minkowski in sight.

John Norton's excellent derivation was the content of my very first post to sciforums, or pretty close to it.
 

This Norton baby derivation is pretty goofy. First equation is bad, very bad for most of the scenarios, so give it a rest if it was your first post at SF.
 
Maybe I can say something about a computer's non-information bearing degrees of freedom.

In a digital computer, there isn't much difference, especially nowadays, between keeping a capacitor charged and keeping a transistor switched on.
So a bit value is equivalent to an amount of charge. (sounds obvious)

A capacitor's degrees of freedom to be charged, or not charged, then, represent the (internal) information bearing d.o.f.'s of a computer. However, this on-off state is approximate: usually a capacitor is read as charged when above about 2/3 of full charge and as discharged when below about 1/3. The degrees of freedom are just this 1/3 difference in charge.
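A toy sketch of that read-out rule (my own illustration, with the charge level normalised so 1.0 means fully charged):

```python
def read_bit(level):
    """Classify a capacitor's normalised charge level as a stored bit.

    Thresholds follow the ~1/3 and ~2/3 levels described above; the middle
    band is deliberately left undefined, which is what makes the stored
    value robust against small leakage and noise.
    """
    if level > 2 / 3:
        return 1        # read as logic 1
    if level < 1 / 3:
        return 0        # read as logic 0
    return None         # forbidden middle region: indeterminate

# A cell that has leaked from 1.0 down to 0.4 no longer reads reliably:
print(read_bit(0.9), read_bit(0.1), read_bit(0.4))   # -> 1 0 None
```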

What about "randomizing configurational degrees of freedom" in the environment (of the capacitor)?
 

Basically what you are saying is to measure the charged state of a capacitor, say from 1/3 to 2/3 (or 0 to 1). For that you need an analog-to-digital converter. How does this randomization help in enhancing the present system? A 16-bit A-to-D converter is far more complex and resource-hungry than 16 straight digital lines. And I am not talking about any meaningful assignment of information to the various charge levels of a capacitor.
 
The God said:
Basically what you are saying is to measure the charged state of a capacitor, say from 1/3 to 2/3 (or 0 to 1). For that you need Analog to Digital convertor.
No, that isn't what I'm saying and you don't build A/D into circuits with capacitors in them.
What you do is build circuits that have this "voltage basis" for capacitor charge, of course. You could also build a circuit that works with 0 = 0 charge, and 1 = fully charged. But it turns out to be much easier to make the difference of 1/3 of full capacitance the 0,1 basis.

If you study electronics, you find out why.
(ed. yes I have studied electronics to 3rd year as part of a B.Sc, so there.)
 

But if you have to use various states between 1/3 and 2/3 (or, still better, 0 and 1) in a digital computer, then you have to convert this charge level into a readable control voltage through some appropriate circuit and then use an A-to-D converter. How is this helpful? You are suggesting that a capacitor can offer multiple states depending on its charge level; the question is how that is more efficient.
 
I don't follow the last bit of that Bennett passage, actually - the suggestion that the entropy can instead be exported "by randomizing configurational degrees of freedom in the environment". I'd have expected the entropy increase to be within the medium carrying the information, rather than exported to the environment. With a melting crystal, the entropy increase is an internal increase, within the substance, due to greater degrees of freedom. I'd have expected the same to apply here.
 
A hint. Landauer's principle remains controversial.

As recently as last year a research group published results in Nature Comms. claiming to have broken the $$ k_B T \ln 2 $$ limit with a micromechanical device. Last month a paper refuting this was published.
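For scale, at room temperature that limit per bit is tiny:

$$ k_B T \ln 2 \approx (1.38\times10^{-23}\ \text{J/K})(300\ \text{K})(0.693) \approx 2.9\times10^{-21}\ \text{J} \approx 0.018\ \text{eV} $$

so the experiments involved are necessarily extremely delicate.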

And so it goes. On the one side information is physical because erasing it costs energy. On the other information isn't physical because there is no cost (??)
 
I suppose the underlying question is whether what we humans call "information" is really associated with fewer degrees of freedom at the microscopic statistical level. It is not clear to me that it would necessarily be a more "ordered" state in strict, statistical thermodynamic terms, though I can see the handwaving argument for that to be so.

I suppose in theory if one were to have a "leaky", i.e. unstable, computer memory filled with data and then one were to allow it to lose the data, the memory should get colder. Just as some materials get colder when they dissolve, due to the absorption of heat from the environment required to satisfy the internal entropy increase.
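As a rough order-of-magnitude check (my numbers: a gigabyte-scale memory, so around 10^10 bits, at room temperature, using the k_B T ln 2 figure from above):

$$ \sim 10^{10}\ \text{bits} \times 3\times10^{-21}\ \text{J/bit} \approx 3\times10^{-11}\ \text{J} $$

so any such cooling would be far too small to ever measure in practice.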

The problem I see in this is that, to me, the extra degrees of freedom that would result in increased entropy imply a freedom for the memory cells to flip-flop spontaneously between 0 and 1 states randomly, due to thermal effects. To put it the other way round, the lower entropy of the pattern of binary states that we call information results from that pattern being fixed, i.e. a lack of freedom to randomize. So if thermal energy is not enough to cause spontaneous randomising, then those degrees of freedom would be inactive and would not, I should have thought, contribute to an entropy increase.
 
This Norton baby derivation is pretty goofy. First equation is bad, very bad for most of the scenarios, so give it a rest if it was your first post at SF.
Read Norton's thought experiment carefully. If a projectile having mass is already at a relative velocity (relative to YOU, at rest) close to c, then it is not possible for you to add more force (in the direction of relative motion) and expect that it will move faster than c. His 'goofy' math all follows from that simple premise.
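In symbols (standard special relativity, force applied along the direction of motion): the momentum is p = γmv, so a constant force gives

$$ F = \frac{dp}{dt} = \gamma^3 m \frac{dv}{dt} \qquad \Rightarrow \qquad \frac{dv}{dt} = \frac{F}{\gamma^3 m} \to 0 \ \text{as}\ v \to c $$

which is the "can't push it past c" premise written out.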

His 'goofy' math isn't goofy at all. What is 'goofy' is when a mathematician looks at the same scenario and thinks: this is happening in inertialess space, so I can use a hammer and an iron stake to drive into inertialess space, right HERE, and create an origin in my geometric mind for doing solid geometry. Then he decides that the speed of light is the key to time, and you know the rest: 4D intervals, hyperbolic rotations of time into space, photons that cannot propagate because for them, time has stopped. Yep. Goofy. Just like Pythagoras' ancient Greek corncob pipe. Greeks didn't grow corn. Must have smoked another pipe.

Boost matrices are the "goofy" result of a mathematician's attempt at doing physics, like all they are interested in is the computational process. That's goofy. If they work at all, that's because of LORENTZ, not Minkowski. You'll find none of Minkowski's nonsense in the original derivations of the Lorentz transformations. A roadbed with sufficient inertia, gifted to it by Newton, is really all you need.
 
This I can understand!! Thank you Dan!
This is the best illustration I know of that E=mc^2 is a gift direct from Isaac Newton, and has nothing whatsoever to do with Einstein's former calculus teacher, who was so enamoured with quadratic equations, he saw conic sections and light cones (which are useless for understanding physics) everywhere he looked. This was not a vision. It was a delusion born of dementia. There is a difference, at least the way I am observing it.
 
The last straw for me. Futile to continue battling against grievously wrong misinfo, only to have it rear up endlessly, with no evident sign of any responsible mod intervention. Left entirely to rank and file pleb members to sort out. Except, that is, in the political/ideologically based subforums, where there is plenty of blatantly partisan intervention. Such as summary life bans with nothing more given than 'Rules', and often not even a name for who issued the ban. One gets the message, and for yours truly that message is: take a long break. Concentrate on things that really matter. Adios amigos.
 
Sorry you feel that way. Must have hit a nerve.

I'd ask for specific objections, but I don't see any in your response.

These might be "plebes", but I just taught them to derive E=mc^2, and their mathematics instructors never did that. Mine did.

Feel free to place me on your ignore list, Q-reeus. I will miss you.
 