Supernova From Experimentation At Fermilab

...Thus the forces of nature would intrude into our subjective world. A breach in the potential barrier towards de Sitter space may be accomplished only once for each planetary system. ...
Why is it so limited? Why not "once in each universe"?
I.e. once the barrier is breached, why doesn't the extremely high energy density in de Sitter space continue to "pour thru" the breach until the energy densities in our universe and de Sitter space become the same?

For example, by way of analogy, if the dike keeping the ocean out of Holland is breached and not repaired, the water level in Holland will equilibrate with that of the ocean. You claim this breach has happened many times already (all the "Type Ia Supernovas" we have observed and many more we have not seen).

Who or What is the promptly acting "Breach Repairman"? - This is just one more reason why I can not share your concern.

Another reason why I can not take you seriously is you always simply ignore sincere questions.
For example you have yet to comment on the argument that you can not sum the individual nuclear collisions in the colliding beams because each is over in a time very short compared to the time when the next collision occurs. To make this point clear, I used my imaginary "time magnifier": thru it, each collision appeared to last one second, and the next collision occurred months, if not years, later.
How can they be considered as if their total energy were simultaneously released?
Again the energy released in the higher energy primary cosmic ray events is many orders of magnitude greater than in each of the accelerator produced collisions and cosmic ray events have not caused your feared "breach."
Why should a sequence of much punier events, widely separated in time, do so?
See posts 1067, 1111, 1112, 1113 for more details / discussion of this t << T argument. ("t" is the duration of each collision and "T" is the typical time between collisions in the colliding beams.)
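Billy T's claim that the most energetic cosmic-ray events dwarf accelerator collisions "by many orders of magnitude" can be sketched with back-of-envelope relativistic kinematics. The numbers below (a ~3 x 10^20 eV cosmic ray, a ~0.98 TeV Tevatron beam) are illustrative assumptions by this editor, not figures from the thread:

```python
import math

# Rough comparison of collision energies (assumed figures, not from the post).
E_cosmic = 3e20   # eV, lab energy of the most energetic cosmic rays ever observed
m_p = 0.938e9     # eV, proton rest energy (m*c^2)
E_beam = 0.98e12  # eV, one Tevatron beam (~0.98 TeV)

# Fixed-target collision (cosmic ray on an atmospheric nucleon):
# sqrt(s) ~ sqrt(2 * E_lab * m_target)
sqrt_s_cosmic = math.sqrt(2 * E_cosmic * m_p)

# Collider: the two beam energies add directly in the center of mass.
sqrt_s_tevatron = 2 * E_beam

print(f"cosmic-ray sqrt(s): {sqrt_s_cosmic:.2e} eV")   # ~7.5e14 eV (~750 TeV)
print(f"Tevatron   sqrt(s): {sqrt_s_tevatron:.2e} eV")  # ~2e12 eV (~2 TeV)
print(f"ratio: {sqrt_s_cosmic / sqrt_s_tevatron:.0f}x")
```

Even comparing center-of-mass energies (the fair comparison for a collider), the extreme cosmic-ray events come out hundreds of times larger; in raw lab-frame energy the gap is about eight orders of magnitude.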
 

Not to be concerned, Billy. If this guy ACTUALLY believes what he's saying, he's a genuine crackpot bar none! As the science keeps advancing, he keeps raising the "trigger point", because we've already passed all his previous danger levels and absolutely nothing at all happened.

If I understand correctly, the ONLY reason he's allowed to keep posting his nonsense here is that he's a financial contributor to this site.
 
Yes. I went back and read his first post and commented about a year ago on the fact that he keeps "moving the danger-point goal posts" to always be just a little above the level currently achieved. These moves are not small, but many orders of magnitude, as the progress in energy density achieved in accelerators has also been many orders of magnitude.

I do not object to this being by far the longest thread, and if he is helping keep this site functional - congratulations to him.

It does seem to require a certain amount of "moral compromise" with the rules as to what gets moved to the pseudo-science forum, but:
Hey, the world is far from perfect and I am a realist about this.
 

Agreed. I suppose it does no harm other than wasting space on the server's hard drive(s). :shrug:
 
BillyT:

I believe I understand his hypothesis, and I originally had the same incorrect understanding as you expressed above.

If I understand his hypothesis correctly, the "breach" does not propagate outward [at the speed of light, or otherwise], as I had originally assumed. Instead, it simply quickly "self-repairs" [by unknown 'mechanism'], but allows a HUGE amount of energy into the universe at the point of the "breach" [into our false vacuum]. It is that energy that is the 'driving force' of the supernova. Apparently, under his hypothesis, every type 1(a) supernova derives the great bulk of its energy not from the energy of the collapse of the star; instead, the collapse of the star triggers a "breach" which releases a relatively uniform amount of energy with each such breach. Under that hypothesis, type 1(a) supernovae would be exceptionally uniform in brightness, as the brightness would not depend on the original size of the collapsing star; the event of a collapsing star would always trigger such a "breach" and a relatively constant amount of "intrusional energy" from it. This would make them an excellent "standard candle" for estimation of astronomical distances. Rather a novel concept, anyway, though I cannot attest to whether such would occur or not.

As to your "time magnifier", I believe your arithmetic is likely faulty. Each bunch of millions of atoms colliding does so in the space of about 10 cm, or about 1E-11 seconds [travelling at about 0.9999+c]. As I have indicated before, I cannot come up with any 'mechanism' as to how millions of collisions could be summed up, even if occurring in a relatively short period of time. However, I cannot disprove it either, under the conditions of his operating hypothesis.

As to the "goal post" continually moving, he's never stated at what energy density such a "breach" would occur. Rather, he's always stated that we've thus far been below that level, but are inching closer. Who knows at what energy density such a "breach" would occur, or even if such a "breach" can occur? Certainly we don't have millions of cosmic rays all striking the same small [1 mm by 1 mm] region at nearly the same time [to within about 1E-12 seconds] as in colliders.
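Walter's bunch-crossing figure above can be sanity-checked with a one-line estimate. The 10 cm lab-frame bunch length is taken from his post at face value; the result suggests the crossing lasts a few times 10^-10 s rather than 1E-11 s, though this is a rough sketch, not a measured collider value:

```python
# Sanity check of the bunch-crossing time, taking the 10 cm bunch length
# quoted above at face value (an assumption, not a measured value).
c = 2.998e8          # m/s, speed of light
bunch_length = 0.10  # m, assumed lab-frame bunch length

# Two counter-moving bunches sweep through each other: closing speed ~2c,
# but each must traverse roughly two bunch lengths, so the overlap lasts
# on the order of one bunch length divided by c.
t_cross = bunch_length / c
print(f"bunch-crossing time: {t_cross:.1e} s")  # ~3.3e-10 s
```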

In my view, it certainly needs to be investigated more. I understand that he's spoken with numerous Fermilab physicists, who've told him they recognize such a risk, but that they 'believe' that the risk is not until a much higher energy density. That's a nice belief on their part, but I really don't place much faith in that belief either.

Paul's comments as to whether I've correctly summed up his "argument" would be appreciated.

Regards,

Walter
 
This thread is kept alive only because it's in contention to win the Guinness record for 'longest running internet thread.'
 

Best of luck, Walter. He never responds to anyone, regardless of whether their comments/questions are positive or negative. He's not interested in discussion, just talking.
 
...If I understand his hypothesis correctly, the "breach" ... simply quickly "self-repairs" [by unknown 'mechanism'], but allows a HUGE amount of energy into the universe at the point of the "breach" ... It is that energy that is the 'driving force' of .... every type 1(a) supernova ... releases a relatively uniform amount of energy with each such breach.

Under that hypothesis, type 1(a) supernovae would be exceptionally uniform in brightness, ... an excellent "standard candle" for estimation of astronomical distances.
With enough unsupported assumptions, an elephant can fly unobserved. Physics is not built on many convenient assumptions when there is neither evidence for them nor even a rational explanation as to how they might be possible.
...As to your "time magnifier", I believe your arithmetic is likely faulty. Each bunch of millions of atoms colliding does so in the space of about 10 cm, or about 1E-11 seconds [travelling at about 0.9999+c]. As I have indicated before, I cannot come up with any 'mechanism' as to how millions of collisions could be summed up, even if occurring in a relatively short period of time. ...
I made an estimate of the duration of a collision, my "t", by considering that the two nuclei were closing on each other (rarely an exact "head on" collision, as nuclei are so small) at the speed of light. Thus "t" is about the nuclear diameter divided by c - much smaller than you seem to be estimating. What is wrong with my approach to the estimation of "t"? I strongly suspect that most (>99%?) of the nuclei in the two colliding bunches do not make any collision,* so you can not just multiply the energy released in each individual collision by the number in the bunches. I think you and he are doing this, erroneously. (Now I have told you what is wrong with your and his estimate of the total energy. Please tell me what is wrong with my approach to estimating "t".) Regards, Bill

Also I am surprised that the bunches are only 10 cm long. Is this in the lab frame? (I.e. after the relativistic contraction is applied.) How long are they in their own frame? I think the time between collisions, my "T", should be evaluated in their own frame, but I am not sure of this. If that is true and in their own frame the bunch is a meter long, then "T" would be ten times larger than if 10 cm is assumed for its estimation. Note that this factor (10?) and the "fraction actually colliding" (1%?) are both factors making T larger (by 1000 times if these guesses are correct). I.e. I do not agree that my t << T is significantly in error.
----------------------
*This fraction must be reasonably well known, but I know little of this field.
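The t << T argument above can be made concrete with numbers. The nuclear diameter, bunch length, and collisions-per-crossing used below are illustrative guesses by this editor, not measured collider parameters:

```python
# Illustrative estimate of Billy T's "t << T" argument, with assumed
# (not measured) numbers for the collider parameters.
c = 2.998e8         # m/s, speed of light
d_nucleus = 14e-15  # m, rough diameter of a heavy nucleus (~14 fm for gold)
bunch_length = 0.10 # m, assumed lab-frame bunch length
n_collisions = 100  # assumed actual nucleus-nucleus collisions per crossing

# t: duration of one collision ~ nuclear diameter / closing speed (~c).
t = d_nucleus / c

# T: the collisions are spread over the bunch-crossing time, so the
# typical gap between successive collisions is roughly crossing / count.
T = (bunch_length / c) / n_collisions

print(f"t = {t:.1e} s, T = {T:.1e} s, T/t = {T/t:.1e}")
```

Under these assumptions each collision is over some ten orders of magnitude before the next one begins, which is the quantitative content of the t << T claim.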
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

Where the collisional energies at Fermilab are now going from 10^-9 to 10^-14 seconds subsequent to the Big Bang at the point origin of the Universe, and the general view of modern physics is that there is a large but not infinite potential barrier towards de Sitter space, then from an empirical, i.e., experimental viewpoint, we are testing the theoretical predictions of general relativity in the Einstein de Sitter Universe, as it is now termed. Given the 100% predictive efficacy of general relativity now observed, we should request a moratorium in highest energy physics experimentation at this time.

This may be thought of as the better mousetrap. Should we fail to grasp the dangers inherent in highest-energy physics experimentation, we then create a Type Ia Supernova. As a species we have failed the basic test of survival, along with some 99 percent of all species once extant on our planet.

All the children will thank you for your kind actions on their behalf.

Every Best Wish,

Yours sincerely.

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
Paul - You are like fine wine - getting better with age. Your posts are much less redundant and worth reading now.

I do not know much about high-energy physics, but think one of the main motivations for the efforts to achieve higher energies is to produce an observable "Higgs boson", and that it somehow helps give "mass" to matter.

This "Where does mass (in the universe) come from?" is perhaps the deepest mystery known. More crudely, it is: "Where did it all come from?" (Even the now dead "steady state" model of the universe was forced to mysteriously postulate mass springing into existence "out of nothing", in violation of conservation of energy, to explain the observation that the universe is expanding. BTW that has gotten to be a worse problem with the acceleration of that expansion, now forcing the postulate of "dark energy.")

Clearly mankind does not understand some very deep questions about the origins of the universe's substance or, stated in the modern POV, what was going on in the sub-nanoseconds after the big bang's T = 0. Your concern could be correct at energy densities (in both space and TIME) higher than those cosmic rays have demonstrated do not "punch thru" to the de Sitter space energy (if it really exists - of course they can't, if it does not).

Despite this, I can sort of join your call for a "moratorium in highest energy physics", but only AFTER the full capacity of the current accelerators is reached (still a much lower energy density than is safely exhibited by the more energetic cosmic rays' first collision with a nucleus (of oxygen, for example) in the high atmosphere). I do so as the expense of building an accelerator that has significantly more capacity than the current generation would be an "economic supernova."

Many physicists are happy, even employed, to dream (with no supporting evidence or even potential experiments to get it) about "strings" etc. Let's not spend any more on these huge (in every sense of "huge") machines, at least for several hundred years. I do not know if you have read any of my many "black cloud is approaching" economic threads, but IMHO much of mankind is in for a very rough time (especially in the US and EU, where the economies, flying high on borrowed money, have a long way to fall). If you have, you may better understand why I say NO MORE BIG accelerators.

I remain confident that nature has done much higher energy density experiments than man can, as is almost always the case in everything. For example: Nature has made and operates a stable fusion reactor only 1 AU from where I sit now. Nature also made the first fission reactors, in an African uranium ore body, water-moderated, with a stable fission rate controlled by the negative thermal coefficient as the ground water seeping in was converted to steam. (It ran longer than I bet advanced civilization will last. - We may agree, for different reasons, that "big brains" were/are one of nature's evolutionary mistakes, but I doubt their capacity for high energy physics has much to do with the nature of this mistake.)
 
Just wondering, what's the record at the moment? Is this the longest thread in sciforums?
No irony or ill intentions toward the posts, obviously.
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

One of the classic questions regarding the presence of aliens in our Universe is "Where are they?", since over cosmological time they must have been able to visit us with their greater advances in technology developed over vast stretches of time. It may well be that we are viewing the answer to the classic question, as we now in our colliders are producing energies nearly equal to the "Big Bang" at the point origin of our universe. Thus we are testing the Relativistic Cosmology of Albert Einstein and Willem de Sitter. In their theory, only a large potential barrier prevents the entrance of an exploding universe into our continuum (1). This intrusional event would be generative of Type Ia supernova. The multitude of species envisioned by some authors may sometimes achieve intelligence and then in our universe, much like a better mousetrap, are rewarded by a vast explosion destroying their solar system and a host of nearby stars. In this way they offer another confirmation of the well-established relativistic physics of Albert Einstein and his co-workers. Type Ia supernovae are used as standard candles due to their great similarity in size and lack of evidence for hydrogen at maximum light.

So far in some 30 years of presenting this to the world of science, there has been no refutation of this thesis.

All the children will thank you for your kind efforts on their behalf.

All Best Wishes,

Yours sincerely,

Paul W. Dixon, Ph.D
Supernova from Experimentation

1. Perry, M. J. (1986) Quantum tunnelling towards an exploding
Universe? Nature, Vol 320, p. 679.
 
Paul is like an immovable object against an irresistible force.

His points are still valid, as we do not know what will happen with such a concentration of energy, and that is why it is an irresistible idea to experiment.

And so the debate continues, until nothing happens after the experiment or we won't be here to argue about it. At least some journalists are reading the thread and bringing up these questions, which will make the scientists extra careful.

There have been movies about it, television series like Odyssey and so on.
 
Ok, so then I guess we are looking at some background destruction at these labs.
It seems quite clear that the core of the earth and the magnetic field are actually receiving this SOS / Morse code produced by these accelerators.

Seems dangerous. What should I look at, Paul? I know I should read some of the topic to pick up a little, but what is the specific reaction I need to look at?
And who would you like me to contact?
DwayneD.L.Rabon
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

Please contact your elected official and indicate your concern in this regard.

May we very respectfully request that these transitions be modeled via computer simulation before the Tevatron at the Fermi National Accelerator Laboratory continues with any further experimentation at this time. To avoid any bias in understanding, publication of these results in a peer-reviewed journal of highest repute is most respectfully recommended to members of the staff at Fermilab.

It may be helpful to clarify the philosophical position and astrophysical energetics intrinsic to de Sitter space in the standard cosmological model in this postulation of transition from de Sitter space as generative of supernova in high-energy physics experimentation.

A philosophical position may be cited from G. W. F. Hegel (The philosophy of history, New York: Dover, 249, 1956): "... there is no essential existence which does not manifest itself." The very large energies derived by Willem de Sitter for the equations describing the false vacuum of de Sitter space yield an energy density of 1.69 x 10^126 eV (electron volts) per cm^3 (Gott, R. (1982) Creation of open universes from de Sitter space, Nature, 295, 304-307). In Waldrop, M. M. (1982) Bubbles upon the river of time, Science, 215, 4536, 1082-1083, the energy density of de Sitter space is given as 5 x 10^31 kelvin and 3 x 10^93 grams per cm^3, converted to eV via E = mc^2, which is Albert Einstein's famous equation. This energy would then find expression in the observable universe. In the sense of this analysis, it would be quite unlikely that energies of this order of magnitude would remain hidden should a transition be formed in the potential barrier towards de Sitter space. This would serve as an immediate and ever present danger for the investigator and constitutes a public endangerment as well.
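The unit conversion quoted above can be checked directly. A minimal sketch, assuming only the 3 x 10^93 g/cm^3 mass density from the cited sources, does reproduce the ~1.69 x 10^126 eV/cm^3 figure:

```python
# Check the quoted conversion of the de Sitter energy density from
# mass units to eV via E = m c^2.
c = 2.998e8      # m/s, speed of light
eV = 1.602e-19   # J per electron volt

rho_mass = 3e93  # g/cm^3, quoted mass density of de Sitter space
rho_energy_J = (rho_mass / 1000) * c**2  # J/cm^3 (grams -> kg, then E = m c^2)
rho_energy_eV = rho_energy_J / eV        # eV/cm^3

print(f"{rho_energy_eV:.2e} eV/cm^3")  # ~1.68e126, matching the quoted value
```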

This is based on the mainstream theory of universe formation by Professor R. Gott of Princeton University, in which each bubble universe forms smoothly out of de Sitter space. A potentially infinite number of universes may form in de Sitter space. In a topological sense, de Sitter space is cobordant at each point with the continuum (our universe). De Sitter space is then prevented by a large potential barrier from forming an intrusional event into the continuum. The essential hypothesis of this formulation is that with sufficiently great energetics, a classical breach in the potential barrier towards de Sitter space will be formed, thus releasing the force of a Type Ia supernova upon the terrestrial ecosphere, the solar system and those nearby stars. These energies are from de Sitter space; therefore, the energies of the accelerator only serve as a trigger for their release.

With sufficient energies, under this postulation, we discover that the accelerator is in the Einstein de Sitter universe, as it is now termed, and we have gone from particle physics as our governing theory to relativistic cosmology.

No harm will result from computer modelling of this alternate hypothesis for the generation of Type Ia supernovae as a result of the formation of a transition towards de Sitter space. Yet clearly, vast harm may result from our continuing to plunge into the unknown without proper foresight concerning this possibility.

All the children will thank you for your kind actions on their behalf.

Every Best Wish!

Yours sincerely,

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
''I think Paul is concerned with the concentration of large energy in a small volume, not about the total energies employed. I've heard people suggest that such cramming of energy might produce a femtoscale black hole, which could accidentally absorb some nearby particles and grow larger before it can evaporate, and continue absorbing surrounding matter/energy and growing until it consumes the Earth.''

And you are thinking too statically. Presume (as I do) that we use the energies deployed at Fermilab and did create a black hole - (let's say, for example, the size of a marble) - it would move out of its confinement and, due to the gravitational force of the earth and the centrifugal forces, rush towards the center of the earth! By this time, we might as well begin to say goodbye to earth. It would begin to eat the earth - much like what is presumed in some cores of neutron stars... More destructively, it could spurt out energy (as presumed from certain types of black holes).

Reiku

P.S... We need to stop these high energy products in particle accelerators. Hawking now wants to create a baby black hole in the LHC... He is off his head, as we still do not know whether some large black holes could be horselike. Hawking is nothing but a brain in a jar!
 
And there you have it in a nutshell... time to write an email campaign... for computer simulation rather than an actual megabomb-like event... I am for it.
 