Supernova From Experimentation At Fermilab

SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

Of further concern in this regard is the possibility that de Sitter space, as a primordial aspect of the cosmos, fluctuates in its energy level. Thus the energetics at the Fermi National Accelerator Laboratory, while safe at time A, may not be safe at time B. This same point was raised by Richard P. Feynman in his review of the Last Flight of the Challenger. Thus we note:

William E. Burrows, p. 558

"The real tragedy of the Challenger disaster was that it was entirely
preventable. Most of the problems with the joints were known about years
before the accident, some were even recognized before the first shuttle
flight. (There were also a number of problems with the main rocket
motors, but most of these were solved before the fatal flight. (Feynman
p.28)) But fixing the problems would have meant even more delays in a
program that was already behind schedule. Instead of fixing the problems
as they were found, NASA management rationalized that if the last flight
succeeded then the next one would too. NASA apparently violated both
industry rules and their own safety rules (Vaughn p. 33). People were
assigned to solve the problems, but this activity was given a low
priority. In the meantime the engineers who worked on the shuttles were
crying for help (Feynman p. 185). And then there was the famous decision
not to delay the launch of the shuttle, in spite of the objections of a
number of engineers. This decision, again, was apparently made because
the launches were way behind schedule.


The real cause of the accident was not a lack of "lucky putty" but a lack
of concern for safety."

William E. Burrows, This New Ocean: The Story of the First Space Age, The
Modern Library, 1998

We need, therefore, to consult the wisdom of Professor Richard P. Feynman, Nobel Prize winner in Physics, and call a halt to further experimentation at Fermilab, CERN, and Brookhaven before we make the same fatal error that took the lives of the astronauts on the Last Flight of the Challenger. The determination of the energy threshold towards de Sitter space over time should be carefully measured, or all is lost.

Wishing one and all -

A Very Merry Christmas and a Most Happy New Year !!!

Yours sincerely,

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
Dr. Wagner

Can I speculate something here?

Instead of antiphotons being the product of these supernova blasts, why couldn't it be a tunneling energy from the ZPE, instead of the Einstein-de Sitter space?
 
Antiphotons. Yes. And has anyone grasped the obvious answer to expanding space?

Antigravitons, of course. Reiku, what do you think of this idea?
 
Superluminal

I agree; in fact, I have a little theory stirring in my mind. Some supernovae blast outwards, which might be a compression of a superdense clashing of monopoles, releasing the negative energy... We have even found black holes releasing such blasts, and this might be highly related...

Now antigravitons could never clash against each other, because they are so very weak; it's like trying to hit two photons together without a mediator, such as an atom... So yes... Antigravitons, IF produced in greater quantity than the elusive gravitons we see today, could indeed be causing the acceleration.

As for your question, there is something like 1 antiphoton (or photon hole) for every 10^15 photons in the universe... What I am struggling to comprehend right now is how so many antiphotons could still exist... unless this massive hole in the galaxy was a very early collision. In other words, the massive hole could be evidence that all particles were created in equal proportion, instead of the CCT...

Dino... I think that depends on certain theories of string theory... Ben could answer this better than I, but by mathematical theory, they SHOULD EXIST.
 
Anti-photons?? I thought the photon was its own anti-particle.
Almost sure you are correct. Only charged particles have an anti-particle (of the opposite charge). For example, there is an anti-proton, but no anti-neutron.

If correct, Rieku and Superluminal are just exchanging and sharing their ignorance when speaking of the "anti-photon" and "anti-graviton", as neither has any charge.
 
Almost sure you are correct. Only charged particles have an anti-particle (of the opposite charge). For example, there is an anti-proton, but no anti-neutron.

If correct, Rieku and Superluminal are just exchanging and sharing their ignorance when speaking of the "anti-photon" and "anti-graviton", as neither has any charge.
Billy T,

Do you ever read any of my posts? Any idea whatsoever of my qualitative understanding of physics? Do you have any idea when someone is being sarcastic?

Guess what? I just read one or two of your posts and, due to the fact that you're not very good at English, I've decided that you must be an ignorant crackpot.

What do you think?
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

We may note in this connection that the last observation of a Supernova in our Galaxy was Kepler's Supernova, or Kepler's Star, in 1604. In a galaxy of our dimensions, there should be approximately one every 50 years. We now have a statistical expectation of 1/8, or p = 0.125.
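For reference, the standard way to quantify such an expectation is as a Poisson process. A minimal sketch in Python, assuming the quoted rate of one per 50 years (both the rate and the 2008 endpoint are assumptions of the sketch):

import math

# Assumed galactic supernova rate: roughly one per 50 years (the quoted figure).
rate_per_year = 1.0 / 50.0
years_since_kepler = 2008 - 1604   # ~404 years since Kepler's Star

# For a Poisson process the expected count is lambda = rate * time,
# and the probability of zero events is exp(-lambda).
lam = rate_per_year * years_since_kepler
p_zero = math.exp(-lam)

print(f"Expected supernovae since 1604: {lam:.1f}")       # ~8.1
print(f"Probability of observing none:  {p_zero:.1e}")    # ~3e-4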

"Because supernovae are relatively rare events, occurring about once every
50 years in a galaxy like the Milky Way,[5] many galaxies must be
monitored regularly in order to obtain a good sample of supernovae to
study.

Supernovae in other galaxies cannot be predicted with any meaningful
accuracy. When they are discovered, they are already in progress.[11] Most
scientific interest in supernovae (as standard candles for measuring
distance, for example) requires an observation of their peak luminosity. It
is therefore important to discover them well before they reach their
maximum. Amateur astronomers, who greatly outnumber professional
astronomers, have played an important role in finding supernovae,
typically by looking at some of the closer galaxies through an optical
telescope and comparing them to earlier photographs.

Towards the end of the 20th century, astronomers increasingly turned to
computer-controlled telescopes and CCDs for hunting supernovae. While such
systems are popular with amateurs, there are also larger installations
like the Katzman Automatic Imaging Telescope.[12] Recently, the Supernova
Early Warning System (SNEWS) project has also begun using a network of
neutrino detectors to give early warning of a supernova in the Milky Way
galaxy.[13][14] A neutrino is a particle that is produced in great
quantities by a supernova explosion,[15] and it is not obscured by the
interstellar gas and dust of the galactic disk.

Supernova searches fall into two classes: those focused on relatively
nearby events and those looking for explosions farther away. Because of
the expansion of the universe, the distance to a remote object with a
known emission spectrum can be estimated by measuring its Doppler shift
(or redshift); on average, more distant objects recede with greater
velocity than those nearby, and so have a higher redshift. Thus the search
is split between high redshift and low redshift, with the boundary falling
around a redshift range of z = 0.1-0.3,[16] where z is a dimensionless
measure of the spectrum's frequency shift.

High redshift searches for supernovae usually involve the observation of
supernova light curves. These are useful for standard or calibrated
candles to generate Hubble diagrams and make cosmological predictions. At
low redshift, supernova spectroscopy is more practical than at high
redshift, and this is used to study the physics and environments of
supernovae.[17][18] Low redshift observations also anchor the low distance
end of the Hubble curve, which is a plot of distance versus redshift for
visible galaxies.[19][20]" Wikipedia: Supernovae

Let us keep the frequency of Supernova generation at this low statistical level.

All the children will thank you for your kind efforts on their behalf.

All Best Wishes to one and all for
A Very Merry Christmas and Most Happy New Year !!!

Yours sincerely,

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, CERN, BROOKHAVEN AND LOS ALAMOS

As for the prediction of Supernova generation from those vast energies now
having onset in the Large Hadron Collider at CERN in Geneva, Switzerland, in May 2008, we note, in reference to the dimensional constraints described via string theory:

"With Elie Gorbatov, Banks and I have noted that among the states that have been studied, not only the cosmolgical constant but most or all of the parameters of ordinary physics are random variables. Some physical constants such as the cosmological constant or the electromagnetic coupling might be selected by environmental effects. But many, including heavy quark masses and the theta parameter, seem to have little consequence for the existence of stars or observers. In nature, however, those quantities exhibit intricate patterns that seem unlikely to result from random distributions. On can imagine resolutions, but the problem is a serious one."

Michael Dine, String theory in the era of the Large Hadron Collider, Physics Today, Vol. 60, No. 12, December 2007, pp. 33-39.

The accuracy of the predictions in this regard is thus largely in doubt. All the children will thank you for your kind actions on their behalf.

All Best Wishes for the New Year !!!

Yours sincerely,

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
SUPERNOVA FROM EXPERIMENTATION AT FERMILAB, BROOKHAVEN, CERN AND LOS ALAMOS

In contrast to the current problems with string theory noted in the preceding post, the essential predictive validity and reliability of the general theory of relativity has been demonstrated, as shown in the following paragraph. Let us hope that the preservation of our species, as well as the planetary ecosphere, will take precedence over theoretical disputes among the practitioners of highest-energy physics in this regard.

Please review Quantum tunnelling towards an exploding Universe? (Malcolm
J. Perry (1986) Nature 320, p. 679) as well as Dragging of Inertial Frames
(Ignazio Ciufolini (2007) Nature 449 (7158), 41-53). We note:
"Classically, transition from one type of solution to the other is
forbidden by the existence of a large potential barrier." Thus the
transition from the continuum to de Sitter space is only a function of
energy. The source of energy could be from natural sources, i.e., the
implosion of a stellar envelope, conditions existing in the early
Universe, or via high-energy physics experimentation. We now have an
empirical experimental test of the generalization of the equations in the
General Theory of Relativity in the Einstein-de Sitter Universe, as it is
now termed, paid for with billions of our tax dollars. We, therefore,
await the tragic confirmation of the Exploding Universe via the
generation of a Type Ia Supernova at the Fermi National Accelerator
Laboratory in Batavia, Illinois, or in May 2008 at CERN, with those
energies found some 10^-9 to 10^-14 seconds subsequent to the Big Bang at
the point of origin of the Universe. The excellent Dragging of Inertial
Frames article, in its review of the findings concerning the General
Theory of Relativity, indicates the confirmation of the theory's
predictions up to the limits of current astrophysical observational
measurement. Let us not confirm this theory once again with the
generation of a Type Ia Supernova in our planetary neighborhood.

All the children will thank you for your kind efforts on their behalf.

Yours sincerely,

Paul W. Dixon, Ph.D.
Supernova from Experimentation
 
Paul, if smashing together protons with 10 TeV energies could cause a shift in the vacuum, then it would have happened long, long ago. All the time the upper atmosphere is bombarded by high-energy protons, i.e., cosmic radiation. A huge number of them are at the energies CERN will be reaching. Some of them are trillions of times more powerful!

In fact, some of them are a billion trillion times more powerful!
 
"...the children will thank you for your kind efforts on their behalf."

We must keep an eye out for the children, indeed.
 
Paul, if smashing together protons with 10 TeV energies could cause a shift in the vacuum, then it would have happened long, long ago. All the time the upper atmosphere is bombarded by high-energy protons, i.e., cosmic radiation. A huge number of them are at the energies CERN will be reaching. Some of them are trillions of times more powerful!

In fact, some of them are a billion trillion times more powerful!
This is both true and has been pointed out to Paul several times before, but he wishes to focus on the energy in the entire colliding beams.

I have pointed out that their individual collisions are very separate events; i.e., I imagined a "time magnifying glass" to expand the time scale of the sequence of individual collisions to a more human time scale.

At the worst case, one collision would typically last about a second* (viewed via the "time magnifier") and the next one would occur about a week later (or perhaps only the next year for the typical case). Thus, it is nonsense to consider these widely separated-in-time events as if they were one collective event. Hence the "cosmic ray" safety-factor argument certainly is valid. (My analysis used in the construction of the "time magnifier" is many pages back, so I have not given the post number.)**
-------------------
*The protons are very small and moving at essentially the speed of light. Thus, to get to a one-second duration for a single proton-on-proton collision, the time magnification is huge. The space between protons (due to their mutual repulsion) is also huge compared to their diameter. This leads to the weeks (or years) between collisions, seen via the time magnifier.

**It was easier to outline the argument than to find the old post.
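For what it's worth, a minimal numerical version of that "time magnifier" argument. Every input below is a rough assumption for illustration (proton size, a ~7.55 cm bunch length, ~20 collisions per bunch crossing), not an official LHC figure:

# A rough numerical version of the "time magnifying glass": stretch one p-p
# collision to last one second and see how far apart successive collisions
# in a bunch crossing then appear. All inputs are illustrative assumptions.
c = 3.0e8                        # speed of light, m/s
proton_diameter = 1.7e-15        # m, order of a proton diameter

# Duration of one collision ~ light-crossing time of ~2 proton diameters:
collision_duration = 2 * proton_diameter / c

# Assumed bunch length ~7.55 cm and ~20 p-p collisions per bunch crossing:
crossing_time = 7.55e-2 / c
gap_between_collisions = crossing_time / 20

magnification = 1.0 / collision_duration     # so one collision lasts 1 s
magnified_gap = gap_between_collisions * magnification

print(f"Collision duration:      {collision_duration:.1e} s")
print(f"Gap between collisions:  {gap_between_collisions:.1e} s")
print(f"Magnified gap: {magnified_gap:.1e} s "
      f"(~{magnified_gap / 3.15e7:.0f} years)")

On these inputs the magnified gap comes out in the tens of thousands of years, so if anything the "week" above is conservative.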
 
Paul, if smashing together protons with 10 TeV energies could cause a shift in the vacuum, then it would have happened long, long ago. All the time the upper atmosphere is bombarded by high-energy protons, i.e., cosmic radiation. A huge number of them are at the energies CERN will be reaching. Some of them are trillions of times more powerful!

In fact, some of them are a billion trillion times more powerful!
I am not particularly worried about a mishap at the LHC, but I do see some misleading statements made in the popular press, as well as here.

The most powerful cosmic ray ever recorded was estimated as having an energy 42 million times (not trillions of times) more powerful than will be produced by a 14 TeV proton-proton collision at the LHC. But the LHC will have billions of protons in each bunch, and over 2800 bunches in the ring at one time during full operation. The bunches will collide at a 25 nanosecond spacing during full-power operation. That single most powerful cosmic ray was said to have had the kinetic energy of a fast-pitched baseball. I have seen an estimate for the total kinetic energy carried in a beam at the LHC as being that of a cruising aircraft carrier. The older colliders could have the kinetic energy of a 100 ton locomotive. Also, the LHC will collide heavy ions (lead) with energies up to 1,150 TeV per collision. There will be up to 592 bunches of the heavy ions in the beam at a time, with tens of millions of heavy ions per bunch. If I recall correctly, the collision spacing between these bunches will be around 100 nanoseconds.

Billy T, these bunches of millions/billions of particles are very tiny in size, and the collision should take much less than a nanosecond for all the particles in a pair of bunches. I do not know how many actual collisions will take place when a pair of bunches containing billions of protons collide.
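As a rough cross-check of these ratios, a sketch comparing lab-frame energies, plus the fairer centre-of-mass comparison; the 3×10^20 eV record energy is the commonly quoted estimate and is an assumption here:

import math

# Compare the most energetic cosmic ray on record with an LHC collision.
cosmic_ray_ev = 3.0e20    # eV, commonly quoted record (lab frame)
lhc_cm_ev = 14.0e12       # eV, 7 TeV + 7 TeV proton-proton collision

# Naive lab-frame ratio (the basis of figures like "42 million times"):
print(f"Lab-frame ratio: {cosmic_ray_ev / lhc_cm_ev:.1e}")   # ~2e7, tens of millions

# Fairer comparison: centre-of-mass energy of the cosmic ray striking a
# proton at rest, sqrt(s) ~ sqrt(2 * E * m_p c^2) for E >> m_p c^2:
m_p_ev = 0.938e9          # proton rest energy, eV
sqrt_s = math.sqrt(2 * cosmic_ray_ev * m_p_ev)
print(f"Cosmic-ray sqrt(s): {sqrt_s / 1e12:.0f} TeV vs the LHC's 14 TeV")

Either way the cosmic ray wins by a wide margin, which is the point being made.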
 
...The most powerful cosmic ray ever recorded was estimated as having an energy 42 million times (not trillions of times) more powerful than will be produced by a 14 TeV proton-proton collision at the LHC. But the LHC will have billions of protons in each bunch, and over 2800 bunches in the ring at one time during full operation. The bunches will collide at a 25 nanosecond spacing during full-power operation. That single most powerful cosmic ray was said to have had the kinetic energy of a fast-pitched baseball. I have seen an estimate for the total kinetic energy carried in a beam at the LHC as being that of a cruising aircraft carrier. The older colliders could have the kinetic energy of a 100 ton locomotive. Also, the LHC will collide heavy ions (lead) with energies up to 1,150 TeV per collision. There will be up to 592 bunches of the heavy ions in the beam at a time, with tens of millions of heavy ions per bunch. If I recall correctly, the collision spacing between these bunches will be around 100 nanoseconds.

Billy T, these bunches of millions/billions of particles are very tiny in size, and the collision should take much less than a nanosecond for all the particles in a pair of bunches. I do not know how many actual collisions will take place when a pair of bunches containing billions of protons collide.
I do not recall exactly what I assumed in my "time magnifier" analysis, but I think it was that the bunch was at least a meter long and 1 cm in diameter and had about a million collisions.

Perhaps you will both give your reference and make an independent computation of the ratio of the time between individual collisions to the duration of each. If this ratio is greater than 10, it is very unreasonable to consider all the collisions as one event, instead of as a sequence of many isolated events. (Note this only considers their isolation in time; they are isolated in space also. If their space isolation is a factor of 20 or more, IMHO, that too makes them "isolated events", even if by chance they were simultaneous.)

By stating that AlphaNeumeric's post was true, I was only referring to the main point, i.e., that very energetic cosmic rays are at least thousands of times higher-power events and greater energy-density events than man can make. Paul is never very clear about which is the critical factor (power or energy density). I would think both must be very high to "punch thru" to de Sitter space, and I do not see any "one way only" sign on the barrier wall; i.e., why has not that fantastic energy density on the other side of the barrier "punched thru" to our universe?
 
The most powerful cosmic ray ever recorded was estimated as having an energy 42 million times (not trillions of times) more powerful than will be produced by a 14 TeV proton-proton collision at the LHC. But the LHC will have billions of protons in each bunch, and over 2800 bunches in the ring at one time during full operation. The bunches will collide at a 25 nanosecond spacing during full power operation. That single most powerful cosmic ray was said to have had the kinetic energy of a fast-pitched baseball. I have seen an estimate for the total kinetic energy carried in a beam at the LHC as being that of a crusing aircraft carrier.
Except that you have to realise that that particle is by no means unique, either in our atmosphere or in the natural phenomena which created it.

Then you factor in the fact that such high-energy particles shed pions in copious quantities as they move through the vacuum and experience a deceleration effect too. Hence their initial energy would be even higher, but they burn themselves out until they drop below the GZK limit of around 10^19 to 10^20 eV.
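That 10^19 to 10^20 eV figure can be recovered from the threshold for pion photoproduction on CMB photons; a minimal sketch, with the mean CMB photon energy as the main assumption:

# Threshold energy for p + gamma_CMB -> p + pi0, the mechanism behind the GZK cutoff.
# Energies in eV throughout; the CMB photon energy is a typical thermal value.
m_p = 0.938e9      # proton rest energy, eV
m_pi = 0.135e9     # neutral pion rest energy, eV
e_cmb = 6.3e-4     # eV, roughly the mean CMB photon energy at T = 2.725 K

# For a head-on collision, threshold requires s = (m_p + m_pi)^2 with
# s = m_p^2 + 4 * E_p * e_cmb (ultra-relativistic proton), so:
E_threshold = ((m_p + m_pi) ** 2 - m_p ** 2) / (4 * e_cmb)
print(f"GZK threshold estimate: {E_threshold:.1e} eV")   # ~1e20 eV, as quoted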

Yes, I was a little over the top, but billions is not out of the question.

And your comment about billions of protons is moot by similar logic. Any astrophysical phenomenon which can produce such high-energy particles is going to produce a fair few of them; few astrophysical phenomena happen in quantities of only a few million atoms/particles. Yet we don't see the sky vapourising along the trajectories that such high-energy cosmic rays come from.

Anything we can produce, nature has been doing at higher energies and in larger quantities for a very long time.
 
Billy T,
I do not recall exactly what I assumed in my "time magnifier" analysis, but I think it was that the bunch was at least a meter long and 1 cm in diameter and had about a million collisions.

Perhaps you will both give your reference and make an independent computation of the ratio of the time between individual collisions to the duration of each.
I think you are vastly overestimating the length of a bunch. I remembered that the bunch lengths were supposed to be only a few centimeters, but I looked up a source from CERN for the beam parameters. The bunches are about 7.55 centimeters long. With each bunch traveling at essentially the speed of light, a bunch will travel one foot in a nanosecond. That bunch is colliding with another bunch travelling at the same speed, so that is why I said the collision would take much less than a nanosecond, considering their short lengths. Here is a PDF chart (I doubt it will copy correctly) and a link:

Table 2.1: LHC beam parameters relevant for the peak luminosity

Beam Data                                    Injection    Collision
Proton energy [GeV]                          450          7000
Relativistic gamma                           479.6        7461
Number of particles per bunch                1.15 x 10^11
Number of bunches                            2808
Longitudinal emittance (4 sigma) [eVs]       1.0          2.5
Transverse normalized emittance [µm rad]     3.5          3.75
Circulating beam current [A]                 0.582
Stored energy per beam [MJ]                  23.3         362

Peak Luminosity Related Data
RMS bunch length [cm]                        11.24        7.55
RMS beam size at IP1 and IP5 [µm]            375.2        16.7
RMS beam size at IP2 and IP8 [µm]            279.6        70.9
Geometric luminosity reduction factor F      -            0.836
Peak luminosity in IP1 and IP5 [cm
https://edms.cern.ch/file/445830/5/Vol_1_Chapter_2.pdf
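A quick check of the sub-nanosecond bunch-crossing claim from the 7.55 cm figure in that table (order of magnitude only; geometry factors are glossed over):

# Two counter-propagating bunches of length L pass through each other in
# roughly L/c (order of magnitude), since they close at nearly 2c.
c = 3.0e8               # m/s
bunch_length = 0.0755   # m, RMS bunch length at collision energy (from the table)

crossing_time_ns = bunch_length / c * 1e9
print(f"Approximate bunch-crossing time: {crossing_time_ns:.2f} ns")   # ~0.25 ns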
 
To too inquisitive: Thanks for the data. I am going to bed now, but will just note that what you are calling a "collision duration" is for the "bunch collision". I was taking the time for light to travel a few nuclear diameters and comparing that to the time between collisions likely to be within about a factor of 20 of that first collision (or something like that), to try to see if the individual nuclear collisions could be considered to all be one event. I conclude NO: they are separate events, separated in both time and space too much to be "one event" instead of a sequence of repeated events with much less energy and energy density than a high-energy cosmic ray hitting an O or N nucleus in the upper atmosphere.

I will try to get to the site tomorrow to see how "fat" the beam is, and take 8 cm for its length.

To me a good analogy is the collisions of stuff orbiting the Earth. In ten million years or so there could be a lot of collisions, but each is very brief and widely separated from the others in time and space, so it would be wrong to consider all these collisions as one event.
 
To too inquisitive: Thanks for the data. I am going to bed now, but will just note that what you are calling a "collision duration" is for the "bunch collision". I was taking the time for light to travel a few nuclear diameters and comparing that to the time between collisions likely to be within about a factor of 20 of that first collision (or something like that), to try to see if the individual nuclear collisions could be considered to all be one event. I conclude NO: they are separate events, separated in both time and space too much to be "one event" instead of a sequence of repeated events with much less energy and energy density than a high-energy cosmic ray hitting an O or N nucleus in the upper atmosphere.
I understand your point, Billy T. The individual particles are not 'touching', and you are considering the energy produced when two bunches collide as a chain of individual particle collisions instead of a single event. The same thing happens in a hydrogen bomb explosion, but I think of it as a single explosion instead of the chain reaction that happens. Is it just a matter of semantics?

And the bunches are only 16 microns in diameter when they collide. Here is another site that is easier to read for those of us not specially educated in the complicated collider mathematics:

Beam
- Each proton beam at full intensity will consist of 2808 bunches per beam.
- Each bunch will contain 1.15×10^11 protons per bunch at the start of nominal fill.
- Bunches are a few cm long, with transverse dimensions of the order of a mm, but in a collider as small as possible at the collision point (LHC: 16 microns fully squeezed).
- The particles in the LHC are ultra-relativistic and move at 0.999997828 times the speed of light at injection and 0.999999991 times the speed of light at top energy.
Energy in beam

Total beam energy at top energy, nominal beam, 362 MJ


2808 bunches × 1.15×10^11 protons @ 7 TeV each = 2808 × 1.15×10^11 × 7×10^12 × 1.602×10^-19 J = 362 MJ per beam



British aircraft carrier

HMS Illustrious and Invincible weigh 20,000 tons all-up and fighting, which is 2×10^7 kg. These are babies compared with the USS Harry S. Truman (Nimitz-class) at 88,000 tons.

Energy of nominal LHC beam = 362 MJ or 3.62×10^8 J

so (1/2)mv^2 = 0.5 × 2×10^7 × v^2 = 3.62×10^8
so v^2 is 36.2 and v is 6.0 m/s or 11.7 knots (or around 5.6 knots if you're an American aircraft carrier)

(1 knot = 1.852 km/hour)

Subaru equivalent
Kerb weight 3140 kg plus a light person: 3200 kg

(1/2)mv^2 = 0.5 × 3200 × v^2 = 3.62×10^8 gives 1712 kph



Melting Copper
Melting point of copper: 1356 K. Our magnets are at 2 K, so the temperature rise needed is 1354 K.

Specific heat capacity of copper: 385 J kg^-1 K^-1

Specific latent heat of fusion (energy required to convert a solid at its melting point into a liquid at the same temperature): 205000 J kg^-1

So to melt 1 kg of copper in the LHC we need (1354 × 385 + 205000) J.

With one beam (362 MJ) we can melt 362×10^6 / (1354 × 385 + 205000) kg = 498.4 kg of copper.

So at nominal beam current the two LHC beams together could melt nearly one tonne of copper.
http://lhc-machine-outreach.web.cern.ch/lhc-machine-outreach/beam.htm
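The page's back-of-envelope numbers reproduce; a short script re-running them with the constants quoted above:

# Total beam energy: bunches * protons/bunch * 7 TeV per proton, in joules.
E_beam = 2808 * 1.15e11 * 7e12 * 1.602e-19
print(f"Beam energy: {E_beam / 1e6:.0f} MJ")                    # ~362 MJ

# Aircraft carrier (2e7 kg) with the same kinetic energy:
v_carrier = (2 * E_beam / 2e7) ** 0.5
print(f"Carrier: {v_carrier:.1f} m/s = {v_carrier / 0.5144:.1f} knots")  # ~6.0 m/s, ~11.7 knots

# Subaru plus light person (3200 kg):
v_car = (2 * E_beam / 3200) ** 0.5
print(f"Car: {v_car * 3.6:.0f} km/h")                           # ~1712 km/h

# Copper melted by one beam: heat from 2 K to the 1356 K melting point,
# then supply the latent heat of fusion.
heat_per_kg = 1354 * 385 + 205000                               # J/kg
print(f"Copper melted: {E_beam / heat_per_kg:.0f} kg per beam") # ~498 kg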
 
I understand your point, Billy T. The individual particles are not 'touching', and you are considering the energy produced when two bunches collide as a chain of individual particle collisions instead of a single event. The same thing happens in a hydrogen bomb explosion, but I think of it as a single explosion instead of the chain reaction that happens. Is it just a matter of semantics?...
I am inclined to consider a set of collisions as a sequence of independent events, not one event, iff the duration and volume of each individual collision event is "significantly" separated in space and/or time from the next individual collision. Here is how my "significantly" was evaluated:

The time separation is estimated in this case as follows: you can know the total number of collisions (project the beam-bunch diameter and full length onto a 2D cross-section and see how many of the nuclear diameters "touch"). Once one knows the number of collisions and the time for one bunch to pass through the other, a reasonable estimate for the time between collisions is just this bunch-collision time divided by the number of collisions. (This assumes collisions are uniformly distributed in time; better would be a "tent model" with zero collisions as the bunches just touch and maximum collision rate when they exactly occupy the same space.)
The duration of each collision is reasonably estimated as how long light takes to travel two nuclear diameters. Thus the "separation in time" ratio can be estimated.

The separation in space can be estimated by comparing the volume of the bunch to the cube of two nuclear diameters. (A cube model, instead of a sphere, to be conservative.)

I would then take the product of the total number of collisions, times the tiny "time duration ratio", times the tiny "collision volume ratio", to see how many individual collisions are likely to be at the same point in space and time.

If "n" collisions are expected to be at same point in space and time, then certainly n times the individual collision energy can be considered to be the "single event energy." With roughly this approach, I came previously to conclusion that n = ~0, but the bunch is much denser than I was assuming. I am too lazy to do it again. Perhaps someone will.

So at nominal beam current the two LHC beams together could melt nearly one tonne of copper.
Amazing! Only guessing, I would have been off by a factor of 100 (or more). At least, the LHC is the world's greatest spot welder! :eek::D:cool:

PS: I too think of the H-bomb as one event and have not computed, by the above approach, how many fusions are at the "same space and time", so perhaps it too is a sequence of fusion events. I do not think the physics is much different from the sum of the individual events, but believe that is not completely true, as I think the X-ray and gamma-ray radiation pressure* is important in the dynamics of the explosion. If one can show (by something like the above approach) that all individual events are likely to be separated in space-time, then I do not think it is only semantics (or a human-time-scale POV) to say that there is just a sequence of events, not one event.
------------------
*This radiation makes it complicated, as there are two very different time scales: the fusion time scale and the radiation's transit-of-the-fireball time scale. Perhaps Paul can argue that the "exotic daughter products" from the individual collisions make the LHC beam problem more complex also, as they live much longer than the individual collisions.
 