The Pincho Paxton Universe generator

The Universe includes predetermined collisions; they are called Entanglement.
Two errors.
You are supposing these collisions are predetermined.
Entanglement is not "collisions" - you're back to misusing scientific terms again.

I use the same collision as the Universe; my program is an emulation of the Universe. No cheating.
No, your programme is specious crap. With no basis in reality.
 

Entanglement is the update of the collision grid. You can call it something else, but you will be wrong.
 
Ho hum.
And what, if any, evidence do you have that you're right?
 
And how can you write a programme for something based on... bugger all. Which is the sum total of what you have so far.
And, for your information, computer simulations do not provide proof. Ever.
 
Why would I call it a "maths proof" since you have declared that you're not using maths?

Where did 'it a' come from?

A computer simulation that just bumps particles together, and nothing more. No cheating with G, or T, or M, or C. A Neural Network of the Universe that builds itself from a few lines of code. It does what the particles can do, just bump together. I don't aim them, I don't give them reasoning, I give them a shell, and a hole, and I bump them. Part of the shell is stored in the hole... entanglement. The stored shell updates the collision through a hole. But first you need to start on day 1. Two shells that bump. No cheating.. no maths.
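Taken at face value, the bump-and-store mechanism described above fits in a few lines. This is a purely illustrative sketch, not the author's actual program (none is shown in the thread); the `Particle` class, the `bump` function, and the ±1 shell values are all invented here.

```python
class Particle:
    """Illustrative particle: a 'shell' value plus a 'hole' that remembers
    part of the last partner's shell (what the post calls entanglement)."""
    def __init__(self, shell):
        self.shell = shell
        self.hole = None  # empty until the first bump

def bump(a, b):
    # On collision, each particle stores the other's shell in its hole;
    # the stored shell is what later 'updates the collision'.
    a.hole, b.hole = b.shell, a.shell

# Day 1: two shells that bump, with opposite values so their sum is zero.
p, q = Particle(+1), Particle(-1)
bump(p, q)
```

After one bump, each hole remembers the other particle's shell, and the two shells still sum to zero.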
 
Where did 'it a' come from?
Note that that sentence no longer exists.

A computer simulation that just bumps particles together, and nothing more.
So what?
Oh, one little hint for you - a computer uses maths...

No cheating with G, or T, or M, or C. A Neural Network of the Universe that builds itself from a few lines of code. It does what the particles can do, just bump together. I don't aim them, I don't give them reasoning, I give them a shell, and a hole, and I bump them.
And you have predetermined collisions. How is that not aimed?

Part of the shell is stored in the hole... entanglement. The stored shell updates the collision through a hole. But first you need to start on day 1. Two shells that bump. No cheating.. no maths.
TWO shells?
What happened to "a single particle.. a Pinchon"
 

A computer does not know what I am trying to do. So, if you don't give it physics formulas, you don't tell it to do anything. The result of not telling it to build the Universe, and ending up with a Universe, is my proof.

They are remembered from their last bump as a shell in a hole, a photon. Don't go towards the shell in the hole; you just bumped something there.

There are two. Their sum adds up to nothing. So in a way, there are none.
 
A computer does not know what I am trying to do. So, if you don't give it physics formulas, you don't tell it to do anything.
If you don't tell a computer to do anything it does nothing.
The programme you (claim you) are writing will contain your assumptions and how they operate. Assumptions that so far have not been shown to have any validity. Garbage in, garbage out.

The result of not telling it to build the Universe, and ending up with a Universe, is my proof.
Then you have a low standard of "proof".

They are remembered from their last bump as a shell in a hole, a photon. Don't go towards the shell in the hole; you just bumped something there.
So you're changing your mind again...

There are two. Their sum adds up to nothing. So in a way, there are none.
And more bullshit.
 

First part doesn't make any sense.

Creating the Universe doesn't prove how the Universe was created? Science is in trouble then.

How can I change my mind on a part of my program that is already written? What are you talking about? I told you it was Entanglement right from the start.

The problem is that your science is so far off the mark that you can't even understand the real solution.
 
First part doesn't make any sense.
Then you should learn to read.
A computer only works from what you give it.

Creating the Universe doesn't prove how the Universe was created? Science is in trouble then.
But you won't be creating a universe.

How can I change my mind on a part of my program that is already written? What are you talking about? I told you it was Entanglement right from the start.
1) store pre-determined collisions
2) They are remembered from their last bump as a shell in a hole, a photon.

The one is not the other.
 

A/ The computer will bump particles together; about 7 rules will create the universe: expand, shrink, bump, store in hole, check hole, move, rotate hole.
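Read as a program skeleton, the seven rules above could be scheduled like this. Every rule body below is a placeholder invented for illustration; the post names the rules but never defines what any of them computes.

```python
# Placeholder behaviours for the seven named rules (all invented here).
def expand(p):        p["size"] += 1
def shrink(p):        p["size"] -= 1
def bump(p):          p["bumps"] += 1
def store_in_hole(p): p["hole"] = p["shell"]
def check_hole(p):    p["checked"] = p["hole"] is not None
def move(p):          p["x"] += 1
def rotate_hole(p):   p["angle"] = (p["angle"] + 90) % 360

RULES = [expand, shrink, bump, store_in_hole, check_hole, move, rotate_hole]

def step(p):
    """Apply the seven rules to one particle, in the listed order."""
    for rule in RULES:
        rule(p)
    return p

particle = step({"size": 0, "bumps": 0, "shell": 1, "hole": None,
                 "checked": False, "x": 0, "angle": 0})
```

One step leaves the size unchanged (expand then shrink cancel), records one bump, stores the shell in the hole, and moves and rotates once.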

B/ I am using the known properties of experiments. 6 storage holes.. Quarks. Gravity, bump. Entanglement, magnetism, particle / wave duality, entropy, DNA, cell division, reflection, mass, photons, intelligence, sentience.

All from bumping particles together.. no cheating.

1 and 2 are the same thing. Last bump doesn't mean next bump. Last bump means knowing that you have already bumped in that direction, so go in the other direction.. a pre-determined decision not to even bother checking in that direction.
 
Word salad again.
How are you using known properties if you're not using science?
 

I am using the results from science, but not as science. Science calls 'Action At A Distance' Quantum Physics. I only have 7 rules; I don't even have as many as Physics, let alone adding Quantum Physics to the list. Just Pinchoism.
 
Pincho,

[I apologise for the word soup; the following is something that I should be having a discourse over with multiple people, so that between myself and them something more digestible can emerge.]

Swap your collision nodes for emulations of a cm3 volume, housed in separate universes' neural network arrays (one world could technically support many dimensional clouds that formulate the cm3 emulation), and you might be a little closer to the truth.

I initially looked at emulation by particle or atom; however, the problem with locking into particles or atoms is that "they didn't exist to begin with". The suggested course is actually pure energy, which incidentally would be measured by vacuum space (cm3).

It's suggested that the initialisation phase is just one world creating that one small vacuum space. The plan is to utilise the generation of a temporal paradox to create multiple instances of the same emulated node, infinitely in all dimensional directions (or at least to a radix horizon signified by the inverse square law).

Based upon the theory that "you can't make a parallel without a universe's worth of energy", it can be suggested that a parallel plays a part in making the paradox of what the beginning of the universe is; therefore it incorporates a pointer into the universe at the beginning, so you don't have to create a parallel; you just hook into one that already exists.

This is basically like initialising a String variable at the beginning of a program, to make sure that the pointer has been set and isn't violated by alternative data. You can then use the pointer during the program and assign it a new value.
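The programming practice being referenced is real: bind a variable at program start so the name always exists, then reassign it once the real value is known. A minimal Python rendering (the names `tag` and `pointer` are made up here):

```python
# Without initialisation, using a name before binding it fails:
try:
    print(tag)            # 'tag' was never initialised
except NameError:
    tag_was_unset = True

# The practice described: bind the name at program start...
pointer = ""              # initialised up front, so it always exists
# ...and rebind it later once the real value is known.
pointer = "existing parallel"
```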

From a multiworlds perspective, this implies that you'd have an infinitely sized universe with initialised "pointers" all initially set to the same values. The real magic, however, is utilising methods to create not just paradoxes and differences between each node, but also the capacity to communicate as an overall structure.

As with any programming test, however, you initialise the system further with a LOAD TEST. After all, you need to know that your physical resources are capable of "running the program". This means ramping up the initial emulation node (which is auto-paralleled by all other nodes, because they haven't been assigned to be different; so from a multiworlds perspective all of "space" becomes energy and mass) to be filled to the point where it can't be filled any more, in this particular instance with pure digital energy [String Theory]. (Symbolically the Big Bang, or preferably "The Big Number Crunch".)

Once the initial LOAD TEST is complete, it's then possible to start manipulating those nodes to become different; this is where the creation of particles and matter occurs. You see, pure energy is great, however it doesn't have structure; particles are basically a construct. In some respects the system would mirror the concepts of Conway's Game of Life, with a very simplified method to create particles ("balls of string") and leave "space/spacing".
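For comparison, the Conway's Game of Life mentioned above is itself only a few lines: cells on a grid survive or are born according to live-neighbour counts. A standard minimal step function (textbook algorithm, not anything from the post):

```python
from collections import Counter

def life_step(live):
    """One Game of Life generation; `live` is a set of (x, y) live cells.
    A cell lives next step with exactly 3 live neighbours, or 2 if already alive."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A horizontal three-cell 'blinker' flips to a vertical one each generation.
blinker = {(0, 0), (1, 0), (2, 0)}
next_gen = life_step(blinker)
```

The blinker becomes vertical after one step and flips back the generation after.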

To stop there and just cut to the chase:
Imagine if every cm3 of vacuum space was emulated by one single world. That single world emulates it not as a single two-dimensional occurrence across the whole network, but by running multiple cloud systems together to compile a multidimensional representation for the emulation.

That single emulation (and world) is joined by an infinite number of emulations that all started with a "Singularity": a single world that created the first Kernel parameters, which were then manipulated into being duplicated infinitely across spacetime.

The initial Kernel would function much like a Linux kernel, in the sense that any update to physics through observation is automatically applied to the Singularity kernel. (Heh, a Singularity in this case is "an auto-updater for a kernel build".) However, each emulation then has its own parameters; those parameters might govern how to bridge interactions of atoms bonding together to create molecules, if each emulation was just an atom, or possibly how to communicate through a square-law interpretive "bridge".

(I guess what I'm saying is, if an atom exists in an emulation, perhaps there is a way to "talk" to it from our relative perception, to communicate with the universe that emulates it.)

Incidentally, Fermilab might like this interpretation, or maybe not if I didn't read what they were up to with their Holometer correctly.

Each world is in glorified 3D+; its interactions with other worlds are through "bridges", and those bridges use compressed information in 2D arrays. I have a concept that if the universe is left to its current devices, those 2D arrays would be linear in regards to time passage ("time's arrows moving in one direction"). However, I have a hypothesis for an encapsulation method, which basically means copying a single array, inverting its direction, and then reading both arrays together back to back. So while one array travels from A to Z, the other array travels from Z to A.
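The encapsulation method described above, read literally (copy a sequence, reverse the copy, read both together back to back), can be sketched as follows; `dual_read` and the sample letters are invented for illustration:

```python
def dual_read(arr):
    """Pair each forward-order element with its reversed-order counterpart,
    so one 'array' runs A-to-Z while the other runs Z-to-A."""
    return list(zip(arr, arr[::-1]))

pairs = dual_read(["A", "B", "C", "D"])
```

The first pair reads A alongside D, and the last reads D alongside A.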

I see the Doppler Effect as an example of this; it's just that we are still only linearly interpreting the information in one direction.
(Incidentally, if I'd applied these dual inverse 2D arrays to, say, the creation of a storage medium like a DVD, ideally each array would be written and read using a different laser colour, blue and red. I actually queried that "if the whole universe was a giant computer, its current limitation is that we build computers with a linear sequence. So if we wanted to be able to manipulate time, or utilise the system to its fullest, we'd require reworking the hardware to no longer follow the linear system we were used to; this means how the system processes data and how it stores data, and in both those cases the data is 2D arrays, even if the data itself contains information in greater dimensions".)
 

I don't know what you have posted. But I can't imagine a particle knowing all that if I don't. A particle has to be treated as a particle. It is a shell, and holes. No point adding more than a particle can handle.
 

Ah, but that's the entire point of an emulation system. You see, we increase the power of our computers, the available throughput in regards to how much information can be processed. This means that if we create an emulation of a cm3 area, the equipment and techniques used would evolve, so that over time the cm3 emulation can become ever more complex, as we forever look at smaller and smaller details.

It's much the same as saying that rather than limiting the information to a 12-decimal-place value, we can eventually start adding more decimals. There are obviously certain physics values that would remain "rounded", but that doesn't mean they were absolutely correct, just correct at the time, when the decimal placing hadn't gone further. (Obviously this touches on chaos theory, in regards to Lorenz's observation that feeding the same data at different decimal placings into a weather prediction system caused chaotically different results.)
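Lorenz's observation is easy to reproduce with a textbook chaotic system. Below, the logistic map at r = 4 is run from a full-precision start and from the same start rounded to three decimal places; the tiny rounding difference soon grows to order one. The map and starting value are standard illustrations, not taken from the post.

```python
def logistic_orbit(x, steps, r=4.0):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    orbit = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

x0 = 0.123456789012
full = logistic_orbit(x0, 50)
rounded = logistic_orbit(round(x0, 3), 50)  # same start, fewer decimals

first_gap = abs(full[0] - rounded[0])                     # still tiny
max_gap = max(abs(a - b) for a, b in zip(full, rounded))  # grows large
```

The two runs agree closely at first, then diverge as the rounding error is amplified on every iteration.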

While you say a particle "Doesn't need to be complex", you just don't necessarily know how much data that particle can actually contain or hold.

I guess you could say I see the universe as potentially being nanotechnology through this emulation method. Instead of commanding little robots, it's all about understanding how to "call" the code at the level the code exists at, to get the results that you want from the nanotech. (In fact it might be even less than picotech, but that's something for people to argue about in the future.)
 

Well, you do know how much data a particle can hold, just by looking at ants compared with humans. We can see that you need a lot of particles for intelligence. A single particle can maybe store 12 things at most. Science likes to give a particle credit for its mathematical knowledge. Unfortunately, science is a religion, and its members try to fob you off with their Bible. There is no magical maths to the Universe. It's just particles bumping together. You just need to know what a bump actually is.
 