This is for those advanced thinkers who may (or may not) have thought about what a computer actually is.
Information is physical; there is no computer anywhere in the universe that is not made out of real atoms.
(See if you can think of something that's made out of unreal atoms.)
Every atom that is real (has mass) is a computer, because it's made out of little 'particles' that are also individual computers.
Information is dimensional; there is no "one-dimensional" information anywhere in the universe.
The most fundamental, irreducible, logical form of information, which is mappable to any fundamental, logical computer, is 0 and 1, which has 2 dimensions of 'state'.
Without these 2 dimensions, there is no way to define an 'input' or 'output', or separate the computation from the information it transforms.
(You cannot transform information in just one dimension.)
The following is from a course on quantum information processing (as a heads-up to anyone who feels challenged by words with more than 2 syllables):
Abbas Edalat said: A computer is a physical machine, and any computation performed by such a machine is in essence a physical process. This is a simple factual statement, but it has a profound consequence. It can be logically argued from this premise that:
* the laws of [physical] computation depend on the physical laws obeyed by the computer machine under consideration, and,
* there are no absolute laws of computation valid for all computational machines.
The prominent logician/mathematician, Alan Turing, formulated the classical theory of computation in [the] 1930s. He assumed that computation is performed by an idealised mechanical computer (with potentially infinite storage capacity) obeying the classical laws of physics. This model, now called the Turing model of computation, has proved to be adequate for describing the computational process performed by mechanical or modern electronic computers.
...
At atomic scales the laws of classical physics, which are the basis of Turing Machines and the classical theory of computation, collapse. ... Every aspect of computing, storing information, loading and running programs and reading the output [is] governed by laws of quantum physics.
...
Quantum gates ...necessarily have a [discrete] evolution in discrete time. But physical systems evolve [continuously] in continuous time. Therefore, in order to ...implement a quantum gate as a physical system, we need to know how quantum systems evolve in continuous time.
--Department of Computing
Imperial College, London (from a course "Quantum Computing" 2001-2002)
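To make that last quoted point concrete, here's a minimal sketch (mine, not the course's) of a discrete gate falling out of a continuous-time evolution, assuming the standard single-qubit formalism and units where hbar = 1:

```python
import cmath

# A quantum gate is a snapshot of a continuous-time evolution. For a
# spin-1/2 in a magnetic field along z, H = (omega/2)*sigma_z, so
# U(t) = exp(-i*H*t) is diagonal and needs no matrix-exponential library.
def U(omega, t):
    return (cmath.exp(-1j * omega * t / 2),   # acts on the |0> amplitude
            cmath.exp(+1j * omega * t / 2))   # acts on the |1> amplitude

def apply(u, state):
    u0, u1 = u
    a, b = state
    return (u0 * a, u1 * b)

# Let |+> = (|0> + |1>)/sqrt(2) evolve for half a precession period,
# t = pi/omega: the relative phase picked up is pi, i.e. a Pauli-Z gate
# (up to an unobservable global phase), taking |+> to |->.
omega = 1.0
a, b = apply(U(omega, cmath.pi / omega), (1 / 2**0.5, 1 / 2**0.5))
print(round((b / a).real, 6))  # -1.0: the amplitudes now differ by a sign
```

Pick a different stopping time t and the same precession hands you a different phase gate; the gate is just the evolution, sampled.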
This is where the computational basis changes from classical electric and magnetic fields to quantum versions of those same fields, as 'inputs' which are operated on by unitary transforms in a complex probability space. [What does a quantum measure space look like? How many corrections do various mathematical models need in the various fields of real QC?]
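As a concrete toy example (my own sketch, not from the course): a gate acting on the complex amplitudes of a single qubit. The Hadamard gate turns a definite basis state into an equal superposition of both outcomes.

```python
# A 'component' acting on a qubit (a, b) = a|0> + b|1>.
# The Hadamard gate maps |0> to (|0> + |1>)/sqrt(2): a definite,
# classical-looking input becomes a superposition of both outcomes.
def hadamard(state):
    a, b = state
    s = 1 / 2**0.5
    return (s * (a + b), s * (a - b))

print(hadamard((1, 0)))  # (0.707..., 0.707...): equal weight on |0> and |1>
```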
Classical circuits operate on those fields with 'components' that transform them and create phase differences between them. Quantum circuits create these phase angles (between, say, electron spin states) by processing the fields with 'components' that act on superpositions of wavefunctions.
It's a matter of seeing what these fields "look like" to classical and quantum systems, what the components are and what they do to the 'input signals', and how to interpret or read the 'output signals'. One key difference is that quantum circuits have to 'produce' a classical output, something classical circuits do already. This is not trivial.
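On that last point, the readout step can be sketched as Born-rule sampling (my own illustration, assuming the standard measurement postulate):

```python
import random

# Measuring a qubit a|0> + b|1> in the computational basis: the quantum
# state collapses to a single classical bit, 0 with probability |a|^2 and
# 1 with probability |b|^2. This is the step a classical circuit never
# needs -- its output voltage already *is* a classical value.
def measure(state, rng=random):
    a, b = state
    p0 = abs(a) ** 2
    return 0 if rng.random() < p0 else 1

# An equal superposition reads out 1 about half the time.
plus = (1 / 2**0.5, 1 / 2**0.5)
random.seed(0)
samples = [measure(plus) for _ in range(10_000)]
print(sum(samples) / len(samples))  # roughly 0.5
```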
So who knows why an electron precessing in a known magnetic field isn't computing a spin phase? Why isn't the electron transforming the 'input', its spin, and performing a computation?
Anyone?