That is like saying you don't need a brain to have memories. It's really meaningless, and I wonder why you would say such a thing.
Because when you get into what consciousness actually means within philosophy and science, you realise that what we observationally deduce as being 'conscious' often relates to things that don't have brains at all.
One of the current arguments raging in philosophy concerns AI - we're already reaching the point where we can create programs and devices that seem to demonstrate a frightening sense of free will.
And it looks set to create a huge divide over what consciousness as a process really needs to demonstrate to be worthy of that title.
There is a difference between having highly evolved parts of the brain, and an ant going about its job. Humans have developed the ability to betray their genes, e.g. suicide, abstinence...
Of course, which makes absolute sense - the more complex a system becomes, the more options for self-agency present themselves.
whereas the ant is just going on autopilot.
In fact, I think it's up for debate how conscious humans really are... Are we really 'conscious', or are we just going on autopilot... albeit in a slightly more advanced way than the ant?
I think it's much more helpful to look at consciousness as a gradient, running from fewer to greater options for self-determining expression.
I'm wary of just labeling things as 'insentient automatons', as I don't think it really gets us anywhere and you can't really prove anything on those terms - as you yourself seem to suggest.
But maybe our difficulty in understanding 'consciousness' comes from our difficulty in understanding how all the parts of the brain work together simultaneously.
The central problem seems to be: how do we get subjective experience from objective matter? Panpsychists/panexperientialists offer the solution that experience and 'isness' are innate to all matter, so that this base level of 'isness' provides the fundamental building blocks for higher grades of consciousness.
The central problem with emergent theories of consciousness seems to hinge on the question of how we go from physical matter to this immaterial stuff we call 'subjective experience'.
They're both such fundamentally opposed realities that it's hard to see how one could have emerged from the other - it's not like taking, say, steel and melting it into a liquid; we're going from one kind of fundamental to a completely new kind of fundamental.
And how can anyone say consciousness is immaterial? The brain has billions of cells, and it's common sense that consciousness does not occur 2 feet to the right of your head, does it?
It depends on how you think of the term 'immaterial', really. Consciousness is certainly immaterial in the sense that you can't measure its subjectivity; even if I could point to a neuron firing in your brain when you were thinking about death, is that physical matter the experience in itself?
It's a mind-bender when you really start to think about it.