Sure we can. We don't have to understand everything about how it works; we can emulate its structure and how it grows. We can make neural networks that program themselves to do new tasks. Even our own DNA doesn't contain the total plan for the brain, only a series of instructions on how to grow.
I'd agree here. We don't fully understand how our brain works, and it's really hard to, considering there isn't a whole lot of research on how people develop their personality. We've only managed to map out the direct correlations of body functions to the brain, such as motor reflexes, physiological homeostasis regulators, fight-or-flight responses, etc. Sensory functions have been mapped to certain regions, and emotional stimuli can be linked to the pituitary gland and its hormonal and chemical secretions, but as to how all that sensory stimulation and internal reception gets integrated... we still don't have a definite answer. I think it partly has to do with the fact that people's lives are so varied that it would be hard to control any of the factors affecting the development of cognitive characteristics within the frontal cortex.
However, it could theoretically be possible under extremely controlled conditions if we had a ton of genetically identical subjects, twins/triplets/clones or whatnot, tracked from the moment the eggs are fertilized into zygotes... but no way in hell would anybody find the means to justify such an experiment ethically.
Given enough flexibility and the ability to retain information and make decisions based on previous experiences or programmed emotional responses, a machine might get partway there. Emulating emotion is where it gets difficult, because an emotional response can sometimes only be faked with a specific set of weighted random outcomes: e.g., when the CPU registers what correlates to a painful experience in a human, there's a 10% probability it will learn from the pain, a 40% chance it will want revenge, and a 50% chance it will deal with it and learn. Since those percentages would still have to be calculated in advance, it wouldn't in fact be a genuine emotional response to the situation. On the hardware side, there have been recent advances in "memristors," which can retain their previous voltage or current state even after all power has been shut off. The technology is still primitive and expensive, but it could prove valuable by letting silicon-based machines retain information while turned off, cutting down boot times and possibly reducing board chipsets and space.
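To make the point concrete, here's a minimal sketch of that canned-response idea. Everything here is hypothetical: the response table and its 10/40/50 weights are just the example numbers from above, and `react_to_pain` is a made-up name, not any real API.

```python
import random

# Hypothetical "emotional response" lookup table, using the example
# probabilities from the discussion above (not measured values).
PAIN_RESPONSES = [
    ("learn from the pain", 0.10),
    ("want revenge", 0.40),
    ("deal with it and learn", 0.50),
]

def react_to_pain(rng=random):
    """Pick one canned response by weighted random choice."""
    outcomes = [name for name, _ in PAIN_RESPONSES]
    weights = [weight for _, weight in PAIN_RESPONSES]
    # random.choices does a weighted draw and returns a list of k picks.
    return rng.choices(outcomes, weights=weights, k=1)[0]
```

The sketch illustrates the objection rather than refuting it: because every "feeling" is drawn from a fixed, pre-calculated table, the machine is selecting a response, not having one.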
I guess in a way we would have the most difficulty emulating our "soul" or "ghost," whatever you want to call it. There would be no way to actually measure or determine a machine's ability to recognize itself; we have no way to even test another human being's cognitive abilities aside from measuring brain waves and topography. In essence, we'd only be emulating the abilities our brain has, such as calculating problems and deducing answers to social problems from the information presented, based on logical factors. Machines would be able to follow protocols within corporations without falling prey to human errors such as corruption, mistakes due to personal conflicts, illness, or lapses in attention. However, they would lack the human aspect of all that, which includes emotion and personality.
If it were up to me, and I were still alive by the time computers were advanced enough to be tested for "self-recognition", I would ask for a few things.
1) Compose a tune. It wouldn't even have to be a good one; it would have to be somewhat unique, but could also be a mix of previous works. Then it would have to tell me what its 5 favorite songs of all time are and why it likes them.
2) It would have to paint a piece of art or create some other artistic work.
3) It would have to write a poem, then interpret it.
4) It would have to find something it valued and explain its sentimental value.
These would be some of the questions I'd ask it. Reminds me of Robin Williams in Bicentennial Man. lol