Many believe that human consciousness is algorithmic, that self-awareness is simply an emergent property of a large number of logic gates. This must be at least partially true. Our brains do share a great many traits with computers, and efforts to mimic neurons have yielded useful algorithms in computer science. Yet this view misses a critical aspect of our self-awareness: our ability to feel. Are logic gates adequate to create the quality of the experience of seeing a color, hearing a note, or feeling pain or pleasure? They seem far more appropriate when we consider our faculty of reason than when we ponder feeling.
It is important to distinguish the experience itself from the behavior it produces. One could easily imagine a robot that perfectly duplicates the pain reaction without actually feeling anything.
I think that when we use the term “consciousness” we are describing a fusion of the ability to process logic and the ability to feel, to attach emotional content to ideas. Of the two, the ability to feel is the more fundamental mystery to consider when trying to understand consciousness, and the more difficult one to define or explain.