>>2600391
I'm not sure whether you're approaching this from a Materialist or Spiritualist perspective, and I'm also not sure which angle you assume I'm approaching it from (I'm a Materialist).
Consciousness is a property of "thinking" creatures that are capable of detecting and responding to their environment in a non-deterministic manner, and that store environmental information in a way that allows it to be recalled. At this tier of complexity alone, creatures are not conscious but merely sentient, meaning aware of their environment.
Consciousness arises as an emergent property of creatures that are able to apply "meaning" to environmental data through a process usually called intentionality. In short, it is the ability to add non-environmental data onto environmental data. To put it in simpler terms, the creature recognizes discrete objects (purely material phenomena), then overlays subjective properties onto them (soft, dangerous, tasty, etc.). This is important because it allows the conscious mind to manipulate the data as needed.
Human-level intelligence is a higher tier of complexity in this process. In addition to intentional states directed at various objects, labels can be appended to categorize large amounts of data and to create subjective relationships between phenomena.
The capacity to do that requires enormous amounts of memory storage. That's probably why some AI researchers think that a method called memory-based processing will let them create human-like AIs, but they are going down a dead end, of course.
There are two tricks that make human-level intelligence possible. The first is the ability to "understand", which, far from being a mystery, is merely the ability to cross-categorize, so that a rock can be a stone, a tool, a weapon, a memento, and so on. In short, it's the ability to take a blurry object and treat it as any number of clear subjective things. The second ability is language, which is obvious desu.
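If it helps, here's a toy sketch of what I mean by cross-categorization — purely my own illustration, not anyone's actual cognitive model; the names (`Percept`, `label`) are made up. One "blurry" percept carries any number of overlapping subjective labels layered on top of the raw environmental data:

```python
# Toy sketch of cross-categorization (illustrative only, not a real cognitive model).

class Percept:
    def __init__(self, raw):
        self.raw = raw           # the raw environmental data
        self.labels = set()      # subjective, non-environmental tags

    def label(self, *tags):
        # "intentionality": overlay subjective meaning onto the raw data
        self.labels.update(tags)

rock = Percept("grey lump")
rock.label("stone", "tool", "weapon", "memento")

# The same blurry object can now be treated as any of its clear subjective categories:
print("weapon" in rock.labels)   # True
print(sorted(rock.labels))       # ['memento', 'stone', 'tool', 'weapon']
```

The point of the sketch is just that the labels are disjoint from the raw data: nothing about "grey lump" entails "memento", which is what makes the categories subjective.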