In my previous article, I referred to mental constructs that I named "symbols", but I never explained exactly what I meant. I also referred to something composed of these entities: the "symbolic representation of the world humans have in their minds".
Well, a symbol is nothing more than what the word literally implies: a formal representation of something. Since we are on the topic of the human mind, these symbols are thoughts. Naturally, the entities symbols represent may or may not have a tangible form. In practice, this formalism is not very strict; the outline matters more than the sharpness of the mental picture, especially when a symbol is defined intuitively, through patterns, rather than in terms of other symbols.
These symbols may have attributes and be subject to editing; they may even interact with other symbols. From them, an object-oriented view of the world is created. This symbolic representation of the world that humans hold in their minds is what I call consciousness.
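To make the object-oriented analogy concrete, here is a minimal sketch in Python. The class, its attributes, and the two example symbols are hypothetical illustrations of my own, not a claim about how the brain actually stores anything:

```python
# A toy "symbol": an object with attributes, open to editing,
# able to interact with other symbols. All names here are invented
# purely for illustration.

class Symbol:
    def __init__(self, name, **attributes):
        self.name = name
        self.attributes = dict(attributes)
        self.relations = []          # links to other symbols

    def edit(self, **changes):
        """Symbols are subject to editing: revise the mental picture."""
        self.attributes.update(changes)

    def relate(self, relation, other):
        """Symbols interact: record a named relation to another symbol."""
        self.relations.append((relation, other.name))

apple = Symbol("apple", colour="green", edible=True)
tree = Symbol("tree", height="tall")

apple.edit(colour="red")         # the symbol changes; the entity need not
apple.relate("grows_on", tree)   # two symbols now interact

print(apple.attributes, apple.relations)
```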
Technically, the human brain works by storing data distributed across neurons. The data each neuron holds doesn't mean much on its own, but when the brain is fed a stimulus, it can recreate the relevant information by assembling data along a specific path through the neural network. Consciousness is like a virtual machine: it runs on the functions I described above, but implements a radically different way of representing the world, using relational memory and machine learning to create an object/symbol-oriented system.
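A crude sketch of this two-level picture, again in hypothetical Python (the fragments, paths, and function names are mine, chosen only to illustrate recreating information from a path through distributed data):

```python
# Low level: each "neuron" holds a fragment that means little on its own.
neurons = ["ap", "pl", "e ", "is", " r", "ed"]

# A stimulus activates a specific path through the network.
paths = {
    "round red fruit": [0, 1, 2, 3, 4, 5],
}

def recall(stimulus):
    """Recreate information by assembling data along the activated path."""
    return "".join(neurons[i] for i in paths[stimulus])

print(recall("round red fruit"))   # -> "apple is red"
```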
Even if this is a good model of consciousness, it should be noted that it is in no way absolute: not all human functions pass through consciousness, nor is this network of symbols implemented as cleanly as I imply. My point is that because consciousness is neither fully compatible with the human mental faculties nor covers them completely, a gap exists between it and the rest of the brain's functions.
I think this is why there is a perceived distance between mind and body. In any case, the tools consciousness provides enable humans to build a variety of object-oriented systems, which may or may not have anything to do with the real world. For instance, mathematics is a logical system composed of entities and relations, none of which have anything to do with the senses. It is also genuinely interesting to study how consciousness relates to and interacts with the "lower-level" functions. Using mathematics as an example once more, one may notice that while a problem's solution is a sequence of logical steps, most of the time the solution is found not algorithmically and symbolically but intuitively.
To sum up, consciousness is defined as a system of mental symbols that is founded on the human brain but operates with a completely different logic. In computing, logic programming and object-oriented languages are eventually compiled down to machine language, which is always procedural. I think a similar process takes place inside the human brain, which compiles the symbols of consciousness into data that can be stored in biological neurons.
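Python itself offers a handy, if loose, illustration of this compilation step: a high-level, object-style function is compiled down to a flat, procedural list of bytecode instructions. The function and class below are arbitrary examples of mine; only the `dis` module comes from the standard library:

```python
import dis

class Person:
    def __init__(self, name):
        self.name = name

def greet(person):
    # High-level, object-oriented surface...
    return "hello, " + person.name

# ...compiled into a linear, procedural instruction stream underneath,
# with opcodes such as LOAD_FAST, LOAD_ATTR, and RETURN_VALUE
# (exact names vary between Python versions).
dis.dis(greet)
```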
Quite an ambitious topic here! If I understand correctly, you're saying the brain has two levels, consciousness and the underlying neural functionality, and the two are separated from each other, somewhat analogous to a high-level computer language and a low-level assembly language? And you're looking for the analogous conversion (compilation) routine for the human mind? And one way to do this is by trying to quantify (define) the objects (symbols) that make up consciousness? (Or something like that?)
This all makes me think of Gödel, Escher, Bach by Douglas Hofstadter. Read it? If not, I would very highly recommend it. If so, well then I guess that explains this blog! It talks about some similar things, like how something like consciousness can arise from smaller, insignificant things. Or, more generally, how something can be greater than the sum of its parts. It also covers Gödel's incompleteness theorem quite thoroughly without going into all of the mathematics (that's the extent of my knowledge of it; I'm not a mathematician either). It becomes important because it illustrates how logical systems can fail. More or less it all centers around trying to logically analyze a paradoxical statement like "this sentence is false". A logical system just has to throw up its arms and give up, yet a human consciousness can easily see it for what it is.
The human brain, though, is clearly one complex beast, and you've set quite a task for yourself! I don't know much about artificial neural networks, though I must confess to being a bit of a skeptic about AI in general. Or rather, about making analogies between computer or logical functionality and the human mind or brain. But I don't mean to be discouraging; the attempt should be interesting!
Yes, I think my most important premise here is that there is a low-level system (the neural functionality) which implements a higher-level system (consciousness).
I don't really wish to delve into how this compilation (perhaps a better term would be "interpretation") happens, mostly because I'd say that this is the field of real research and science rather than vague theoretical philosophy.
The only reason I bother with definitions is that I think it's generally useful to have a more formal and accurate description of a "symbol" than the patterns the brain recognises.
The book you are suggesting seems really good. I think I'll get it. Thanks! I'm not sure how a logical system treats a paradoxical statement like the one you have there any differently than a human would: both would argue that the sentence makes no sense, as far as I know.
The analogy between the way humans and machines work is more poetic than ghastly, to my mind. It's only natural for a creator to imbue his creations with his own characteristics. This is a rather old idea, too: "So God created man in his own image" and all that.
Yeah, the incompleteness theorem may not be super relevant here; it's just sometimes used as the sort of thing that shows how formalized logical systems can run into problems. The paradoxical statement is problematic because Gödel found a way to encode a statement like it within mathematics as a valid statement, and also showed that it can't be proven true or false. Of course, if you do read GEB you'll see what I'm getting at, so I won't try and beat it to death here. It is a very interesting book and I think very relevant to your interests, though be warned, it's also a very big book! Fortunately it's not a linear narrative, so you can skip through parts that you find a bit tedious.
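For precision's sake, the sentence Gödel encoded isn't quite the liar: it asserts its own unprovability rather than its own falsehood. In modern notation it is usually written something like the following, where Prov is the system's provability predicate and the corner quotes denote the Gödel number of G:

```latex
G \;\leftrightarrow\; \neg\,\mathrm{Prov}(\ulcorner G \urcorner)
```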
I like your "God created man" analogy. (Now you've got me thinking about Battlestar Galactica!) If I may hijack it for my own means: if God created us, and we presume ourselves to be a step or two below God in the grand scale of things, wouldn't machines then be relegated to being a step or two below us? Sort of like the saying that we may be smart enough to understand an ant's brain, but an ant isn't. Similarly, we're not smart enough to understand our own brains. Just some thoughts.