So yesterday I came back to thinking about consciousness after some weeks away from it and, inevitably, the idea of robots with human consciousness came up again. I was also pointed in the direction of some interesting videos put on YouTube by the Dalai Lama, in which he and some scientists educated more in the Western, scientific tradition held a conference on mind and consciousness.
But it really all started a couple of days ago with a thought I had. I was sitting there, minding my own business, when suddenly I thought "Once we can create consciousness, procreation will be obsolete." (This thought assumes that "consciousness" is something that can be deliberately created. That is technically an assumption, and maybe a very big one.) My point in having this thought was that if we could replicate consciousness, which we might call our awareness that we exist and that there is a world around us, then we could put it (upload it?) into much better robot bodies than our frail, fleshly ones, which come with so many problems simply due to their sheer physical form. One can easily imagine that a carbon fibre or titanium (or carbotanium) body would last much longer and without any of the many downsides of being a human being. (Imagine being a person but not needing to eat or go to the toilet. Imagine not feeling tired or sick.)
So the advantages immediately become apparent. Of course, the thought also expressly encompasses the idea that if you can create consciousness then you can create replacements for people. Imagine you own a factory. Instead of employing 500 real people you employ 500 robots with consciousness. Why wouldn't you do that? At this point you may reply with views about what consciousness is. You might say, for example, that consciousness implies awareness of your surroundings, which implies having opinions about those surroundings. That implies feelings and the formation of attitudes and opinions about things. Maybe the robots don't like working at the factory, just as it's very likely some of the people don't. Maybe, to come at it from another angle, we should in that case regard robots with consciousness as beings with rights. If we could establish that robots, or other creatures, did have a form of consciousness, would that not mean we should give them rights? And what would it mean for human beings if we could deliberately create "better people"?
At this point it becomes critical what we think consciousness actually is. It was suggested to me that, in human beings, electrochemical actions in the brain can "explain" the processing of sense data (which consciousness surely involves). Personally I wonder if this does "explain" it, as opposed to merely describing it as a process within a brain. One way some scientists discuss the mind or consciousness is to reduce it to the activities of the brain. So conscious thoughts become brain states, etc. This is not entirely convincing. It is thought that the mind is related to the brain, but no one knows how, even though some are happy to say that they regard minds as physical attributes like reproduction or breathing. That is, they would say minds are functions of brains. Others, however, aren't so sure about that. However a mind comes to be, it seems quite safe to say that consciousness is, among other things, a machine for generating data. That is, to be conscious is to have awareness of the world around you and to start thinking about it and coming to conclusions or working hypotheses about things. Ironically, this is often "unconsciously" done!
So consciousness, as far as we know, requires a brain. I would ask anyone who doesn't agree with this to point to a consciousness that exists where there isn't a brain in evidence. But consciousness cannot be reduced to things like data or energy. In this respect I think the recent film Chappie, which I mentioned in previous blogs, gets things wrong. I don't understand how a consciousness could be "recorded" or saved to a hard disk. It doesn't seem very convincing to me, though I understand perfectly how it makes a good fictional story. I think that on this point thinkers get seduced by the power of the computer metaphor. For me consciousness is more than either energy or data: a brain is not simply hardware, nor is consciousness simply (or even) software. If you captured the electrochemical energy in the brain, or had a way to capture all the data your mind possesses, you wouldn't, I think, have captured a consciousness. And this is a question that the scientist Christof Koch poses when he asks if consciousness is something fundamental in itself or is rather simply an emergent property of systems that are suitably complex. In other words, he asks whether machine networks could BECOME conscious simply by becoming complex enough. Or would we need to add some X to make it so? Is consciousness an emergent property of something suitably complex, or a fundamental X that comes from we don't know where?
This complexity about the nature of consciousness is, of course, a major barrier to the very idea of robot consciousness, and it is an open question when our experiments with robotics and AI might reach the level of consciousness at all. For one thing seems sure: if we decided that robots or other animals did have an awareness of the world around them, even of their own existence or, as Christof Koch always seems to describe consciousness, "what it feels like to be me" (or, I add, even an awareness of yourself as a subject), then that makes all the difference in the world. We regard a person, a dog, a whale or even an insect as different to a table, a chair, a computer or a smartphone because they are ALIVE, and being alive, we think, makes a difference. Consciousness plays a role in this "being aliveness". It changes the way we think about things.
Consciousness, if you reflect on it for even a moment, is a very strange thing. This morning when I woke up I was having a dream. It was a strange dream. But, I ask myself, what was my state of consciousness at the time? Was I aware that I was alive? That I was a human being? That I was me? I don't think I can say that I was. What about in deep sleep, where scientists tell us that brain activity slows right down? Who, in deep sleep, has consciousness of anything? So consciousness, it seems, is not simply on or off. We can have different states of consciousness and change from one to the other and, here's another important point, not always by overt decision. Basically this just makes me wonder a lot, and I ask why I have this awareness and where it comes from. Perhaps the robots of the future will have the same issues to deal with. Consciousness grows and changes and is fitted to a form of life. Our experience of the world is different even from person to person, let alone from species to species. We do not see the world as a dog does. A conscious robot would not see the world as we, its makers, do either.
In closing, I want to remind people that this subject is not merely technological. There are other issues in play too. Clearly the step of creating such beings would be a major one on many fronts. For one thing, I would regard a conscious being as an individual with rights, and maybe others would too. At this point there seems to be some deep-seated human empathy in play. There is a scene in the film Chappie where the newly conscious robot (chronologically regarded as a child, since awareness of your surroundings is learned and not simply given) is left to fend for himself and is attacked. I, for one, winced and felt sympathy for the character in the film, even though it was a collection of metal and circuitry. And this makes me ask what humanity is and which beings are worthy of respect. What if a fly had some level of consciousness? (In a lecture I watched, Christof Koch speculated that bees might have some kind of consciousness and explained that it certainly couldn't be ruled out.) Clearly, we need to think thoroughly and deeply about what makes a person a person, and I think consciousness plays a large part in the answer. Besides the scientific and technical challenges of discovering more about consciousness and attempting to re-create it, there are equally tough moral and philosophical challenges to be faced as well.