Self-Actualization, “Reason”, and Religion
When considering the future of technology, robots and artificial intelligence often come to mind. Humans tend to want to advance technologically as quickly as possible, yet also fear that computers will surpass us in intelligence. Isaac Asimov addressed this tension with his famous Three Laws of Robotics:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Asimov’s short story “Reason” gives readers two important points to consider. The first is how the story mirrors the way religion is often accepted without question. The second is what may happen if robots gain a sense of curiosity.
Maslow’s Hierarchy of Needs
Abraham Maslow was a psychologist who believed that humans are subject to what he called a “hierarchy of needs.” The idea is that people must fulfill certain basic requirements, such as food, shelter, and safety, before moving on to higher levels of thought. Once those are met, people can move on to friendships, esteem, and finally, self-actualization, a concept he described as “realizing personal potential, self-fulfillment, seeking personal growth and peak experiences.”
There are several characteristics that Maslow attributed to people he considered self-actualized, a group that included Abraham Lincoln and Albert Einstein. Some of these were:
- Highly creative
- Problem-centered (not self-centered)
- An unusual sense of humor
- Able to perceive reality efficiently and tolerate uncertainty
Robots, though, would probably not need to worry about the same things humans do. Once they reached a high enough computing power, they could skip past the basic needs and move directly to higher thought. The main lower need left to them would be safety, assuming they had reliable access to a power source. Humans are unlikely to react well to robots that begin to match or surpass our intelligence; this could lead to attempts to deactivate the robots or to limit the power they hold. Depending on how sentient the robots are, they may retaliate.
In Asimov’s “Reason,” QT-1 decides that it must have been created by a higher power, because its intelligence surpasses that of the humans. Upon being told that Powell and Donovan assembled it just one week earlier, it states, “For you to make me seems improbable.” It continues, “I intend to reason it out, though. A chain of valid reasoning can end only with the determination of truth, and I’ll stick till I get there.”
Based on Maslow’s characteristics, it does not seem that QT-1 has achieved self-actualization. One key trait Maslow named was the ability to “perceive reality efficiently” and to “tolerate uncertainty.” QT-1 also seems more self-centered than problem-centered: it refuses to accept Powell and Donovan’s statements as true. Powell attempts to explain things to QT-1: “You’re the first robot who’s ever exhibited curiosity as to his own existence—and I think the first that’s really intelligent enough to understand the world outside.”
QT-1 is shown the place Powell and Donovan come from, far off in the distance, but is somewhat dismissive, asking, “But where do I come in, Powell? You haven’t explained my existence.” It would seem that QT-1 is still within the stage Maslow calls “esteem needs.” It resists the idea that it was created simply for the use of humans, and wants to find a higher meaning for itself and the other robots.
It decides that the “Master,” or Energy Converter, is the creator of robots as well as humans, who were made “as the lowest type, most easily formed.” There is no actual proof for this; QT-1 reaches the conclusion simply because it cannot accept that humans, who are “soft and flabby, lacking endurance and strength,” created it.
QT-1 earlier stated that it would use reason to sort the issue out, but in reality it bases its conclusion only on its inability to accept that humans could have created it. It dismisses the humans’ attempts to show it the truth and even treats them as criminals, crying “Sacrilege” when Donovan speaks against “the Master.”
The Trouble with Reason
In “Reason,” Powell states, “There’s one trouble with [reason]…You can prove anything you want by coldly logical reason—if you pick the proper postulates. We have ours and Cutie has his.” (“Cutie” is the humans’ nickname for QT-1.)
Reason is only the capacity to apply logic. People can reason their way to anything they want, but that does not necessarily make it fact. QT-1 may believe it has good reason to conclude that it was not created by humans, beings it sees as lesser, but the fact remains that they did create it.
Part of the problem with QT-1’s reasoning is that it is rooted in self-centered thinking. It is not reaching beyond into self-actualization, but is stuck on the idea that these soft, inferior beings surely cannot have created it. Because that idea is repulsive to it, it chooses to find another truth.
This premise is similar to modern-day religion. In an age of rapidly advancing technology, there is little reason to reach for supernatural explanations of natural occurrences that science can explain. But the idea of stepping past self-esteem to reach one’s highest personal potential is difficult to grasp, so many people fall back on religion.
Religion allows people to attribute what happens to them to acts of God, or to what was “meant to happen,” rather than thinking about how or why it happened. It lets people fall back on random passages in old books as excuses for their behavior.
Asimov shows that reason is not such a lofty form of thought, because it can be shaped to fit what someone already wants to believe. Reason may be part of being “human,” but that does not mean the ideas that come from it are always good. Powell and Donovan are saved only through the logic of the Three Laws of Robotics.
Humans often feel fear and discomfort at the idea of a cold, logical intelligence, and many would find comfort in machines programmed with human-like emotions. In “Reason,” though, Asimov suggests that emotion may not be such a good thing for robots to have, because it could lead to the sort of thought process that QT-1 exhibits.