In Sterling's “Deep Eddy”, there is a type of catastrophe called a “wende”. It’s easy for this vague description of an unmentionable event to stop a reader in their tracks. The occurrence, or gathering, is referred to only through vague context clues, much like some of the banter in the short story. The wende could be described as a natural disaster of sorts: a hurricane, a tragedy, or some horrible, inevitable affliction of human nature. Whether this affliction is death or some horrible freak of nature that brings a group of individuals together as a result, it is certainly some happening that does the inexplicable, occurring…but yet stringing people together.
It is no surprise the Wende seems to include people dying. When reading about a Wende, it’s easy to contemplate other ways that, in the modern world, people come together for better or for worse in a sort of established gathering. The Olympics, which bring together different cultures, sports, and the like, can also attract such hate. The Centennial Olympic Park bombing is an example of a Wende because it fits the same criteria and pattern of occurrence as the descriptions below. While reading, it may be easy for the reader to point out that the worst outcome could be a “cyberpunk genocide”. The easiest way to kill a lot of people? Put them all together in one place.
A Wende, according to Sterling’s piece, contains the following (149):
- “Wende people”… A group of like-minded thinkers.
- “weird things happen”… Rape and pillage?
- “Capclug studies them”… Various technological observation.
- “you don’t throw it”… “it throws you”… The Wende is telepathic in some way?
- “word gets around…they start”
- “when it happens to you, it changes everything.” By this statement, one might think of it as a type of festival.
This leads the reader to believe a Wende could be a cacophony of different elements that signify a near disaster, or the end being near. That is simply because when people hear of a wende they flock to it, much like an anticipated concert. A wende is a sort of purging technological phenomenon, a soon-to-be-born apocalypse. Some others who have reviewed the book consider the wende to be a sort of catchphrase, or a nondescript happenstance.
“What are they going to fight about?”
“Anything, everything.” (165)
This is a disastrous protest of sorts, or at least it would be in the current day and time. The kind of protest that results in destruction. The communication breakdown in the post-cyberpunk society, as discussed in Sterling’s story, warns the reader of what something as vague but imaginable as a wende could be. If technology weren’t present, our communication breakdowns would be even worse.
Let’s consider for a moment that a wende, beyond what is established in “Deep Eddy”, could be any type of horrifying circumstance involving an established group of people. In “Deep Eddy”, this is when a lot of people flee together because of data.
Who Watches the Watchers?
Staying Behind by Ken Liu is a haunting, prophetic tale set in the aftermath of an apocalypse that no one predicted. One answer frequently given about what separates man from machine is the concept of free will. But what if that choice is to remove oneself from being a biological human, transferring, hopefully, your sentience into a machine? Conversely, what about being transferred against your will?
Staying Behind sits at the crossroads of these questions and poses many more. Liu goes to great lengths to chronicle the decline of civilization and the technology that binds it together. Parallels are drawn repeatedly between the real world and the virtual world.
The most immediate question that came to mind was: who is taking care of the machines and networks where billions of uploaded residents live? Despite the details revealed in the tale, Liu makes no mention of the creators setting up a permanent, reliable maintenance system to keep the machines running.
If it is assumed that one was indeed put in place, what is to prevent those machines from evolving toward their own sentience? What if the new, dominant life forms then forget, or choose, to stop serving an ancient machine?
It isn’t a stretch to see this story as a potential precursor to Harlan Ellison's I Have No Mouth And I Must Scream. Would the machines have an ethical right to no longer be subservient? Who’d be around to make that judgement, and under what authority?
Asimov's famous Three Laws of Robotics would not apply, either. After all, if the machines could logically believe that a space station's generator is their god, then it is easy to imagine that sentient machines could come to the logical conclusion that the “Digital Adam” harboring the consciousnesses of the former inhabitants of the planet is neither human nor contains humans.
Which brings up the next question: what is human? Liu addresses this concept no less than four times in Staying Behind.
- The narrator ponders the following: “For every Uploaded man, there was a lifeless body left behind, the brain a bloody pulpy mess after the destructive scanning procedure. But what really happened to him, his essence, his — for lack of a better word — soul? Was he now an artificial intelligence? Or was he still somehow human, with silicon and graphene performing the functions of neurons? Was it merely a hardware upgrade for consciousness? Or has he become a mere algorithm, a clockwork imitation of free will?”
- [Narrator’s father]: “If you’re doing things the exact same way as your ancestors, then your way of life is dead, and you’ve become a fossil…”
- [Narrator’s mother]: “You have no sense of what’s really important in life, what’s worth holding onto. There’s more to being human than progress.”
- Speaking of his mother, the narrator says she taught him that “…our mortality makes us human.”
- And lastly, the narrator explains to his wife why he can’t stop trying to protect his daughter from Digital Adam: “‘I can’t give up,’ I tell Carol. ‘I’m human.’”
Each question and each statement seems to ask and answer the question succinctly. The only difference between a sentient biological human and a sentient machine would be the questions an organic human can ask that a machine cannot. Yet in truth, the exact same logic, the exact same questions and answers, can be posed by any sentience, not just an organic human.
Each of these aforementioned stories deals with machine sentience and human sentience. What’s clear is that “what is human” is the wrong question; “what is sentient” is the right one. At least, it has a better chance of being answered.
Self-Actualization, “Reason”, and Religion
When considering the future of technology, robots and artificial intelligence often come to mind. Humans tend to want to advance technologically as quickly as possible, but also harbor a fear of computers surpassing us in intelligence.
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
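The strict ordering of the Three Laws can be read as a priority-ordered rule check: a lower law never overrides a higher one. A minimal sketch of that idea, with entirely hypothetical names and action flags (this is illustrative, not anything from Asimov's text):

```python
# Illustrative sketch: the Three Laws as priority-ordered constraints.
# The `action` dict and its flags are hypothetical, chosen only to show
# how higher-priority laws veto lower ones.

def permitted(action):
    """Return True if an action violates none of the Laws, checked in order."""
    # First Law: absolute — no action may harm a human.
    if action.get("harms_human"):
        return False
    # Second Law: obey orders, unless obeying would conflict with the First Law.
    if action.get("disobeys_order") and not action.get("order_harms_human"):
        return False
    # Third Law: self-preservation, lowest priority of the three.
    if action.get("self_destructive") and not (
        action.get("protects_human") or action.get("obeys_order")
    ):
        return False
    return True

# An ordered action that would harm a human must be refused:
print(permitted({"harms_human": True, "obeys_order": True}))          # False
# Disobeying is allowed when the order itself would harm a human:
print(permitted({"disobeys_order": True, "order_harms_human": True}))  # True
```

The point of the ordering is exactly what saves Powell and Donovan in “Reason”: whatever a robot's own reasoning produces, the First Law check runs first.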
Asimov’s short story “Reason” gives readers two important points to think about. The first is how it is reminiscent of the way religion is often accepted without any thought. The second is what may happen if robots gain a sense of curiosity.
Maslow’s Hierarchy of Needs
Abraham Maslow was a psychologist who believed that humans were subject to what he called a “hierarchy of needs”. The idea was that people have to fulfill certain basic requirements before moving on to higher trains of thought. These include food, shelter, and safety. Once those have been fulfilled, people can move on to friendships, esteem, and finally, self-actualization. Self-actualization is a concept he described as “realizing personal potential, self-fulfillment, seeking personal growth and peak experiences.”
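The hierarchy described above is essentially an ordered list: a person's current focus is the lowest level not yet satisfied. A tiny sketch of that structure (the level names and function are my own shorthand, not Maslow's terminology verbatim):

```python
# Illustrative sketch: Maslow's hierarchy as an ordered sequence.
# An agent's current focus is the first level it has not yet satisfied.

HIERARCHY = ["physiological", "safety", "belonging", "esteem", "self-actualization"]

def current_focus(met_needs):
    """Return the first level in the hierarchy not present in `met_needs`."""
    for level in HIERARCHY:
        if level not in met_needs:
            return level
    return "self-actualization"  # every level met

# With food/shelter and safety secured, the focus shifts to belonging:
print(current_focus({"physiological", "safety"}))  # belonging
```

Framed this way, the later point about robots is simply that a machine with a stable power source enters the sequence much higher up than a human does.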
There are several characteristics that Maslow attributed to people who he considered self-actualized, which included Abraham Lincoln and Albert Einstein. Some of these were:
- Highly creative
- Problem-centered (not self-centered)
- Unusual sense of humor
- Perceive reality efficiently and can tolerate uncertainty
Robots, though, would probably not need to worry about the same things that humans do. Therefore, once they reached a high enough computing power, they might be able to skip past the basic needs to higher thought. The main lower need they might worry about would be safety, assuming they had sufficient access to a power source. Humans are unlikely to react well to robots if robots begin to match or surpass our intelligence.
This could lead to them trying to deactivate robots, or lessen the amount of power that they have. Depending on how sentient they are, the robots may retaliate.
In Asimov’s “Reason,” QT-1 decides that it must have been created by a higher power, because its intelligence surpasses that of the humans. Upon being told that Powell and Donovan put it together just one week earlier, it states, “For you to make me seems improbable.” It continues, “I intend to reason it out, though. A chain of valid reasoning can end only with the determination of truth, and I’ll stick till I get there.”
Based on Maslow’s described characteristics, it does not seem that QT-1 has achieved self-actualization. One key ability Maslow described was to “perceive reality efficiently” and to “tolerate uncertainty.” It also seems that QT-1 is more self-centered than problem-centered. It refuses to accept Powell and Donovan’s statements as true. Powell attempts to explain things to QT-1: “You’re the first robot who’s ever exhibited curiosity as to his own existence—and I think the first that’s really intelligent enough to understand the world outside.”
QT-1 is shown where Gregory and Donovan come from in the far distance, but is somewhat dismissive, asking “But where do I come in, Powell? You haven’t explained my existence.” It would seem that QT-1 is still within the stage that Maslow refers to as “esteem needs.” It has issues with the idea that it has been created simply for the use of humans, and wants to find higher meaning for itself and the other robots.
It decides that the “Master,” or Energy Converter, is the creator of robots as well as humans, who were created “as the lowest type, most easily formed.” There is no actual proof for this; the conclusion rests on the fact that QT-1 cannot accept that humans, who are “soft and flabby, lacking endurance and strength,” are the ones who created it.
It earlier stated that it would use reason to sort the issue out, but in reality it only bases its conclusions on its inability to accept the idea that humans could have created it. It dismisses their attempts to show it the truth and even treats the humans as criminals, crying “Sacrilege,” when Donovan speaks against “the Master.”
The Trouble with Reason
In “Reason,” Powell states, “There’s one trouble with [reason]…You can prove anything you want by coldly logical reason—if you pick the proper postulates. We have ours and Cutie has his.”
Reason is only the ability or capacity to apply logic. People can reason anything they want, but that doesn’t necessarily make it fact. QT-1 may have good reason to believe that it is not created by humans, a lesser being, but the fact is that they did create it.
Part of the problem with QT-1’s reasoning is that it is based on self-centered thoughts. It is not reaching beyond into self-actualization, but rather stuck in the idea that surely, these fleshy inferior beings cannot have created it. The idea of that being true is repulsive to it, so it chooses to find another truth.
This premise is similar to modern-day religion. In an age of rapidly advancing, high-powered technology, there is no reason to find supernatural excuses for natural occurrences that can be explained through science. But the idea of having to step past self-esteem and reach one’s highest personal potential or growth is difficult for people to grasp, so they fall back into religion.
Religion allows people to attribute things that happen to them as acts of God, or what was meant to happen, rather than thinking about how or why it happened. It allows people to fall back onto random passages in old books as excuses for their behavior.
Asimov shows that reason is not such a high train of thought, because it can be shaped by what someone wants to think. Reason may be a part of being “human,” but that doesn’t mean the ideas that come from it are always good. Powell and Donovan are saved only through the logic of the Three Laws of Robotics.
Humans often experience feelings of fear and discomfort at the idea of a cold, logical intelligence. Many would feel a sense of comfort if machines were programmed with human-like emotions. In “Reason,” though, Asimov shows that emotion may not be such a good thing for robots to have, because it could lead to the sort of thought process that QT-1 exhibits.
dataSTICKIES are the next generation of data portability. They are graphene-based flash drives that replace USB pen drives and hard discs.
USB-based drives can be inconvenient to use as the positioning and insertion of the drive in the USB slot needs to be done precisely. When the slots are at the rear of a device, as is the case for many desktop computers, this task becomes even more troublesome.
dataSTICKIES solve this problem by carrying data like a stack of sticky-back notes. Each of the dataSTICKIES can be simply peeled from the stack and stuck anywhere on the optical data transfer surface (ODTS), which is a panel that can be attached to the front surface of devices like computer screens, televisions, music systems, and so on. The special conductive adhesive that sticks the dataSTICKIES to the ODTS is the medium that transfers the data. This special low-tack, pressure-sensitive adhesive is capable of being reused without leaving marks like a repositionable note. When the dataSTICKIES are being read by the device, their edges light up.
Quantic Dream’s “Kara”, a PS3 Tech Demo that explores the boundaries of artificial intelligence and consciousness.
What is Life?
In the chapter “An Open Universe” of Kevin Kelly’s book “Out of Control”, Kelly poses the ambitious question “What is life?” while differentiating the real from the artificial, and attempts to give the term a meaning applicable to the modern technological era.
At the beginning of the chapter, Kelly delves immediately into setting the science of artificial life apart from the science of biology. Kelly states:
"Biology seeks to understand the living by taking it apart and reducing it to pieces. Artificial life, on the other hand, has nothing to dissect, so it can only make progress by putting the living together and assembling it from pieces."
Because the concept of artificial life is relatively new, it lacks the complex and intricate pieces that work together to produce what humanity knows as natural or organic life. Thus, artificial life must take the information already in existence and configure it in new ways to create a semblance of new life.
Kelly goes on to explain that artificial life can take many forms and introduces the term ‘hyperlife’.
For instance, institutions such as schools, businesses, and organizations bustle with the life of many working individuals and programs. For this, they have been metaphorically deemed ‘alive’. According to Kelly, it is correct to assume that such institutions are living and take on the form of an organism.
Kelly calls this broad scope of individuals coming together on a large scale ‘hyperlife’.
"Hyperlife is a particular type of vivisystem endowed with integrity, robustness, and cohesiveness - a strong vivisystem rather than a lax one"
Such systems could include, but are not limited to, a city, a virus, a rainforest, an artificial intelligence, or a network and its operator. Much as organs work together to make an organism, individuals work together as a sub-species of hyperlife.
Despite the similarities shared between organic life and artificial life, there are still distinct characteristics that set them apart and form a necessary differentiation.
As humans continue to expand hyperlife, it will begin to gather and construct more data and eventually grow into other systems. Gradually, this process of expansion will intensify, and the processes of hyperlife organisms will develop to match (and even surpass) human capabilities.
The possibilities presented by artificial life technology are outstanding and almost unfathomable. The more hyperlife expands, the more opportunities present themselves for gathering and organizing information on subjects once unknown to humanity. And the more lifelike hyperlife becomes, the more humans will be able to study processes similar to those of the human brain (Von Neumann architecture).
"What could be more human than to give life?" Kelly asks. "I think I know: to give life and freedom. To give open-ended life.”
Deep Eddy and The Cultural Critic
In Bruce Sterling’s “Deep Eddy”, a short story about a young American man in Germany on a dangerous mission to deliver information, a technology called “spex” is the driving force behind the spread of information and the cause of tumultuous gatherings of people called “Wendes”.
Points presented in this story:
- Knowledge construction and preservation
- The underlying concept of culture
- The ethics of sharing illegal data
Google Glass, a technology similar to Spex in “Deep Eddy”
Spex, a type of augmented reality technology that functions as eyewear with a computer display, makes it possible for the wearer to be able to access information instantly while providing details about a person, event, or location in real-time.
One of the immediate effects of this technology is flash-mob-esque events known as Wendes. In a Wende, large numbers of people gather in one location, usually with the intention of celebration. However, Wendes also have the potential to grow into protests and rioting, as happens in Düsseldorf, Germany.
Eddy, a 22-year-old from Chattanooga, Tennessee, is one of the many tourists in transit to Düsseldorf at the request of his local spexware users’ civil liberties group, tasked with smuggling a data disk to a man called The Cultural Critic.
In “Deep Eddy”, The Cultural Critic is a scholar whose goal in the story is to gather information, regardless of whether or not it is legal, in order to build information structures as a means of defining the underlying concept of culture. His rival, The Moral Referee, is a philosopher with a large following who intends to eliminate such illegal or criminal information, thereby rendering it inaccessible.
Towards the end of the story, after Eddy has delivered the data disk, The Moral Referee succeeds in storming the safe house in which The Cultural Critic has taken refuge. However, The Cultural Critic reveals that most of the illegal data in the safe house is banal and that it was necessary to stage a resistance. The Cultural Critic states that the purpose of this resistance is that contradictory attitudes serve to fertilize society.
Despite having opposing viewpoints, both The Cultural Critic and The Moral Referee are similar in that they are artists and philosophers who are both fighting to shape the future of knowledge construction.
According to The Cultural Critic, all information, be it legal or illegal, intellectual or criminal, serves to build a greater sense of humanity’s self-knowing. While this information should be preserved, opposing ideologies should also exist in order to progress in the construction of knowledge. The society portrayed in “Deep Eddy” lacks an understanding of the difference between what has a cultural impact and what is merely ornamentation. Technology such as spex is not only capable of preserving and spreading this information, but is also responsible for bringing clashing ideals together.