Source:
Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York: Basic Books.
Excerpts:
Part One: “The Robotic Moment: In Solitude, New Intimacies”
Overview & Synthesis:
Part One discusses the “robotic moment” and sociable robots, showing how they create new, unsettling relationships and instabilities in how we understand privacy, community, intimacy, and solitude. Interactions with robots such as ELIZA, Tamagotchis, Furbies, AIBO, My Real Baby, Cog, Kismet, and Paro illustrate the bonds, deep emotional connections, and feelings of pseudo-parental attachment these machines can elicit. Turkle makes the troubling observation that robots are filling gaps in society and that we hope to use them as a solution to our own imperfections, an easy substitute for the difficulty of dealing with other people.
Turkle (2011) describes technology as the “architect of our intimacies,” seductive because what it offers meets our human vulnerabilities (p. 1). We are drawn to the illusion of companionship, with all of its convenience and none of the demands of intimacy. This, however, carries the risk of emotional dislocation. In her examination of the relationship between humans and robots, Turkle suggests we do not mind giving human qualities to inanimate objects and are content to treat each other as things.
In chapter one, “Nearest Neighbors,” she uses the phrase “alive enough” to describe a relational readiness toward sociable robots (Turkle, 2011, pp. 28-29). She argues that robots are evocative objects: they prompt people to think about them and, in turn, about themselves. Chapter two, “Alive Enough,” continues this conversation about how we have reached the point of seeing digital objects as both creatures and machines. Computers and robots no longer ask us to think with them; they ask us to feel for and with them as they become more sociable, affective, and relational (Turkle, 2011, p. 39). She argues that robots are a slippery slope because they take advantage of our instinct and capacity to project human traits onto inanimate objects. For example, children readily empathize with the “needs” and “feelings” of the Tamagotchi and Furby, which in reality have no intelligence but can fake attachment. This interactivity prompts our minds to project consciousness, and we compensate by filling in the gaps: a machine need only act clever and people will play along. She also points out that this simulation is enough to provoke empathetic urges, as shown by the exercise in which subjects are asked to hold a Furby, a Barbie doll, and a live gerbil upside down.
[Image: AIBO, the robotic dog]
[Image: Paro, the therapeutic seal]
Chapter seven, “Communion,” describes an encounter between Rich and Kismet, an example of how robots can offer a fantasy of near communion in which we imagine a meeting of the minds. Robots offer what meets our human vulnerabilities. Turkle (2011) points out that we can interact with robots in full knowledge of their limitations and still be comforted by what must be an unrequited love (p. 133).
Turkle’s consideration of the robot for real opens up daunting philosophical and psychological conversations. As society gradually grows more “machine ready,” it seems less shocking to put robots in places where people used to be (Turkle, 2011, p. 146). This prevalence of robots may make us unwilling to put in the work required by real human relationships, and that carries real risks. Part One describes how the boundaries between people and things are shifting, and it leaves the reader asking which of these boundaries are worth maintaining.
Questions & Reflections:
Is a machine “alive enough” to die? Or does it just break, stop working, and need to be fixed? Do we mourn the loss of this artificial life? Think about things that may be important to you, like your phone or computer. How would you feel if it “died”? Is it acceptable in our society to “mourn” technology?
Do we attach ourselves to particular machines, the way children attach to their Furbies and Tamagotchis, where even an identical one cannot replace the original? Think about how you have personalized your own technologies, like your laptop or cell phone. If you lost or broke one of them, could it ever be the same again? Would you feel differently toward a replacement?
Is using robots as “stand-ins,” for example in the care of children or the elderly, morally wrong? Does this substitution pose ethical problems? On one hand it frees people to devote their time elsewhere, but at the same time does it deprive these people of human touch in their lives? What is the balance, and which matters more in our society?