Viva la Robot

My emotions are high, people. High like the high wind. High like the high tide.

I quit smoking six days ago, and although I still have the vague hungry itch under my skin, I’m pretty pleased that I’m no longer killing myself–at least not by smoking, at least not today. But I am surprised by the moodiness. I find myself erratic, weeping in the drugstore, compulsively cleaning the sink, laughing at nothing. I am a branching tree of nerve endings raw to every wind.

But who am I kidding? I have always been prone to brisk shifts of emotion. I have a feather trigger which is sprung by gentle pressure into joy or its hundreds of opposites, and even my attitude about this trait swings sharply from gratitude to disgust. But we who love words are so often mercurial. Why is this? Could the literary impulse be a pressure to express with accuracy and precision these shades of mood (express, as you would a wound or a fruit–squeeze out the excess)? Do we have a feeling that if we could explain them just right, utter the spell with just the right intonation, the emotions would disappear?

Who the hell knows. I want a cigarette. Is it time to go to bed yet?
I’ve thought those three small thoughts in succession about 20 times today. But in between, I re-stumbled upon Vona Groarke’s poem “Desire” in KR Online:

“I would like
to feel indifferent
as a plinth or tabletop
of pure Carrara marble
that has all its darkness
corralled in veins
that hold themselves
instinctively intact.”

Oh, wouldn’t I.

Which brings us to…you guessed it…

My passion for robots (which term is said to have originated in Karel Čapek’s mind-numbing 1920 play, R.U.R.) is true and deep. There are so many things to love about them; or at least about the IDEA of them, because really, that’s what I’m talking about: an image, drawn from boxy, jerky prototypes in cartoons and B-movies. As a pop culture icon, this archetypal “robot” is, interestingly, several opposites at once: mythic and futuristic, nostalgic and unsettlingly prescient.

Now I’ve done a lot of research on the real thing, and I know scientists are doing frightening and awesome work to endow real-life androids (some developers call them “humanoids”!) with human emotion and physicality. Yes, yes, great, wonderful, the world is coming to an end, androids will take over the universe, etc. Whatever. This unwelcome intrusion of reality is missing the point. The point of a robot is that, although it can think and behave like one of us, it cannot feel. Oh, lucky, lucky robot; how I envy you.


I have tried to express my fascination with this indifferent species in play form, but the very things that fascinate me about robots make their depiction a particular dramatic challenge. The backbone of Aristotelian and Stanislavskian “character” is the relationship between objective (desire), choice, and action.

I am no Aristotle acolyte, and Stanislavsky’s dogma gets on my nerves. But desire provides a pretty handy engine for a piece of drama. So what is the locus of a robot’s desire? Can it be said to have desire, or is this anathema to “robot”? In my estimation, it is anathema; to give a robot desire is to destroy its wonderful freedom. Let’s say it’s programmed to “want” certain things; doesn’t this really reflect the wants of its programmer, a “character” who may exist only offstage? What does this interruption of desire and this limit on the power of choice do to dramatic action? And what new engine or paradigm might carry these beings into the realm of dramatic storytelling?

And another unsettling query:

In writing for the stage, we are, after all, programming language and requiring specific utterances of live agents. True, the actor fills these lines with meaning for herself, and thus for the character, but an actor’s verbal agency is limited to a playwright’s choices; her words are utterly out of her control. This is law in the world of new plays: the primacy of the text. It is as if I embed in the character every word he knows, feed him his entire lexicon, and in doing so, shape his worldview. And the actor is bound by these restrictions, not to mention by the demands of the director; as in the case of Meyerhold, she may be subject to “[moving] like stylized puppets under the authoritarian controls of the director” as well.

I’m not sure how I feel about this role as programmer, or how apt the analogy is. There are definitely times when I wish I were a robot/character, incapable of choice, blank, “indifferent as a plinth.” I wish I couldn’t choose to smoke or not to smoke. I wish I had no desire to satisfy the hungry itch. I wish it were bedtime. I wish not to wish, not to feel with such vigor. But somehow with “feel” comes “speak” and “write”. This is the way it is.