First of all, full disclosure: the Wymore of Smith/Wymore Disappearing Acts is kind of my boss—Lisa Wymore, Chair of the Department of Theater, Dance, & Performance Studies at UC Berkeley, where I work as a Lecturer. The Smith is Sheldon Smith, Wymore’s partner in life and art. Six Degrees of Freedom is the duo’s latest choreographic adventure, which premieres at the end of this month at ODC Theater, the culmination of their ODC Artist Residency.
Two years in the making, Six Degrees is driven by an elaborate conceit: If an intelligent computer awoke from a dream compelled to make a work of contemporary dance theater, what sort of dance theater work would it make? For those of us who have known Smith and Wymore for their dance theater experiments with digital technologies, it comes as a surprise to discover that they do not engage the services of an actual computer for Six Degrees. Unlike earlier works that have been “directed” by computer algorithms (imagine the algorithm as the dice in a chance procedure), Six Degrees is, rather, a thought experiment.
Unsurprisingly, as a married couple that is also in artistic partnership, Lisa and Sheldon finish each other’s sentences. There’s a rhythm to their conversation—Sheldon’s slow, stretchy speech punctuated by Lisa’s staccato interjections—that I found mesmerizing. But what I’d like you to imagine while reading this interview are the little breathy laughs and delighted pauses that threaded through our conversation because, for Lisa and Sheldon, Six Degrees remains a mystery. Interviewing them was like watching a pair of lighthearted but mildly dystopian elves break apart the toys they had made, scattering their insides like runes to read for guidance on how to understand the impact of technology on our lives.
Sima Belmar: Talk about Six Degrees of Freedom in relation to how you’ve used technology in the past.
Sheldon Smith: In the past we’ve used a number of random number generating systems, although very finely tuned and attuned to the needs of the work…
Lisa Wymore: …telling us how long the sections are, when to come in and out…
Sheldon: …it doesn’t work very well to deal with pure chance. If you’re purely dealing with chance there might be a version of the piece where the computer interacts with us once and we’re just milling around for forty minutes. So there are certainly artistic choices that we make along the way to Goldilocks the thing.
Lisa: We’re always playing with control, command, and agency.
Sheldon: This piece was an evolution through an ongoing interest in the relationship between technology and dance, technology and the human body, technology and our lives. We wanted to think more about the computer’s creativity. In other pieces, the computer is directing us, telling us what to do…
Lisa: …but we know it really doesn’t have the creativity…it’s almost like it’s programming us…
Sheldon: …we programmed it to program us. But in this piece we’re imagining that the computer really has its own creative agency. If you fed enough fragments and YouTube clips of dance theater works into neural nets, what would the output look like? To some degree you can do that now—people have explored this a lot with text…
Lisa: …the computer can take a whole book or script and pull it through its algorithm to create a new script. There’s this film…
Sheldon: …in a sense the computer is trying to figure out what a script is, what a relationship between people is, but it’s always getting it wrong.
Sima: So if you put War and Peace into the computer…
Sheldon: …you would put everything written by Tolstoy into it and then see if it could write something like Tolstoy from the patterns it has digested. The computer looks for patterns. In Six Degrees, we are trying to build something with the aesthetic of computer-created work. But dance theater is so complex to begin with…
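The pattern-finding Sheldon describes can be sketched in miniature. A Markov chain is a much simpler cousin of the neural nets he mentions, but it makes the same move: digest a corpus, record which words follow which, then generate new text from those statistics. This is a hypothetical illustration, not anything Smith/Wymore actually use:

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Record, for each run of `order` words, the words that follow it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, length=10, seed=0):
    """Walk the model: start from a random key, repeatedly pick a follower."""
    random.seed(seed)
    key = random.choice(list(model.keys()))
    out = list(key)
    for _ in range(length):
        followers = model.get(tuple(out[-len(key):]))
        if not followers:
            break  # dead end: this word run never continues in the corpus
        out.append(random.choice(followers))
    return " ".join(out)
```

Fed all of Tolstoy, a model like this produces sentences that are locally plausible and globally senseless — exactly the "always getting it wrong" quality the duo is after.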
Lisa: …dance theater is already so esoteric and strange that if you were to plug in a bunch of postmodern dance theater works, would it really look any different from the real ones?
Sheldon: The central question as we’re working is still, if we were a computer, what choices would we be making…
Lisa: …we’re pretending to be the sentient computer…
Sheldon: …we’re asking, if I fed everything I’ve ever done into a computer, how would I fuck that up and spit it out in a way that’s different from how I normally work? Where does it go wrong? And in looking at those places where it goes wrong, what do we learn about how close computers are to really understanding us at all?
Sima: Take me into a rehearsal, into the practice.
Lisa: We looked back at our older works and just went into the studio with these ideas based on past pieces. We knew we wanted to play with the randomization of text so we’ve been using what appears to be computer generated text but it’s not. It’s really odd and makes no sense to say, but we’re trying to make it [the text] make sense and then put movement to it—very grid-like movements that a computer might understand and command a body to do.
Sima: So it’s a combination of movement that you think a computer might understand and what a computer might do with a bunch of movement data.
Lisa: Yes, it’s playing with both sides of the coin.
Sima: But not playing much with the computer.
Sheldon: Not very much with the computer itself. We know the computer could look at the data and then spit out some variation on that, but the question is, what is the computer actually thinking aesthetically? Does it? Well, it doesn’t, but if the computer were having some sort of aesthetic awareness in what it’s doing, what is that, and at what point would it start to own its aesthetic choice-making, self-identify as an artist, and have its own signature style? So in the case of Six Degrees, the computer has not gotten there; it’s fumbling around.
Lisa: The piece is inhibiting our own organic choices of what we’d normally make in a piece. We’re stumbling ourselves, challenging ourselves, juxtaposing what we never thought should or could go together. We haven’t chosen to, say, put all of our work into it. Well, we don’t know how! We don’t have the computer programs to do that. We would need a giant grant. It’s a provocation.
Sheldon: It’s a provocation. One of the things that has become interesting to me in working on this is how it’s coming full circle to Dadaism and neo-futurism and all these things that happened 100 years ago when people were exploring structure and language, chance juxtapositions of things, and the context of that, post-WWI, fascism. With Trump, we have the desire to make even less sensical work than we might otherwise make.
Sima: There’s a threat to sense now that’s different from prior threats because it’s an explicit attack on the very notion of sense. Fake news, no truth. But in terms of political cycles, if artists have long played with language in order to break down habitual modes of sense-making and sense-perceiving, it seems to me that people interested in AI are interested in training computers, which currently make non-sense out of things that make sense, to make sense of whatever logic of the world you’re putting into them. But then you two come along and seem to be trying to make non-sense in relation to a project that would normally be trying to teach sense-making. Ok, now I’m lost.
Lisa: I think a lot of folks in technology think we’re getting close to getting computers to do a lot more for us, that the computer can be trained to understand our logic, to make ethical choices. But I think this piece is saying that we know it can’t really. Part of why we’ve lost sense, and I don’t know if we’ll get this across in the piece, is that the computers have been programmed to do all these things and we don’t understand what’s algorithmically coming at us.
Sima: Like we can’t tell the difference between a bot and a person?
Lisa: Right, and so the computer, for all that we’ve invested in it, is infiltrating our world and it will never be all that we are. So in a way this piece has a grimmer futuristic vision. We’re risking failing because we know it can never be right.
Sheldon: In the process of doing this we’re almost dehumanizing our artistic process…
Lisa: …or reprogramming it.
Sima: But if a human can align with computers, then that’s a human capacity, it’s not dehumanizing or less human necessarily. So I’m wondering about the language of the project.
Lisa: It’s more about the truth-seeking stuff, what gets perpetuated, what gets put in front of our screens as we search. Small things, like why do I always see bras on my Facebook page? I don’t want to see any more bras on my page, but it’s telling me, it’s changing how I look, how I search, what I can even get access to. It will put stuff first, it will block things. I can’t get down into things because of these small iterative patterns that are constantly eroding away at us. I know it’s changed my art practice, how I go into the studio…
Sima: …but it’s so productive to get out of habits.
Sheldon: It’s incredibly productive. The biggest impact of our overarching concept, in practice, is forcing us out of habitual ways of working and taking away certain levels of responsibility about the outcome.
Sima: Will the piece have anything to do with Kevin Bacon?
Sheldon: There may be some Kevin Bacon…
Lisa: The “six degrees” are all the different joint angles in robotics—flexion, extension, rotation, lateral side bending, etc.—and if you put them together you get all the movement. It’s a system used in programming that affects us as humans. It relates to the Laban system of how we see our bodies in space. And in a way it’s limiting, but at the same time it’s functional. It’s kind of the thing of the piece. Limitations are both creative and control your freedom.
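For readers curious about the robotics term: “six degrees of freedom” conventionally names the six independent numbers that fix a rigid body in space — three translations plus three rotations (Lisa’s flexion/extension/rotation framing maps onto the rotational half). A minimal sketch, purely illustrative and not drawn from the production, of a pose as six numbers composed into a single transform:

```python
import math

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from six degrees of freedom:
    three translations (x, y, z) and three rotations (roll, pitch, yaw),
    composed in Z-Y-X order, a common robotics convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation block is Rz(yaw) * Ry(pitch) * Rx(roll); last column is translation.
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]
```

Six numbers are enough to place a body anywhere, in any orientation — the “limiting but functional” constraint Lisa describes.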
Sima: Clearly you both feel that dance is a privileged site for exploring these questions and also, maybe, helping us make different choices about how we use and are used by our devices. Do you have hopes for the piece?
Lisa: Just to realize the power. Are we losing our freedom? Like our son gets some screen time every day. His sense of creativity is so different. Is he freer? I don’t know. In some way he can see all the movie references he wants to, go down some deep path, but in another sense he doesn’t get to wait to see the movie or read a book about the references or talk to a neighbor. What are we losing and what are we gaining?
Sima: I’ve never interviewed the two of you, so you may always be like this for all I know, but you do seem to be in a kind of mixed state of low level anxiety and delighted maybeness about this piece, which maybe you don’t always feel…
Sheldon: …that’s pretty accurate…
Sima: …which as an interviewer kind of makes me more interested in the work because, if they don’t know what they’re thinking, that takes the burden off the viewer to “get it.” It’s about doing, making and witnessing. And if you have a rigorous path, any kind, it’s very freeing. I feel like I’ve been put in a little bit of a freed up space just talking with you.
Sheldon: I would say in almost every case we don’t know what the piece is trying to say until we’ve actually got it complete, put it in front of people, and heard from people. To some degree what we’re telling you right now is a construct of several grant processes we’ve been through, which have been helpful in the sense that they’ve shaped our process, put us on this rigorous path to make this work, and created some ideas that we’re following. At the same time, between now and November, I still hold open the possibility that what we think this thing is could be completely different.
Sima: Well it’s been a pleasure dancing around with you in your thought experiment.