NEW YORK (NYT NEWS SERVICE).- In a messy warehouse, a dancer's silhouetted spine and legs begin to undulate. Illuminated only by a spotlight attached to a machine, she begins to move, tracing shapes with her arms.
But then something odd happens: The spotlight starts to move of its own accord. Responding to the dancer, it finds its own rhythm and sways side to side. It soon becomes unclear who is leading whom. What is clear is that a dance is being made, and that one partner stands 9 feet tall and weighs more than 500 pounds. Called the ABB IRB 6700, it is one of the largest industrial robots in the world.
Dancer and choreographer Catie Cuan is the human star of this show, "Output," which is part of a project with the Pratt Institute. And while dancing with robots may sound a bit like science fiction, to Cuan, who is completing her doctorate in mechanical engineering at Stanford University, it feels like "an extension of my body and of possibility."
The working assumption for most of history has been that dance is a thing done by and for humans. Yet it does not seem beyond the pale that robots will one day perform for us mortals, particularly when choreographers like Cuan are using technology to explore the outer limits of the art form.
One of Cuan's projects is translating basic jazz and ballet vocabulary into robot joint angles and creating what she called a ballet for swarms of robots, mapped onto robot morphology in a way that leverages their innate nature.
That innate nature has to do with their distinctive movement qualities: the precise torques of their joints, and the fact that they have no muscles to contract or relax, which totally changes the perception of weight placement and bodily distribution.
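Translating a codified pose into something a robot can execute essentially means expressing it as a set of joint-angle targets. Purely as an illustration (this is not Cuan's software, and the pose names, angle values and helper functions below are invented), a minimal sketch of the idea might look like this:

```python
# Illustrative only: a hypothetical mapping from ballet arm positions to
# joint-angle targets for a six-axis industrial arm. All values are invented.
import math

BALLET_ARM_POSES = {
    # pose name -> six joint angles in degrees (base to wrist)
    "first_position":  [0, -30, 45, 0, 60, 0],
    "second_position": [45, -20, 30, 0, 45, 0],
    "fifth_position":  [0, -60, 90, 0, 20, 0],
}

def pose_to_radians(pose_name):
    """Look up a named pose and convert its joint targets to radians."""
    degrees = BALLET_ARM_POSES[pose_name]
    return [math.radians(d) for d in degrees]

def phrase_to_trajectory(pose_names):
    """Turn a sequence of pose names into a list of joint-space targets
    that a robot controller could interpolate between."""
    return [pose_to_radians(name) for name in pose_names]

# A short "phrase": three poses strung together.
trajectory = phrase_to_trajectory(
    ["first_position", "second_position", "fifth_position"]
)
print(trajectory)
```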
Meshing an art form so tied to the body with machines may seem like a paradox. But, Cuan said, AI is a choreographic tool that can disrupt the habitual dance-making process.
At the forefront of this growing field is Sydney Skybetter, a former dancer and a professor of what he calls choreographics at Brown University, where his students approach dance in a way that is heavily computational, like using machine learning to create ghostly digital avatars that dance along with live performers.
Skybetter and Cuan join a line of working artists who have experimented with technology to break new ground in dance. The pioneer was choreographer Merce Cunningham, who, working with electronic artist Thecla Schiphorst, used a software program called LifeForms that could sketch movement.
Trackers (1991) was Cunningham's first dance made with LifeForms, and roughly a third of its movement was created on the computer. Using the software "opened up possibilities of working with time and space that I had never thought of before," he said at the time.
By the end of the 20th century, motion capture, wearable tech and virtual reality had arrived on the scene. Then came artificial intelligence. One of the first major artists to work with it was choreographer Trisha Brown, who in 2005 employed a program that responded to her dancers' movements by drawing graphics that were then projected onstage.
In the last five years, Google Arts & Culture has been collaborating with dance artists, including the Bill T. Jones/Arnie Zane Company and the Martha Graham Dance Company. Last year, Google released the Living Archive, an interactive atlas of a half-million movements drawn from choreographer Wayne McGregor's repertory. The archive allows a user to choose poses and construct a dance phrase, or to dance in front of the camera and let the computer find the closest visual match, which then becomes a building block in a new sequence.
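That "closest visual match" step is, at heart, a nearest-neighbor search over pose data. As a rough illustration of the idea only (not Google's implementation), each archived pose could be represented as a vector of body-keypoint coordinates, and the archive pose with the smallest distance to the camera-captured pose wins:

```python
# Illustrative nearest-pose lookup; not the Living Archive's actual code.
# Each pose is a flat vector of (x, y) body-keypoint coordinates.
import numpy as np

def closest_pose(query_pose, archive_poses):
    """Return the index of the archived pose nearest to the query pose.

    query_pose: array of shape (K*2,), keypoint coordinates from the camera.
    archive_poses: array of shape (N, K*2), one row per archived pose.
    """
    distances = np.linalg.norm(archive_poses - query_pose, axis=1)
    return int(np.argmin(distances))

# Toy data: 3 archived poses with 4 keypoints each (8 coordinates).
archive = np.random.rand(3, 8)
captured = np.random.rand(8)
print("Nearest archived pose:", closest_pose(captured, archive))
```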
Damien Henry, project lead at Google, also developed a more advanced machine for McGregor to use. That machine was fed a diet of the Living Archive and 100 hours of footage of McGregor dances. In rehearsal, the algorithm could capture dancers' movements via webcam and then immediately render onscreen a selection of 30 original, McGregor-esque sequences. McGregor and his dancers could choose to adopt or develop this output.
At times, the algorithm produced suggestions that the dancer wouldn't want to do, Henry said. "But then Wayne realized it was extremely useful. It forced a dancer to explore unnatural territory."
In July 2019, Company Wayne McGregor premiered Living Archive: An AI Performance Experiment, a 30-minute work developed in conjunction with the program, at the Music Center in Los Angeles.
"We all have biases," McGregor said in an interview, "and ways in which we frame the world. AI affords more self-knowledge. It helps you play your instrument differently."
Some dance artists are thinking about AI beyond its utility as a technical tool and predictor of movement. Pontus Lidberg, the artistic director of the Danish Dance Theater, set out to use AI as a more integral part of his choreography, in rehearsal and performance. The goal: to create a dance that articulated the tension between man and machine by putting the two together onstage.
In 2019, Lidberg began working with computer artist Cecilie Waagner Falkenstrom. "We didn't want to create something proving that an algorithm can find patterns," she said. "That is boring. We wanted to create something that touches us as human beings."
To achieve this, the AI (affectionately called David) was fed information from myriad sources ranging from planetary movements to the structure and semiotics of Greek tragedy.
"Because the AI was trained on more than just my movement vocabulary," Lidberg said, "it learned a lot, deconstructed this knowledge and then built it up again with the dancers; this created something entirely new." Each performance of the dance, called Centaur (2020), is a distinct, unpredictable event, a neat allegory for our relationship with technology. (The production is currently touring Europe.)
"This type of work opens an intense conversation about where the choreography is, and by whom," Skybetter said. With machines, it becomes difficult to point to any singular choreography by one person or system.
As more choreographers deconstruct and redefine their craft with the help of AI, they are often faced with the question: At what point does human creation end and the machine take over?
Arguments against AI making art are as old as AI itself: It is ethically abhorrent, it cheapens art, it accelerates the redundancy of humanity; the list goes on. Sean D. Kelly, a professor of philosophy at Harvard University and a co-author of the book "All Things Shining," wrote in a 2019 article for the MIT Technology Review about what worried him: "We will come to treat artificially intelligent machines as so vastly superior to us that we will naturally attribute creativity to them. Should that happen, it will not be because machines have outstripped us. It will be because we will have denigrated ourselves."
But for those working with dance and AI, this view seems fatalistic. "What is often not clear," Lidberg said, "is that an AI with a consciousness doesn't exist. That is science fiction." However you choose to intellectualize it, the human body and mind are still central to dance.
As the dance world continues to navigate the pandemic, McGregor stresses the importance of finding ways for audiences to engage viscerally with work, not just cerebrally. He suggests that haptic technologies, like virtual reality headsets and other user-engagement tools primarily used in gaming, could be the way to experience what he calls a "chemical engagement" with dance.
Some are already coming up with ideas. Kate Sicchio, a choreographer and an assistant professor of dance and media technologies at Virginia Commonwealth University, is developing a visual dance score created by a machine. As part of her research, she stages live coding jams, in which dancers respond to notations projected onscreen. There are obvious applications to our pandemic context: one of these jams could be done remotely ("I have done it once," she said), but nothing beats being in the room.
Skybetter is similarly adamant that none of these technologies can really exist without a human hand. A human is needed at every stage: to write the code, feed it information and design the algorithm to reach creative goals. Science is not at a point where choreographic software can generate its own material. It is hard enough to translate dance to other humans, let alone to a computer.
"Art needs the fallibility of the human mind to recognize where the unexpected is exciting," Waagner Falkenstrom said. Lidberg agrees: "AI can probably replace everybody. But a true artist, one with embodied knowledge, interested in posing questions and making aimless research? No, that cannot be replaced."
For the skeptics, it might be reassuring that there remains an instinctive bias to retain human ownership over art. Machines or virtual reality headsets will struggle to replicate the tension between artist and audience, just as a robot will struggle to feel or mimic the training held in the bodies of dancers.
Cuan said she believes that learning to choreograph and move alongside intelligent robots will fundamentally change our understanding of human locomotion, and therefore of dance, particularly because one day, she said, robots will be ubiquitous, moving with us.
It is unclear if AI will carve out a significant space in the dance canon. But at this nascent stage, it is taking dance to fascinating, sometimes uncomfortable, places. Where this relationship goes next is largely up to science and artistic appetite.
McGregor, for one, is unequivocal about AI's potential. "Creatives think that their process is a mysterious thing that happens to you and it cannot be formalized," he said. "But the more we understand about how we make our choices, the more we can make different choices."
And he's sure about one thing: Nothing can replace the human heart.
© 2020 The New York Times Company