Video games and robots want to teach us a surprising lesson. We just have to listen
The entrance banner to the Never Alone exhibition at the Museum of Modern Art in New York. Image: Min Shin/ZDNET
Fast, colorful ghosts making their way through the maze greeted me as I gazed at the screen of a Pac-Man machine, part of the Never Alone: Video Games and Other Interactive Design exhibit at the Museum of Modern Art in New York.
Using a tiny amount of RAM and code, each ghost is programmed with its own specific behavior, and those behaviors combine to create the masterpiece, according to Paul Galloway, collections specialist for MoMA's Architecture and Design department.
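That design is well documented by fans and historians of the original arcade game: the ghosts share one simple movement loop, but each aims at a different target tile, and those four tiny rules add up to the game's personality. Below is a minimal sketch of that idea in Python; the coordinate conventions, function names, and corner tile are illustrative assumptions, not the arcade ROM's actual code.

```python
# Illustrative sketch of the widely documented Pac-Man "chase mode" targeting rules.
# Tile coordinates and helper names here are simplified assumptions for clarity.

def blinky_target(pacman, pacman_dir, blinky):
    # Blinky (red) chases Pac-Man's current tile directly.
    return pacman

def pinky_target(pacman, pacman_dir, blinky):
    # Pinky (pink) ambushes by aiming four tiles ahead of Pac-Man.
    px, py = pacman
    dx, dy = pacman_dir
    return (px + 4 * dx, py + 4 * dy)

def inky_target(pacman, pacman_dir, blinky):
    # Inky (cyan) doubles the vector from Blinky to a point two tiles ahead of Pac-Man.
    px, py = pacman
    dx, dy = pacman_dir
    ax, ay = px + 2 * dx, py + 2 * dy
    bx, by = blinky
    return (2 * ax - bx, 2 * ay - by)

def clyde_target(pacman, clyde, corner=(0, 35)):
    # Clyde (orange) chases like Blinky while far away, but retreats to his
    # corner once he gets within eight tiles of Pac-Man.
    px, py = pacman
    cx, cy = clyde
    if (px - cx) ** 2 + (py - cy) ** 2 >= 8 ** 2:
        return pacman
    return corner
```

Each function fits in a few lines, which is the point Galloway was making: very little code, combined carefully, produces behavior that feels alive.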
This was my first time seeing video games in a museum, and I had come to this exhibit to see if I could glean some insight into technology through the prism of art.
It’s an exhibit that’s more relevant than ever, as technology has been absorbed into almost every facet of our lives, both at work and at home – and what I’ve learned is that our empathy for technology leads to new kinds of relationships between ourselves and our robot friends.
The exhibition wants to show how interactive design “informs how we move through life and design space, time, and connections, far beyond the game screen,” according to MoMA. The interfaces we use to access the digital universe “are visual and tactile manifestations of code that both connect and separate us, and shape the way we behave and perceive life,” the museum stated when the show was announced.
While visiting the exhibit, I wandered past other masterpiece video games – Minecraft, Tempest, SimCity 2000, and Never Alone (Kisima Ingitchuna), to name a few – stopping to play at whichever consoles were open.
Many games seemed simple at first, limited to a single joystick and a few buttons, or a keyboard. Yet when I tried to play them, it took me a while to learn how. Some of them, especially Minecraft, made no sense to me, and I had to watch a kid play to understand the intricacies of the game’s world-building.
Other museum visitors strolled among the games, waiting for a spot to open up. When one did, their eyes were immediately glued to the screen as they plunged into a new world with new rules.
Two people exploring the Never Alone exhibit. Image: Min Shin/ZDNET
I was most drawn to the robots and gadgets, including the Macintosh SE home computer, the iPod, and the EyeWriter, an eye-tracking technology created by designers for a graffiti artist with ALS that allowed him to tag the city’s buildings from his bed.
According to Galloway, the exhibit takes its name from an Iñupiaq video game included in the show, Never Alone (Kisima Ingitchuna). The game came from the Cook Inlet Tribal Council, which represents Alaska Native peoples, and it was created with the goal of keeping their culture’s legacy alive and connecting with younger generations.
“They created a video game and the central idea of the game is that it is through a connection with each other and with our shared cultures that we can find wisdom and peace, especially in the face of the challenges of a changing world, and I think that just seemed like a perfect metaphor,” Galloway said.
Thus, according to Galloway, the exhibition’s title carries two meanings. The first is that when we’re playing a video game, we’re never technically alone, because the input, the player, and the designer all need to work together for the design to work.
As players, we are constantly interacting with the inputs the designer has created for us to explore the interface. In this sense, it is impossible for us to be truly alone when using interactive design.
The second thread is that – thanks to technology – we are truly never alone, even during the most difficult times, such as during a pandemic. We are constantly connected through technology, whether connecting a community to a culture or simply staying in touch with each other online.
This exhibition is a way to explore our humanity and how our relationship with technology can reaffirm our empathy instead of making us less human alongside these robots.
Galloway told me that the exhibit was divided into three parts: the input, the designer, and the player.
“We thought about three different parts of this exchange. There are the actual machines, there is the person who uses the machines – the user or the player – and then there is the person who designs all the experiences,” Galloway said.
A Pac-Man game on the wall of Never Alone. Image: Min Shin/ZDNET
“Part of the reason this exhibit is taking place after the pandemic is that we’ve spent two years glued to our screens and interacting with each other through various programs, whether it’s Zoom calls, Fortnite Battle Royale, or playing Among Us,” Galloway said. “Our interactions with everyone were mediated by these tools, and it made us really good interactive design pros.”
For a while, many of us were effectively forced to channel our interactions with each other through devices and screens. And the Never Alone exhibit also asks – perhaps unexpectedly – how far we can extend our empathy not just across devices, but to devices themselves.
One way to examine these interactions is through the Technological Dream Series: No. 1, Robots installation by Anthony Dunne and Fiona Raby, tucked into a corner of the exhibition.
A variety of differently shaped objects – a red circle, what looked like a large shower head, a curved rectangular wooden prism, and something that looked a lot like a lamp – are spread out on the floor.
Objects symbolizing robots at Never Alone. Image: Min Shin/ZDNET
In the accompanying video, a woman stands next to these objects, periodically picking them up, examining them, and apparently listening to them moan, as if yearning for her attention.
Are these objects supposed to be robots?
“Robots can take any form, and [we’re] even investigating our ability to extend our empathy to those things that are completely alien and inhuman-looking,” Galloway said.
“It’s not like a Roomba cleaning your floor for you, it’s more like a dumb robot that can’t even move. All it can do is cry,” Galloway said. “How do we look at ourselves and extend our humanity to something like this?”
“I think [the pandemic] was so mediated and informed by screens, digital devices and interactive software that I can’t think of it all the same way after this experience,” he said.
This exhibition is the perfect opportunity to examine our renewed empathy and to realize that our empathy for these devices may have been there all along.
For example, consider the Tweenbot.
The Tweenbot was born from a 2009 project in which Kacie Kinzer let a smiling little cardboard robot wander through New York’s Washington Square Park with nothing but a flag that read “Help me” and the help of passersby to point it toward its destination.
Amazingly, New Yorkers hurrying along at their usual New York pace stopped to help the Tweenbot stay on track and to free it whenever it ran into obstacles.
The Tweenbot managed to reach its destination and, surprisingly, did not end up mangled in a gutter somewhere in the city.
The Tweenbot could not have accomplished its mission without the help of humans to guide it.
So there must be something inside us humans that – even while walking the busy city streets daily, never making eye contact with anyone – stops and takes the time to put the little robot back on track.
It seems counter-intuitive for humans to help a robot (or any piece of technology) achieve a goal, instead of the other way around. After all, robots are supposed to make our lives a little easier. They can perform tasks ranging from simple to complicated, such as cleaning, deliveries, and even cooking.
But Kinzer’s project showed us that when the tables are turned and robots are the ones who depend on humans to do something, humans are able to empathize with them. This is perhaps a positive sign for all of us – that our interactions via technology can keep us connected with the people we care about, but also make it easier for us to extend that empathy to the world around us.