Confronting ‘virtual’ dualities in the work of multimedia artist Lawrence Lek

In the documentary “HyperNormalisation,” Adam Curtis explains that cyberspace, as it was initially conceived, promised an alternate world free from the politics and corruption of the “real world.” This digital realm, its idealist advocates believed, presented an opportunity to build a democratic utopia accessible anywhere by anyone — it would be a sacred and protected space separate from reality.

Yet, in contemporary society, it is impossible to divorce the digital from the real. Through our use of digital platforms and technologies, the virtual undergirds every moment of the quotidian, creating a cognitive dissonance in our experience of the “real world.” What is real, and what is just a simulation? What differentiates humans from artificial intelligence? How has our very ontology begun to change as a result of the blurring between real and virtual worlds?

These are a few of the many questions the multimedia artist, filmmaker, and musician Lawrence Lek explores in his practice. Lek, who is of Malaysian-Chinese descent, is currently a Ph.D. candidate in machine learning at the Royal College of Art in London, England. He received his B.A. in architecture from Trinity College at the University of Cambridge and his master’s degree in architecture from The Cooper Union in New York. Lek has been awarded several residencies and has exhibited widely, from Rome to Hong Kong. He is represented by Sadie Coles HQ, an art gallery in London.

Through a constellation of works generated across various platforms and mediums, Lek builds parallel worlds of his own to reimagine the present in the future (or the future in the present). On Nov. 5, Lek gave a public talk hosted by the Princeton University Art Museum to discuss the research interests and questions that guide his practice, focusing in particular on three films: “Sinofuturism (1839-2046 AD)” (2016), “Geomancer” (2017), and “AIDOL” (2019). Lek is the 2020 Sarah Lee Elson International Artist-in-Residence, a residency sponsored by the Museum in which an artist working outside the United States visits the University campus to present a public lecture and hold workshops, discussions, and meetings with students and faculty.

Perhaps influenced by his background in architecture, Lek is fascinated by worldbuilding. He remakes familiar cityscapes digitally as a means of exploring how time bends in the virtual world. In “Geomancer,” Lek manipulates time on multiple fronts: On the one hand, he warps narrative time by making an AI satellite the protagonist of a bildungsroman (coming-of-age) story set during the centennial of Singapore’s independence, superimposing the timeline of the AI onto that of the nation-state.

On the other hand, Lek alters the physical landscape — the visual elements of worldbuilding — to signal the fluidity of geological and historical time. In the film, the contemporary cityscape of Singapore’s Marina Bay grounds the story in the present, while environmental details, such as flooding induced by climate change, denote its futurity. Place becomes a way to bring divergent moments together into a singular point in time, obfuscating the distinction between present and future, real and virtual.

Lawrence Lek, Geomancer, 2017 [still] / Lawrence Lek, courtesy Sadie Coles HQ, London

Lek also interrogates the real-digital divide by incorporating AI into his work. Citing the computer scientist Alan Turing GS ’38, he suggests that “intelligence is not a definite thing … it’s not a destination to arrive to, but it’s a state where you actually don’t know the difference between A or B … it’s this idea of uncertainty … and it’s also the way a lot of machine learning and deep-learning algorithms operate today.”

In his talk, Lek mentioned a historic match of the Chinese board game Go, in which Google DeepMind’s AlphaGo AI defeated a human Go master for the first time. Human genius, Lek seems to imply, is not so far removed from the machine — a suggestive claim he explores in his work, such as in the film “AIDOL,” in which an AI songwriter helps a fading pop star revitalize her career.

Lek most directly probes the overlap between human intelligence and AI in his video essay “Sinofuturism (1839-2046 AD).” A blend of documentary, social realism, and conspiracy, the film puts forth a theory of ontology modeled on several stereotypes about China, which Lek calls Sinofuturism, defined in the video essay as “a form of artificial intelligence, a massively distributed neural network focused on copying rather than originality, addicted to learning of massive amounts of raw data rather than philosophical critique or morality, with a posthuman capacity for work, and an unprecedented sense of collective will to power.”

Lek first conceived of the work while researching the relationship between East Asia and AI and the media’s representations of the two. He observed that “portrayals of Chinese industrialization and of AI were actually mirror images of each other — [that through rapid expansion and growth, they would] either … save us all or destroy us all.”

In “Sinofuturism,” Lek highlights similarities between the two through seven overarching themes — computing, copying, gaming, studying, addiction, labor, and gambling — claiming in the film that the “essential unknowability of the AI to the human, of the mystique of a consciousness beyond emotional understanding, is exactly the same other identified in Orientalism.”

His evident awareness of the narrative convergence between AI and Chinese industrialization transforms his embrace of their stereotypes into an act of subversion, one that articulates a broader truth about human civilization — namely, as Lek puts it, that cultures simply want to perpetuate themselves.

Lawrence Lek, Geomancer, 2017 [still] / Lawrence Lek, courtesy Sadie Coles HQ, London

Most of the film consists of found footage — a mélange of newsreels, InfoWars clips, gaming tournaments, movie stills, and virtual reality simulations — that documents the fiction of China put forth by the media. By appropriating the original source material and thus creating a copy, Lek presents his theory of Sinofuturism through the very form of the work.

The copy — the virtual, the counterfeit, the non-human — becomes synonymous with the real, as it is only through this act of copying that the video essay and Lek’s theory of Sinofuturism come into being.

As an assemblage of internet footage, the (im)materiality of “Sinofuturism” challenges traditional hierarchies within the art world, blurring the distinction between “high art” and “low art” — that is, the “autonomous artwork” of genius and the art of the masses (pop culture). Through “Sinofuturism,” Lek asks us to reassess our valuation of the copy: “[the] video essay [‘Sinofuturism’] … [aims] to see things from the machine’s perspective, to see copying as not lesser than originality … to see gameplay and gambling as not lesser than the fine arts or humanities or sciences as well. To configure not just the right and wrong relationship through bias, but also to reconfigure the role of art, broadly speaking, in transforming reality.”

Lek thus challenges the singularity and authenticity of the original (the human) and elevates the position of the copy (the inhuman AI, the Chinese worker). The copy is distributed, consumed, and reprocessed to produce new content; much like the original, it molds reality.

All works of art, regardless of whether they are copies or “originals,” are only legible insofar as they recycle a language familiar to their audience. What is a work of art, then, but a re-presentation of signs already in circulation?

Published at Mon, 07 Dec 2020 00:56:15 +0000

AI-Assisted Cough Tracking Could Help Detect the Next Pandemic

When Joe Brew worked for the Florida Department of Health as an epidemiologist for two years starting in 2013, he helped with syndromic surveillance, meaning he had the arduous job of reviewing the symptoms of patients coming into the emergency departments from all across the state. The goal of such work: to detect an abnormal spike of symptoms in an area that may indicate there’s a public health concern. 

Public health authorities worldwide continue to use this type of surveillance. The outbreak of a novel pathogen in Wuhan, China in late 2019, for instance, was detected in part by a large uptick of patients coming to the hospital with symptoms of a respiratory infection of unknown etiology. But Brew says this system fails to prevent the transmission of a virus like SARS-CoV-2 because, by the time patients arrive at the hospital, they have likely already been infectious for days. COVID-19 tests, too, often fail to return a result in time for patients to properly isolate while they’re infectious.

This realization led Brew to turn to a device that billions have in their pockets—a smartphone—to provide public health authorities with real-time symptomatic data from the community. Brew and several colleagues founded Hyfe, a free phone application that uses artificial intelligence to detect and track users’ coughs, a hallmark of many respiratory conditions including COVID-19. 

“The way you beat COVID is by acting fast—by being ahead of it,” says Brew, who is now the CEO of Hyfe. “Those places that very quickly identified clusters and outbreaks were able to shut things down and basically control the pandemic even without a vaccine.”

People who are curious to monitor their cough frequency trends, say, if they have a respiratory condition or want to share these data with loved ones or medical professionals, can download Hyfe onto their smartphones. When the application hears a loud, abrupt noise, such as a cough, it captures that approximately half-second snippet of sound and converts it into a 3-D image called a spectrogram, which represents the pitch and intensity of the sound over time. The spectrogram is then processed through a machine learning algorithm known as a convolutional neural network, which has been trained on a dataset of more than 270,000 sounds—a cough, laugh, grunt, burp, or a fork hitting a plate, for example—each labeled by two human listeners as a cough or not a cough. That training is what allows the algorithm to determine whether the abrupt noise was, indeed, a cough. The app also learns an individual’s unique cough, so if it detects a cough that deviates significantly from the user’s previous coughs, likely another person’s, it won’t log it.
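
This pipeline can be sketched in a few lines of code. The example below is a minimal mock-up of the idea, not Hyfe’s implementation: the sample rate, spectrogram settings, network architecture, and decision threshold are all assumed for illustration, and the classifier is untrained.

```python
# Minimal, illustrative sketch of a cough-detection pipeline: a short audio
# snippet is converted to a spectrogram and scored by a small convolutional
# neural network. All sizes and thresholds below are assumptions, and the
# model is untrained -- this is a mock-up of the concept, not Hyfe's code.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

SAMPLE_RATE = 16_000       # assumed microphone sampling rate (Hz)
SNIPPET_SECONDS = 0.5      # roughly half-second snippet, as described above


def snippet_to_spectrogram(audio: np.ndarray) -> np.ndarray:
    """Turn a mono audio snippet into a log-power spectrogram (frequency x time)."""
    _freqs, _times, power = spectrogram(audio, fs=SAMPLE_RATE, nperseg=256, noverlap=128)
    return np.log10(power + 1e-10).astype(np.float32)


class CoughClassifier(nn.Module):
    """Tiny CNN mapping a spectrogram to P(cough); a stand-in for the real model."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 4 * 4, 1))

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.head(self.features(spec)))


if __name__ == "__main__":
    # Stand-in for a captured half-second snippet (white noise here).
    snippet = np.random.randn(int(SAMPLE_RATE * SNIPPET_SECONDS))
    spec = torch.from_numpy(snippet_to_spectrogram(snippet))[None, None]  # (batch, channel, F, T)

    model = CoughClassifier()  # untrained; the real model is trained on labeled sounds
    with torch.no_grad():
        p_cough = model(spec).item()
    print(f"P(cough) = {p_cough:.2f} -> {'log event' if p_cough > 0.5 else 'discard'}")
```

In a working system, the network would first be trained on the labeled sound dataset described above, and a separate check would compare candidate coughs against the user’s own cough profile before logging them.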

Brew and a team of scientists aim to take this concept into the field to see if they can detect an outbreak of an infectious respiratory pathogen. Their pilot project will examine the cough data at a community-wide level in a small municipality on the outskirts of Pamplona, Spain.

A spectrogram, a visual representation of audio frequency and intensity over time, showing an archetypal cough. Hyfe converts snippets of users’ suspected coughs into these images and processes them through a machine learning algorithm to determine whether the sound is a cough. / HYFE

“There’s a lot of people thinking about diagnostics, but everyone is thinking individually,” ISGlobal Barcelona Institute for Global Health epidemiologist Carlos Chaccour, who is leading the study, tells The Scientist. “But so far, the community perspective has not been pursued.”

While Brew admits that there could be many non–infectious disease causes for a cough—air quality, asthma, allergies, fumes from cooking, to name a few—he says the lack of specificity is a feature, not a bug. In the case of wastewater surveillance, another tool public health officials use to indirectly detect and monitor the spread of infectious disease, “you’re already looking for the virus, and you’re already in an epidemic situation,” he says, whereas Hyfe could, in theory, detect a surprise flare-up of a novel disease before it progresses to that point.

Chaccour and his team have enrolled more than 60 people in the community so far and aim for as many as 500 to test the concept. As part of the study, participants grant researchers access to their hospital records and their Hyfe data to determine if a rise from a baseline level of coughing among the participants correlates to more diagnoses of respiratory conditions, including COVID-19. 

If Hyfe can successfully demonstrate that its detection of a higher community incidence of coughing precedes more respiratory diagnoses in the clinic, Chaccour says, he envisions users could then view a heat map of anonymized data showing which communities have the highest prevalence of coughing—a tool that could come in handy for public health officials as well as for people hoping to understand the risk of infection in a community.
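
As a rough illustration of the kind of signal the study is looking for, the sketch below aggregates hypothetical per-user cough counts by community and flags days when the rate climbs well above a rolling baseline. The data, baseline window, and alert threshold are invented for illustration and are not drawn from the study’s protocol.

```python
# Toy sketch of community-level cough surveillance: flag a community when its
# per-user daily cough rate rises well above its own recent baseline.
# All records, window lengths, and thresholds here are hypothetical.
from collections import defaultdict
from statistics import mean

# (community, day, coughs_logged, monitored_users) -- made-up aggregated records
daily_counts = [
    ("community_a", day, coughs, 50)
    for day, coughs in enumerate([60, 55, 58, 62, 57, 61, 59, 95, 110, 130])
]

BASELINE_DAYS = 7   # assumed length of the rolling baseline window
ALERT_RATIO = 1.5   # assumed threshold: 50% above baseline triggers a flag

rates = defaultdict(list)  # community -> per-user daily cough rates, in day order
for community, day, coughs, users in daily_counts:
    rates[community].append(coughs / users)

for community, series in rates.items():
    for day in range(BASELINE_DAYS, len(series)):
        baseline = mean(series[day - BASELINE_DAYS:day])
        if series[day] > ALERT_RATIO * baseline:
            print(f"{community}: day {day} rate {series[day]:.2f} "
                  f"vs baseline {baseline:.2f} -> possible flare-up")
```

A real analysis would also need anonymization, adjustment for day-to-day noise, and, crucially, correlation with clinical diagnoses, which is roughly the comparison the Pamplona pilot is set up to make.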

A missed opportunity

Cough has long been a symptom that physicians record, yet the method for monitoring it is typically limited to a self-report during a clinic visit. Previous research suggests that patients often underreport how much they cough, which has led epidemiologists such as Brew to think that there is untapped potential in using cough data.

Alyn Morice, who is the head of respiratory medicine at Hull York Medical School in the UK and specializes in the diagnosis and treatment of cough, says that patients answering questionnaires are entirely unreliable. He’s even seen patients fabricate data from peak flow meters, portable devices that patients use at home to measure how efficiently air flows through their lungs.

“The great thing about this cough monitoring is it will be passive—the patient won’t actually have to do anything,” he says.

University of Washington infectious disease expert Peter Small, who is the senior director of Global Health Technologies and is not involved with Hyfe, tells The Scientist he is optimistic about this new frontier of technology, particularly in the context of eradicating tuberculosis. “Patients seek care very late in the disease and part of that is because we, as a society, tend to ignore cough in adults,” he says. 

With the help of AI-assisted cough tracking, he envisions a world in which users who have been coughing at a higher rate than normal for, say, two weeks would receive a text notifying them of their symptoms with directions to a public clinic that can test them for tuberculosis.

Even in patients with a confirmed TB diagnosis, Small says, the technology could show patients’ recovery progress. “I’ve been around a lot of TB patients and it’s a very disconcerting diagnosis,” he says. “Even though it’s almost always curable, it’s psychologically difficult on patients, and having objective evidence that their cough is getting better can help with their spirits.”

The Hyfe group is not the only one working to turn cough data into a more effective public health tool. Morice, for instance, developed an alert system for impending chronic obstructive pulmonary disease (COPD) exacerbations—severe episodes that often lead to hospitalization—that tracks cough through an external monitor users wear around their necks. His research team detected 45 percent of these flare-ups an average of four days prior to diagnosis, according to data they presented at this year’s European Respiratory Society virtual conference. With early intervention, patients who take steroids or bronchodilators can prevent or lessen the severity of these exacerbations, Morice says.

“If you’re able to prevent hospital admissions in these folks, it’s much better for the patient but it’s also much better for the health economy because [treating the] exacerbation is an expensive thing,” he says.

An MIT group set out to determine whether it’s possible to identify a COVID-19–specific cough. The researchers processed more than 70,000 forced-cough audio samples, 2,660 of which were submitted by people with COVID-19, through a machine learning algorithm. In their paper, published in October in the IEEE Journal of Engineering in Medicine and Biology, they claim the algorithm accurately identifies 98.5 percent of coughs from people confirmed to have COVID-19, including 100 percent of forced coughs from those who were asymptomatic.

In other endeavors, researchers are soliciting healthy and COVID-19–infected individuals to help train their AI models to eventually allow users to understand if they have the virus based on their cough. These projects include a Bill and Melinda Gates Foundation–funded initiative, Cough Against Covid, at the Wadhwani Institute for Artificial Intelligence in Mumbai, the University of Cambridge’s COVID-19 Sounds project, and the Coughvid project at the Swiss Federal Institute of Technology Lausanne.

Morice remains skeptical of apps that claim they can diagnose users’ coughs: “Frankly, I don’t believe them. You can tell a wet cough from a dry cough, but that’s about it from the cough sounds.” Several coauthors of the MIT study declined requests for an interview to discuss their work.

Brew says he not only wants to better understand the acoustic signature of different ailments, but also track the diurnal pattern of cough—do people with COVID-19, for example, tend to cough more during a certain part of the day or night? “When do they begin coughing? Does a change in cough frequency indicate a certain prognosis?” Brew asks. “These are super basic questions that no one really knows [the answer to] at this point.”

Privacy concerns of recording audio

More than 40 countries and 21 US states and territories use official state-sponsored COVID-19 applications that aid contact tracers in stemming the spread of the virus. Despite the promise of these apps, few people in the US have downloaded them, in part because users worry about handing over their detailed location history to their government, says Chaccour. 

Any app monitoring cough would also require permission to record audio through users’ smartphones and, in Hyfe’s case, track their location to measure cough at a community level. Brew says he hopes to reassure users that the app would only record roughly half-second snippets following an abrupt noise. Still, he thinks there needs to be some value offered back to the user. One idea is to design a dashboard of users’ personal data akin to what Fitbit does with step counts. “Nobody cared about step counts 15 years ago until Fitbit made it trendy,” he says.

To Chaccour, if the technology proves useful, the end goal is not to provide it to governments but perhaps to third-party companies such as Apple or Google, which could integrate it into their phone operating systems. He’s noticed that in Spain, people aren’t very trusting of the official government COVID-19 app, yet they don’t mind their phones listening for them to summon a voice assistant or tracking how much they’ve slept.

Brew says he thinks the present moment is a perfect opportunity to roll out the technology. People care about public health and “cough counting would have been interesting in 1990, but right now we have some five billion humans carrying a microphone with them at all times every day, everywhere.”

Published at Mon, 07 Dec 2020 00:22:30 +0000