{"id":4092,"date":"2020-12-07T02:37:55","date_gmt":"2020-12-07T02:37:55","guid":{"rendered":"https:\/\/techclot.com\/index.php\/2020\/12\/07\/confronting-virtual-dualities-in-the-work-of-multimedia-artist-lawrence-lek\/"},"modified":"2020-12-07T02:37:55","modified_gmt":"2020-12-07T02:37:55","slug":"confronting-virtual-dualities-in-the-work-of-multimedia-artist-lawrence-lek","status":"publish","type":"post","link":"https:\/\/techclot.com\/index.php\/2020\/12\/07\/confronting-virtual-dualities-in-the-work-of-multimedia-artist-lawrence-lek\/","title":{"rendered":"Confronting &#8216;virtual&#8217; dualities in the work of multimedia artist Lawrence Lek"},"content":{"rendered":"<p><a href=\"https:\/\/www.google.com\/url?rct=j&#038;sa=t&#038;url=https:\/\/www.dailyprincetonian.com\/article\/2020\/12\/lawrence-lek-artist-talk&#038;ct=ga&#038;cd=CAIyHDkyYmU1MGQ5NjY1NjYxZTA6Y28udWs6ZW46R0I&#038;usg=AFQjCNGMIrAyIN-bVFJKeiaDV1PBBJrUIA\">Confronting &#8216;virtual&#8217; dualities in the work of multimedia artist Lawrence Lek<\/a><\/p>\n<p><p>In the documentary \u201cHyperNormalisation,\u201d Adam Curtis explains that cyberspace, as it was initially conceived, promised an alternate world free from the politics and corruption of the \u201creal world.\u201d This digital realm, its idealist advocates believed, presented an opportunity to build a democratic utopia accessible anywhere by anyone \u2014 it would be a sacred and protected space separate from reality.<\/p>\n<p>Yet, in contemporary society, it is impossible to divorce the digital from the real. Through our use of digital platforms and technologies, the virtual undergirds every moment of the quotidian, creating a cognitive dissonance in our experience of the \u201creal world.\u201d What is real, and what is just a simulation? What differentiates humans from artificial intelligence? 
How has our very ontology begun to change as a result of the blurring between real and virtual worlds?<\/p>\n<p>These are a few of the many questions the multimedia artist, filmmaker, and musician <a href=\"https:\/\/lawrencelek.com\/\" target=\"_self\" rel=\"noopener noreferrer\">Lawrence Lek<\/a> explores in his practice. <a href=\"https:\/\/www.sadiecoles.com\/artists\/51-lawrence-lek\/biography\/\" target=\"_self\" rel=\"noopener noreferrer\">Lek<\/a> is currently a Ph.D. candidate in machine learning at the Royal College of Art in London, England, and is of Malaysian-Chinese descent. He received his B.A. in architecture from Trinity College at the University of Cambridge, and his Master\u2019s in architecture from The Cooper Union in New York. Lek has been awarded several residencies, and his work has been exhibited widely, from Rome to Hong Kong. He is represented by <a href=\"https:\/\/www.sadiecoles.com\/exhibitions\/current\/\" target=\"_self\" rel=\"noopener noreferrer\">Sadie Coles HQ<\/a>, an art gallery in London.<\/p>\n<p>Through a constellation of works generated across various platforms and mediums, Lek builds parallel worlds of his own to reimagine the present in the future (or the future in the present). On Nov. 
5, Lek gave a <a href=\"https:\/\/artmuseum.princeton.edu\/video\/artist-talk-lawrence-lek-2020-sarah-lee-elson-international-artist-residence\" target=\"_self\" rel=\"noopener noreferrer\">public talk<\/a> hosted by the Princeton Art Museum to discuss the research interests and questions that guide his practice, focusing in particular on three films: \u201cSinofuturism (1839-2046 AD)\u201d<em> <\/em>(2016), \u201cGeomancer\u201d<em> <\/em>(2017), and \u201cAIDOL\u201d<em> <\/em>(2019). Lek is the 2020 <a href=\"https:\/\/artmuseum.princeton.edu\/learn\/research\/artist-in-residence\/sarah-lee-elson-class-1984-international-artistinresidence-program\" target=\"_self\" rel=\"noopener noreferrer\">Sarah Lee Elson International Artist-In-Residence<\/a>, a <a href=\"https:\/\/artmuseum.princeton.edu\/learn\/research\/artist-in-residence\/sarah-lee-elson-class-1984-international-artistinresidence-program\" target=\"_self\" rel=\"noopener noreferrer\">residency<\/a> sponsored by the Museum in which an artist working outside the United States visits the University campus to present a public lecture and hold workshops, discussions, and meetings with students and faculty.<\/p>\n<p>Perhaps influenced by his background in architecture, Lek is fascinated by worldbuilding. He remakes familiar cityscapes digitally as a means of exploring how time bends in the virtual world. In \u201cGeomancer,\u201d Lek manipulates time on multiple fronts: On the one hand, he warps narrative time by making an AI satellite the protagonist in a bildungsroman (coming of age) story set during the centennial of Singapore\u2019s independence, superimposing the timeline of the AI onto that of the nation-state.<\/p>\n<p>On the other hand, Lek alters the physical landscape \u2014 the visual elements of worldbuilding \u2014&nbsp;to signal the fluidity of geological and historical time. 
In the film, the contemporary cityscape of Singapore\u2019s Marina Bay grounds the narrative in the present, while environmental details, such as flooding induced by climate change, signal its futurity. Place becomes a way to bring divergent moments together into a singular point in time, obfuscating the distinction between present and future, real and virtual.<\/p>\n<figure class=\"w-100 embedded-media embedded-image\">\n    <img data-recalc-dims=\"1\" height=\"640\" width=\"640\" decoding=\"async\" data-src=\"https:\/\/i0.wp.com\/snworksceo.imgix.net\/pri\/15c3390d-d6ea-4b3b-a27a-323bff8ac934.sized-1000x1000.jpeg?resize=640%2C640&#038;ssl=1\" class=\"simple-lightbox lazyload\" data-full=\"https:\/\/snworksceo.imgix.net\/pri\/15c3390d-d6ea-4b3b-a27a-323bff8ac934.sized-1000x1000.jpeg?w=1500&amp;ar=5%3A4&amp;fit=crop&amp;crop=faces\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 640px; --smush-placeholder-aspect-ratio: 640\/640;\"><figcaption class=\"embedded-caption\">\n<h6>Lawrence Lek, <em>Geomancer<\/em>, 2017 [still] \/ Lawrence Lek, courtesy Sadie Coles HQ, London<\/h6>\n<\/figcaption><\/figure>\n<p>Lek also interrogates the real-digital divide by incorporating AI into his work. 
Citing the mathematician Alan Turing GS \u201938, he suggests that \u201cintelligence is not a definite thing \u2026 it\u2019s not a destination to arrive to, but it\u2019s a state where you actually don\u2019t know the difference between A or B \u2026 it\u2019s this idea of uncertainty \u2026 and it\u2019s also the way a lot of machine learning and deep-learning algorithms operate today.\u201d<\/p>\n<p>In his talk, Lek mentioned <a href=\"https:\/\/deepmind.com\/alphago-korea\" target=\"_self\" rel=\"noopener noreferrer\">a historic match<\/a> of the Chinese board game Go, in which Google DeepMind\u2019s AlphaGo AI defeated a human Go master for the first time. 
Human genius, Lek seems to imply, is not so far removed from the machine \u2014 a suggestive claim he explores in his work, such as in the film \u201cAIDOL,\u201d in which an AI songwriter helps a fading pop star revitalize her career.<\/p>\n<p>Lek most directly probes the overlap between human intelligence and AI in his video essay, \u201c<a href=\"https:\/\/lawrencelek.com\/post\/149659359171\/sinofuturism-is-an-invisible-movement-a-spectre\" target=\"_self\" rel=\"noopener noreferrer\">Sinofuturism (1839-2046 AD)<\/a>.\u201d A blend of documentary, social realism, and conspiracy, the film puts forth a theory of ontology modeled on several stereotypes about China, which Lek calls Sinofuturism, defined in the video essay as \u201ca form of artificial intelligence, a massively distributed neural network focused on copying rather than originality, addicted to learning of massive amounts of raw data rather than philosophical critique or morality, with a posthuman capacity for work, and an unprecedented sense of collective will to power.\u201d<\/p>\n<p>Lek first conceived of the work while researching the relationship between East Asia and AI and the media\u2019s representations of the two. 
He observed that \u201cportrayals of Chinese industrialization and of AI were actually mirror images of each other \u2014 [that through rapid expansion and growth, they would] either \u2026 save us all or destroy us all.\u201d<\/p>\n<p>In \u201cSinofuturism,\u201d Lek highlights similarities between the two through seven overarching themes \u2014 computing, copying, gaming, studying, addiction, labor, and gambling \u2014 claiming in the film that the \u201cessential unknowability of the AI to the human, of the mystique of a consciousness beyond emotional understanding, is exactly the same other identified in Orientalism.\u201d<\/p>\n<p>His self-evident awareness of the narrative convergence between AI and Chinese industrialism transforms his embrace of their stereotypes into an act of subversion, one that articulates a broader truth about human civilization \u2014 namely, as Lek puts it, that cultures simply want to perpetuate themselves.<\/p>\n<figure class=\"w-100 embedded-media embedded-image\">\n    <img data-recalc-dims=\"1\" height=\"640\" width=\"640\" decoding=\"async\" data-src=\"https:\/\/i0.wp.com\/snworksceo.imgix.net\/pri\/d2b6c0f4-78f2-4d44-af05-b1d7d61c3d8d.sized-1000x1000.jpeg?resize=640%2C640&#038;ssl=1\" class=\"simple-lightbox lazyload\" data-full=\"https:\/\/snworksceo.imgix.net\/pri\/d2b6c0f4-78f2-4d44-af05-b1d7d61c3d8d.sized-1000x1000.jpeg?w=1500&amp;ar=5%3A4&amp;fit=crop&amp;crop=faces\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 640px; --smush-placeholder-aspect-ratio: 640\/640;\"><figcaption class=\"embedded-caption\">\n<h6>Lawrence Lek, <em>Geomancer<\/em>, 2017 [still] \/ Lawrence Lek, courtesy Sadie Coles HQ, London<\/h6>\n<\/figcaption><\/figure>\n<p>Most of the film consists of found footage \u2014 a melange of newsreels, InfoWars clips, gaming tournaments, movie stills, and virtual reality simulations \u2014 which documents the fiction of China put forth by the media. 
By appropriating the original source material and thus creating a copy, Lek presents his theory of Sinofuturism through the very form of the work.<\/p>\n<p>The copy \u2014 the virtual, the counterfeit, the non-human \u2014&nbsp;becomes synonymous with the real, as it is only through this act of copying that the video essay and Lek\u2019s theory of Sinofuturism come into being.<\/p>\n<p>As an assemblage of internet footage, the (im)materiality of \u201cSinofuturism\u201d<em> <\/em>challenges traditional hierarchies within the art world, blurring the distinction between \u201chigh art\u201d and \u201clow art\u201d \u2014 that is, the \u201cautonomous artwork\u201d of genius and the art of the masses (pop culture). Through \u201cSinofuturism,\u201d Lek asks us to reassess our valuation of the copy: \u201c[the] video essay [\u2018Sinofuturism\u2019] \u2026 [aims] to see things from the machine\u2019s perspective, to see copying as not lesser than originality \u2026 to see gameplay and gambling as not lesser than the fine arts or humanities or sciences as well. To configure not just the right and wrong relationship through bias, but also to reconfigure the role of art, broadly speaking, in transforming reality.\u201d<\/p>\n<p>Lek thus challenges the singularity and authenticity of the original (the human) and elevates the position of the copy (the inhuman AI, the Chinese worker). The copy is distributed, consumed, and reprocessed to produce new content; much like the original, it molds reality.<\/p>\n<p>All works of art, regardless of whether they are copies or \u201coriginals,\u201d are only legible insofar as they recycle a language familiar to their audience. 
What is a work of art, then, but a <em>re<\/em>-presentation of signs already in circulation?<\/p>\n<\/p>\n<p>Published at Mon, 07 Dec 2020 00:56:15 +0000<\/p>\n<p><a href=\"https:\/\/www.google.com\/url?rct=j&#038;sa=t&#038;url=https:\/\/www.the-scientist.com\/news-opinion\/ai-assisted-cough-tracking-could-help-detect-the-next-pandemic--68233&#038;ct=ga&#038;cd=CAIyHDkyYmU1MGQ5NjY1NjYxZTA6Y28udWs6ZW46R0I&#038;usg=AFQjCNGFQBeY2zSkfDZDuORVjTJ-pvJuyQ\">AI-Assisted Cough Tracking Could Help Detect the Next Pandemic<\/a><\/p>\n<p><p><span class=\"dropcap\">W<\/span>hen Joe Brew worked for the Florida Department of Health as an epidemiologist for two years starting in 2013, he helped with syndromic surveillance, meaning he had the arduous job of reviewing the symptoms of patients coming into the emergency departments from all across the state. The goal of such work: to detect an abnormal spike of symptoms in an area that may indicate there\u2019s a public health concern.&nbsp;<\/p>\n<p>Public health authorities worldwide continue to use this type of surveillance. The outbreak of a novel pathogen in Wuhan, China, in late 2019, for instance, was in part detected by a large uptick of patients coming to the hospital with symptoms of a respiratory infection of unknown etiology. But Brew says this system fails to prevent the transmission of a virus like SARS-CoV-2 because by the time patients arrive at the hospital, they have likely already been infectious for a matter of days. COVID-19 tests, too, often fail to return a result in time for patients to properly isolate while they\u2019re infectious.<\/p>\n<p>This realization led Brew to turn to a device that billions have in their pockets\u2014a smartphone\u2014to provide public health authorities with real-time symptomatic data from the community. 
Brew and several colleagues founded Hyfe, a free phone application that uses artificial intelligence to detect and track users\u2019 coughs, a hallmark of many respiratory conditions, including COVID-19.&nbsp;<\/p>\n<blockquote readability=\"7\">\n<p>Cough has long been a symptom that physicians record, yet the method for monitoring it is typically limited to a self-report during a clinic visit.<\/p>\n<\/blockquote>\n<p>\u201cThe way you beat COVID is by acting fast\u2014by being ahead of it,\u201d says Brew, who is now the CEO of Hyfe. \u201cThose places that very quickly identified clusters and outbreaks were able to shut things down and basically control the pandemic even without a vaccine.\u201d<\/p>\n<p>People who are curious to monitor their cough frequency trends, say, if they have a respiratory condition or want to share these data with loved ones or medical professionals, can download Hyfe onto their smartphones. When the application hears a loud, abrupt noise, such as a cough, it captures that approximately half-second snippet of sound and converts it into a 3-D image called a spectrogram that represents the pitch and intensity of the sound over time. The spectrogram is then processed through a machine learning algorithm known as a convolutional neural network, which has been trained on a dataset of more than 270,000 sounds\u2014a cough, laugh, grunt, burp, or a fork hitting a plate, for example\u2014that two human listeners have labeled as a cough or not a cough, helping the algorithm determine whether the abrupt noise was, indeed, a cough. The app also learns an individual\u2019s unique cough, so if it detects a cough that significantly deviates from the user\u2019s previous coughs, such as another person\u2019s, it won\u2019t log it.<\/p>\n<p>Brew and a team of scientists aim to take this concept into the field to see if they can detect an outbreak of an infectious respiratory pathogen. 
Their pilot project will examine the cough data at a community-wide level in a small municipality on the outskirts of Pamplona, Spain.<\/p>\n<div class=\"fr-image-container fr-full\" readability=\"11\">\n<div class=\"fr-image\"><img data-recalc-dims=\"1\" decoding=\"async\" data-src=\"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2020\/12\/qGJ1pl.png?w=640&#038;ssl=1\" class=\"fr-fic fr-dib lazyload\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\"><\/div>\n<p>A spectrogram, a visual representation of audio frequency and intensity over time, showing an archetypical cough. Hyfe converts snippets of users\u2019 suspected coughs into these images and processes them through a machine learning algorithm to determine whether the sound is a cough.<\/p>\n<p>HYFE<\/p>\n<\/div>\n<p>\u201cThere\u2019s a lot of people thinking about diagnostics, but everyone is thinking individually,\u201d ISGlobal Barcelona Institute for Global Health epidemiologist Carlos Chaccour, who is leading the study, tells <em>The Scientist<\/em>. \u201cBut so far, the community perspective has not been pursued.\u201d<\/p>\n<p>While Brew admits that there could be many non\u2013infectious disease causes for a cough\u2014air quality, asthma, allergies, fumes from cooking, to name a few\u2014he says the lack of specificity is a feature, not a bug. 
In the case of <a href=\"https:\/\/www.the-scientist.com\/news-opinion\/countries-begin-large-scale-screening-for-sars-cov-2-in-sewage-67535\" target=\"_blank\" rel=\"noopener noreferrer\">wastewater surveillance<\/a>, another tool public health officials can use to indirectly detect and monitor the spread of infectious disease, \u201cyou\u2019re already looking for the virus, and you\u2019re already in an epidemic situation,\u201d he says, whereas Hyfe could, in theory, detect a surprise flare-up from a novel disease before it progresses to that point.<\/p>\n<blockquote readability=\"9\">\n<p>The great thing about this cough monitoring is it will be passive\u2014the patient won\u2019t actually have to do anything.<\/p>\n<p><span>\u2014Alyn Morice, Hull York Medical School<\/span><\/p>\n<\/blockquote>\n<p>Chaccour and his team have enrolled more than 60 people in the community so far and aim for as many as 500 to test the concept. As part of the study, participants grant researchers access to their hospital records and their Hyfe data to determine if a rise from a baseline level of coughing among the participants correlates to more diagnoses of respiratory conditions, including COVID-19.&nbsp;<\/p>\n<p>If Hyfe can successfully demonstrate that its detection of a higher community incidence of coughing precedes more respiratory diagnoses in the clinic, Chaccour says, he envisions users could then view a heat map of anonymized data showing which communities have the highest prevalence of coughing\u2014a tool that could come in handy for public health officials but also for people hoping to understand the risk of infection in a community.<\/p>\n<h2>A missed opportunity<\/h2>\n<p>Cough has long been a symptom that physicians record, yet the method for monitoring it is typically limited to a self-report during a clinic visit. 
Previous research <a href=\"https:\/\/erj.ersjournals.com\/content\/41\/2\/277\" target=\"_blank\" rel=\"noopener noreferrer\">suggests<\/a> that patients often underreport how much they cough, which has led epidemiologists such as Brew to think that there is untapped potential in using cough data.<\/p>\n<p>Alyn Morice, who is the head of respiratory medicine at Hull York Medical School in the UK and specializes in the diagnosis and treatment of cough, says that patients answering questionnaires are entirely unreliable. He\u2019s even seen patients fabricate data from peak flow meters, portable devices that patients use at home to measure how efficiently air flows through their lungs.<\/p>\n<p>\u201cThe great thing about this cough monitoring is it will be passive\u2014the patient won\u2019t actually have to do anything,\u201d he says.<\/p>\n<p>University of Washington infectious disease expert <a href=\"https:\/\/globalhealth.washington.edu\/faculty\/peter-small\" target=\"_blank\" rel=\"noopener noreferrer\">Peter Small<\/a>, who is the senior director of Global Health Technologies and is not involved with Hyfe, tells <em>The Scientist<\/em> he is optimistic about this new frontier of technology, particularly in the context of eradicating tuberculosis. \u201cPatients seek care very late in the disease and part of that is because we, as a society, tend to ignore cough in adults,\u201d he says.&nbsp;<\/p>\n<p>With the help of AI-assisted cough tracking, he envisions a world in which users who have been coughing at a higher rate than normal for, say, two weeks would receive a text notifying them of their symptoms with directions to a public clinic that can test them for tuberculosis.<\/p>\n<p>Even in patients with a confirmed TB diagnosis, Small says, the technology could show patients\u2019 recovery progress. \u201cI\u2019ve been around a lot of TB patients and it\u2019s a very disconcerting diagnosis,\u201d he says. 
\u201cEven though it\u2019s almost always curable, it\u2019s psychologically difficult on patients, and having objective evidence that their cough is getting better can help with their spirits.\u201d<\/p>\n<blockquote readability=\"8\">\n<p>Cough counting would have been interesting in 1990, but right now we have some five billion humans carrying a microphone with them at all times every day, everywhere.<\/p>\n<p><span>\u2014Joe Brew, Hyfe<\/span><\/p>\n<\/blockquote>\n<p>The Hyfe group is not the only one working to integrate cough into a more effective public health tool. Morice, for instance, developed an alert system for impending chronic obstructive pulmonary disease (COPD) exacerbations\u2014severe episodes that can often lead to hospitalization\u2014using cough tracking through an external monitor users wear around their necks. His research team detected 45 percent of these flare-ups an average of four days prior to diagnosis, according to <a href=\"https:\/\/erj.ersjournals.com\/content\/56\/suppl_64\/975\" rel=\"noopener noreferrer\" target=\"_blank\">data they presented<\/a> at this year\u2019s European Respiratory Society virtual conference. With early intervention, patients who take steroids or bronchodilators can prevent or lessen the severity of these exacerbations, Morice says.<\/p>\n<p>\u201cIf you\u2019re able to prevent hospital admissions in these folks, it\u2019s much better for the patient but it\u2019s also much better for the health economy because [treating the] exacerbation is an expensive thing,\u201d he says.<\/p>\n<p>An MIT group tried to develop a tool to determine if it\u2019s possible to identify a COVID-19\u2013specific cough. 
The researchers processed more than 70,000 forced-cough audio samples, of which 2,660 were submitted by people with COVID-19, through a machine learning algorithm, which they claim accurately identifies 98.5 percent of coughs from people who were confirmed to have COVID-19, including 100 percent of forced coughs from those who were asymptomatic, according to their paper published in October in <a href=\"https:\/\/ieeexplore.ieee.org\/stamp\/stamp.jsp?tp=&amp;arnumber=9208795\" target=\"_blank\" rel=\"noopener noreferrer\"><em>IEEE Journal of Engineering in Medicine and Biology<\/em><\/a>.&nbsp;<\/p>\n<p>In other endeavors, researchers are soliciting healthy and COVID-19\u2013infected individuals to help train their AI models to eventually allow users to understand if they have the virus based on their cough. These projects include a Bill and Melinda Gates Foundation\u2013funded initiative, <a href=\"https:\/\/www.wadhwaniai.org\/2020\/04\/07\/cough-against-covid\/\" target=\"_blank\" rel=\"noopener noreferrer\">Cough Against Covid<\/a>, at the Wadhwani Institute for Artificial Intelligence in Mumbai, the University of Cambridge\u2019s <a href=\"https:\/\/www.covid-19-sounds.org\/en\/\" target=\"_blank\" rel=\"noopener noreferrer\">COVID-19 Sounds<\/a> project, and the <a href=\"https:\/\/coughvid.epfl.ch\/\" target=\"_blank\" rel=\"noopener noreferrer\">Coughvid<\/a> project at the Swiss Federal Institute of Technology Lausanne.<\/p>\n<p>Morice remains skeptical of apps that claim they can diagnose users\u2019 coughs: \u201cFrankly, I don\u2019t believe them. 
You can tell a wet cough from a dry cough, but that\u2019s about it from the cough sounds.\u201d Several coauthors of the MIT study declined requests for an interview to discuss their work.<\/p>\n<p>Brew says he not only wants to better understand the acoustic signature of different ailments, but also track the diurnal pattern of cough\u2014do people with COVID-19, for example, tend to cough more during a certain part of the day or night? \u201cWhen do they begin coughing? Does a change in cough frequency indicate a certain prognosis?\u201d Brew asks. \u201cThese are super basic questions that no one really knows [the answer to] at this point.\u201d<\/p>\n<h2>Privacy concerns of recording audio<\/h2>\n<p>More than 40 countries and 21 US states and territories use official state-sponsored COVID-19 applications that aid contact tracers in stemming the spread of the virus. Despite the promise of these apps, <a href=\"https:\/\/www.wsj.com\/articles\/more-states-offer-covid-19-contact-tracing-apps-but-adoption-is-uneven-11605974401\" target=\"_blank\" rel=\"noopener noreferrer\">few people in the US<\/a> have downloaded them, in part because users worry about handing over their detailed location history to their government, says Chaccour.&nbsp;<\/p>\n<p>Any app monitoring cough would also require permission to record audio through users\u2019 smartphones and, in Hyfe\u2019s case, track their location to measure cough at a community level. Brew says he hopes to reassure users that the app would only record roughly half-second snippets following an abrupt noise. Still, he says he thinks there needs to be some value offered back to the user. One idea is to design a dashboard of users\u2019 personal data akin to what Fitbit does with step counts. 
\u201cNobody cared about step counts 15 years ago until Fitbit made it trendy,\u201d he says.<\/p>\n<p>To Chaccour, if the technology proves useful, the end goal is not to provide it to governments, but perhaps to third-party companies such as Apple or Google, which can integrate it into their phone operating systems. He\u2019s noticed that in Spain, people aren\u2019t very trustful of the official government COVID-19 app, but they don\u2019t mind their phones listening for users to summon the voice assistant or tracking how much they\u2019ve slept.<\/p>\n<p>Brew says he thinks the present moment is a perfect opportunity to roll out the technology. People care about public health and \u201ccough counting would have been interesting in 1990, but right now we have some five billion humans carrying a microphone with them at all times every day, everywhere.\u201d<\/p>\n<\/p>\n<p>Published at Mon, 07 Dec 2020 00:22:30 +0000<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Confronting &#8216;virtual&#8217; dualities in the work of multimedia artist Lawrence Lek In the documentary 
\u201cHyperNormalisation,\u201d&#8230;<\/p>\n","protected":false},"author":3,"featured_media":4093,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[3],"tags":[],"class_list":["post-4092","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence"],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2020\/12\/15c3390d-d6ea-4b3b-a27a-323bff8ac934.sized-1000x1000.jpeg?fit=1600%2C900&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p3orZX-140","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/4092","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/comments?post=4092"}],"version-history":[{"count":0,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/4092\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media\/4093"}],"wp:attachment":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media?parent=4092"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/categories?post=4092"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/tags?post=4092"}],"curies":[{"name":"wp
","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}