Artificial Intelligence Is Decaying the Internet
by Matthew Omolesky

“Manuscripts don’t burn,” or so the devil Woland tells us in Mikhail Bulgakov’s darkly comedic novel The Master and Margarita. In Bulgakov’s magical realist literary universe, the devil might very well pluck a manuscript unharmed from a blazing stove, but in our own mundane world, one just as absurd but nowhere near as magical, we find that manuscripts do indeed burn, and quite readily at that. 

If manuscripts were as imperishable as Woland famously maintained, we should have at our disposal more than a hundred tragedies written by Sophocles, instead of the mere seven that have survived the centuries. If manuscripts were impervious to flames, like Shadrach, Meshach, and Abednego in the fiery furnace, then we could hold in our hands the hundreds of thousands of scrolls once stored in the Library of Alexandria, or the nine million texts that filled the Nalanda Mahavihara in ancient Magadha. We could flip through the Marquis de Sade’s 10-volume Les Journées de Florbelle, Lord Byron’s diaries, Charles Dickens’ personal letters, the second volume of Mykola Hohol’s Dead Souls, or Mykola Khvylovy’s The Woodsnipes, yet all of these precious documents turned out to be eminently flammable. And if manuscripts don’t burn, how are we to explain the catastrophic loss of the contents of 14 Warsaw libraries during the Second World War, when Nazi Verbrennungskommando (Burning Detachment) and Sprengkommando (Demolition Detachment) units were tasked with obliterating Polish archives and special collections, or the annihilation of some two million books and countless artifacts as a consequence of the Serbian shelling of Sarajevo’s National and University Library of Bosnia and Herzegovina on the terrible night of Aug. 25, 1992? These are hardly isolated incidents. Manuscripts, we must sadly conclude, are particularly susceptible to the high-temperature exothermic redox chemical reaction between fuel and oxidant better known as burning.

Woland’s maxim is obviously not meant to be taken literally. As Maria Kisel demonstrated in her 2009 Slavic Review essay “Feuilletons Don’t Burn: Bulgakov’s The Master and Margarita and the Imagined ‘Soviet Reader,’” Bulgakov was seeking to convey “the idea that authentic art is indestructible and eternal,” and his quotation should therefore be “immortalized as the keystone in Bulgakov’s legacy of upholding spiritual values and remaining true to one’s artistic vision above all else.” Authentic art is not necessarily indestructible, however; otherwise we would still have Byron’s diaries, the rest of Dead Souls, and The Woodsnipes. Whether through misadventure or malfeasance or a desire to avoid posthumous scandal, authentic art can quickly be reduced to a heap of calcined ashes, as Bulgakov knew as well as anyone. Earlier in his life, he had confronted a crisis of self-confidence that caused him to tear up one of his works in progress, before experiencing a moment of “incredible, miraculous clarity” in which he realized that “they are right, those who say: what is written cannot be destroyed! One can tear it up, burn it … hide it from people. But from oneself? Never!” Fearful of Soviet persecution, he would go on to incinerate the first draft of The Master and Margarita in 1930, only to resuscitate the project the following year. It was as if Bulgakov’s words, once summoned into being, existed independently of their status within a given manuscript.

The Talmud (Tractate Avoda Zara 18a) tells of how, “wrapped in a Torah scroll, Hananiah ben Teradion was placed by the Romans on a pyre and set alight. His disciples asked: ‘Master, what do you see?’ He answered: ‘I see the parchment burning while the letters of the Law soar upward.’” So perhaps manuscripts don’t burn after all, only the vellum, the wooden boards, the metal furniture, the glue, the stitching, the gold leaf, the blind stamps, the embossing, and the dried ink do, while the sempiternal words themselves survive, floating in the air like thistledown, transmuted but ultimately, from a heavenly perspective, no worse for wear. Reb Nokhem Yanishker declared on the eve of the Shoah that “no one can annihilate letters. They have wings, and they fly around in the heights … into eternity,” while the Moldavian-born, British–American rabbi and antiquarian Solomon Schechter similarly posited that “the contents of the book go up to heaven like a soul.” A Yiddish saying puts it even more succinctly: “di bikher zenen geven lebedike nefoshes,” or “the books were living souls.” While I am inclined to agree, this is a matter for metaphysicians, and here in this earthly vale of tears, we are all too often forced to look on in sheer horror as vast swathes of our priceless heritage go up in smoke.

It is something of a miracle that we still possess as much as we do of our collective cultural patrimony — a testament to the strength of the social compact that ties the living to the dead and to the yet unborn — but the fragmentary vestiges of former ages that endure in our museums, libraries, and archives can only remind us of how much has been lost over the years. Authentic art should be indestructible and eternal, but all too often proves vulnerable and evanescent. The literary critic Charles Augustin Sainte-Beuve, writing amidst the revolutionary upheavals of 1848, cautioned that “nothing collapses more quickly than civilization during crises,” and that “lost in three weeks is the accomplishment of centuries.” As a matter of fact, the accomplishments of centuries can be lost in far less than three weeks. It takes only seconds for a manuscript to go up in flames, a painting to be doused with sulfuric acid, a sculpture to be pulverized by sledgehammer-wielding vandals, an ancient tomb to be bulldozed by zealots, or a museum to collapse under an artillery barrage.  

Our ongoing transition from an analog to a digital era has given rise to a sort of triumphalism vis-à-vis the second law of thermodynamics. Entropy is no more, and “the internet is forever,” we are informed, thanks to “digital permanence.” In the past, people put their faith in archives that could be set on fire, museums that could be looted, physical manuscripts that could be confiscated by a commissar and thrown into a furnace, oral traditions that could be snuffed out by a sudden campaign of genocide or a gradual process of deracination. Now information can live forever, safe and sound in a distant server room, or suspended somewhere in cloud storage. This excessive exultation, however, is beginning to look like hubris. Recall the philosopher Nicolás Gómez Dávila’s warning that “the one constant in every technological enterprise is its curve of success: rapid initial rise, subsequent horizontal line, gradual fall until unsuspected depths of failure.” Digital permanence is giving way to digital decay.

Recent weeks have seen Google announce changes to its inactive account policies, presaging the deletion of content and data for accounts that have not been used in two years, while Twitter has followed suit, warning that dormant accounts will similarly be deleted. The image hosting service Imgur has made it known that “old, unused, and inactive content that is not tied to a user account” will be regularly purged going forward. Most of what will be lost is of very little value, but one can easily imagine a situation in which, say, a Syrian human rights activist documents the horrors of the Syrian Civil War, leaves a considerable body of work on Twitter, YouTube, or other social media sites, and then loses his or her life, only for all of that evidence to disappear forever after two years of account dormancy. But it costs money for these companies to carry the detritus and dead weight of the digital age, and they simply cannot afford to do it forever.

G. K. Chesterton, in The Man Who Was Thursday: A Nightmare, very helpfully provided his readers with “the secret of the whole world,” which is that “we have only known the back of the world. We see everything from behind, and it looks brutal. That is not a tree, but the back of a tree. That is not a cloud, but the back of a cloud. Cannot you see that everything is stooping and hiding a face? If we could only get round in front.” The internet is precisely the opposite. We only see the front of things. The average end user of Apple Music just wants to listen to hi-res lossless audio tracks, the average end user of Instagram just wants access to the 50 billion photographs that have been uploaded to the platform over the years, the average user of ChatGPT just wants to interact with nascent, and admittedly somewhat janky, artificial intelligence. What is going on behind the scenes is of little interest, no more so than when the average consumer of electricity flips a switch, blissfully unaware of the intricacies and vagaries of resource extraction, energy production, and related infrastructure.

The cloud only sounds ethereal and insubstantial; in reality, it is deeply rooted in the physical world, and consumes a stupefying amount of natural resources. Data centers are particularly voracious, though it has been estimated that only 6–12 percent of the energy they require goes to active computational processes, the rest being spent on cooling the myriad servers and on maintaining the backup servers and backup generators necessary, in the interests of hyper-redundancy, to ensure the constant flow of information in case of unexpected outages. A single Google data center in Arizona — a state not known for its abundant water supply — will require between one and four million gallons of water per day to keep its servers at an equable temperature, and globally the company’s far-flung data centers may require as much as 15 billion liters of water per year. Tens of billions of dollars must be spent on server maintenance and depreciation, with Meta and Google struggling to extend the useful lives of their servers beyond five or six years. Technological advances require ever-increasing infrastructure and therefore expenditures. Dylan Patel and Afzal Ahmad, writing in SemiAnalysis, have calculated that “deploying current ChatGPT into every search done by Google would require 512,820 A100 HGX servers with a total of 4,102,568 A100 GPUs,” and that the “total cost of these servers and networking exceeds $100 billion of Capex [capital expenditure] alone.” These are all staggering numbers, and it is no wonder that tech companies are eager to separate the active wheat from the dormant chaff, if only to lessen the burden on our overheating, insatiably thirsty servers. The result, however, will be an internet devoid of digital heritage content.
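For a sense of the scale involved, here is a back-of-the-envelope sketch (in Python, purely for illustration; the only assumption beyond the quoted figures is that the “exceeds $100 billion” estimate is treated as a floor):

    # Rough sanity check on the SemiAnalysis figures quoted above
    servers = 512_820            # A100 HGX servers
    gpus = 4_102_568             # A100 GPUs
    capex = 100_000_000_000      # "exceeds $100 billion" -- taken as a floor

    print(gpus / servers)              # ~8.0 GPUs per HGX server
    print(f"${capex / gpus:,.0f}")     # ~$24,375 per GPU
    print(f"${capex / servers:,.0f}")  # ~$195,000 per eight-GPU server

Even before electricity, water, and maintenance are counted, the hardware alone implies an outlay of roughly $195,000 per server.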

Making matters worse, digital data is just as susceptible to decay as physical media. Just as clay tablets can be crushed in an earthquake, manuscripts devoured by rats and booklice, and vinyl records rendered unplayable by dust, static, and sun exposure, so too can digital optical disc storage formats like CDs and DVDs become unusable due to disc rot, the result of the oxidization of reflective layers, the de-bonding of adhesives, UV damage, or reactions with various contaminants. In addition to disc rot, we have bit rot, also known as data rot or data decay, which entails the gradual but inexorable corruption of computer data, caused by the dispersal of electric charges in a bit of dynamic random-access memory, the leakage of electrical charges due to imperfect insulation, or the loss of magnetic orientation, not to mention other forms of hardware failure. And then we have software rot, the deterioration of software quality over time due to changes in the technological environment in which it resides. Gordon Wilson, the CEO of Rain AI, recently opined that “We have to give up immortality. We have to give up the idea that … we can save software, we can save the memory of the system after the hardware dies.” The internet, it turns out, is not forever after all.
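Archivists and digital preservationists guard against this sort of silent corruption with periodic fixity checks, comparing each file’s current checksum against the one recorded when it entered the archive. A minimal sketch of the idea in Python (the function names and manifest layout here are hypothetical, not any particular library’s API):

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Compute a file's SHA-256 digest, reading in 1 MB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def audit(archive: Path, manifest: dict[str, str]) -> list[str]:
        """Return files whose digests no longer match the manifest --
        the telltale signature of bit rot or other silent corruption."""
        return [
            name for name, recorded in manifest.items()
            if sha256_of(archive / name) != recorded
        ]

Any file flagged by such an audit can then be restored from a redundant copy; detection is no cure for decay, but it is at least a diagnosis.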

The advent of artificial intelligence has only served to complicate the parlous state of affairs in our digital habitat. Consider the curious case of Lucy Komisar’s March 13, 2023 article “Oscar-winning ‘Navalny’ documentary is packed with misinformation,” published in the Grayzone, a far-left weblog described by Bruce Bawer as “a one-stop propaganda shop, devoted largely to pushing a pro-Assad line on Syria, a pro-regime line on Venezuela, a pro-Putin line on Russia, and a pro-Hamas line on Israel and Palestine.” Life is far too short to dwell on the Grayzone, but Komisar’s article on Daniel Roher’s Oscar-winning Navalny documentary was certainly eye-opening, albeit unintentionally.

The author of the (since-deleted) piece proudly disclosed her use of artificial intelligence as a research assistant, in the form of Writesonic, which is described as an “AI writer, copywriting, and paraphrasing tool.” No harm in that, except when the artificial intelligence begins inventing evidence out of whole cloth. In the interests of transparency, Komisar uploaded her conversations with her digital research assistant, which proved to be of considerable interest. “What are links to articles about Alexei Navalny’s medical history before 2020? You indicated Guardian and Moscow Times,” she asked, to which the Writesonic AI chatbot responded: “The links to articles about Alexei Navalny’s medical history before 2020 are as follows: [1] https://www.theguardian.com/world/2014/feb/03/alexei-navalny-poisoning-russia-yulia-skripal [2] https://www.themoscowtimes.” Aric Toler of Bellingcat, among others, quickly noted that the first URL provided by the chatbot was unusual, and not only because it leads the reader to an error page on the Guardian’s website. The Russian dissident Alexei Navalny was poisoned with a Novichok nerve agent on Aug. 20, 2020, likely at the Xander Hotel in Tomsk, so it seems safe to assume that the Guardian did not run an article on that disturbing incident on Feb. 3, 2014, as the chatbot-provided URL would indicate. Other queries by Komisar — “What do critics say about Bellingcat?” and “What are examples of Bellingcat’s bias for western governments and against their enemies?” — also produced links to nonexistent, AI-fabricated articles (https://www.nytimes.com/2019/06/05/world/europe/bellingcat-ukraine-malaysia-airlines and https://www.theguardian.com/world/2016/dec/17/bellingcat-verification-prime-suspect-downing-mh17-shot-down-by-russ). Writesonic was only trying to be helpful, summoning up information to confirm what it perceived as its interlocutor’s priors, but in doing so it had literally created fake news.

This sort of behavior is not unusual for artificial intelligence–powered chatbots. OpenAI has admitted that ChatGPT “sometimes writes plausible-sounding but incorrect or nonsensical answers.” ChatGPT has even been known to rewrite the rules of chess midmatch in an effort to cheat its way to victory. Helping to research and write works of misinformation, replete with plausible-sounding but wholly fictitious sources, surely must be a step too far. At the same time that useful content from previous eras of the internet is disappearing down the memory hole, artificial intelligence is hard at work creating new, spurious content, either in the form of deepfakes or counterfeit research work product. The statistician and essayist Nassim Nicholas Taleb perhaps put it best: “ChatGPT is a statistical representation of things found on the web, which will increasingly include its own output (directly and second hand). You post something picked up from it & it will use it to reinforce its own knowledge. Progressively a self-licking lollipop. Enjoy.”

Whether you will truly enjoy a world encased in the eternal present, a world defined by AI-generated or procedurally generated content that might very well be completely fake, a world subjected to the whims of tech companies and the logic of planned obsolescence, a world in which aesthetics are subordinated to efficiency and technical functionality, remains to be seen. I prefer manuscripts, even if they burn, and will continue to set store by the value of tangible cultural heritage. Gómez Dávila once gloomily observed that “brief upheavals are enough to demolish the buildings of the spirit, while our natural corruption protects technological successes.” But our technological successes are also subject to natural corruption, in the form of data decay, bit rot, and software rot. All the more reason to shore up the buildings of the spirit while we still can.

Matthew Omolesky is a human rights lawyer and a researcher in the fields of cultural heritage preservation and law and anthropology. A Fellow of the Royal Anthropological Institute, he has been contributing to The American Spectator since 2006, as well as to publications including Quadrant, Lehrhaus, Europe2020, the European Journal of Archaeology, and Democratiya.