In an age when it is disturbingly familiar for news maps to place us on tenterhooks by grabbing our attention, the existential urgency of the blanket of icy arctic air over the continent was no exception. But if the images of the sudden entrance of frigid air shocked most states in the lower forty-eight, the longer the week of freezing cold wore on, the more one could see a clarion call for re-entry into the Paris Accords–as if the visualizers of meteorological disturbances at NOAA, newly liberated, were able to show the dangerous consequences of a tippy polar vortex and uncertain weather in an era of extreme climate change. Bright color ramps foregrounded falling temps in rich magenta or icy blue that ran almost off the charts, at the uppermost end of the spectrum in their duration or in how low the temperatures fell–maps that pushed the boundaries of expectations with urgency. As maps of the hours the nation was plunged below zero traced a purple cold front advancing all the way into the deep south, spreading across the continent from the north, the nation shivered under icy blues over the mid-February cold spell.
The chromatic intensity jarred with the familiar spectrum of meteorological maps to shock the viewer: the map challenged any reader to place the arrival of cold air and hours below freezing in a frame of reference, or to dismiss the incursion of icy air up to the US-Mexico border as an irregular occurrence rather than a harbinger of the cascading effects of extreme weather, let alone a warning of the limits of our national infrastructure to adjust to it. If the NOAA maps of the National Weather Service fulfilled their mandate by focussing on the territoriality of the United States, these images and the news maps made from them communicated a sense of national violation, if not of the injustice of the incursion of such unexpected freezing temperatures and Arctic air–as if it were an unplanned invasion of the lifestyle, expectations, energy policy, and even the electric grid of the United States, oddly affirming an American exceptionalism of territory and climate, as if the meteorological disturbances that confounded predictions were not part of a global climatic change.
And in the maps of the fall in national temperatures, as in the header to this post, the news that the nation’s frozen core had spread south toward the southwest, almost reaching the border, seemed to shift our eyes from a border mapped and remapped as permeable to migration to a map of unpreparedness for climate change. It almost echoed the systemic denial of climate change that has been a virtual pillar of the Trump Presidency, on the eve of Donald Trump’s permanent relocation to Mar-a-Lago–one of the last areas of the nation not hit by the subzero temperature anomalies that spread across north Texas, Oklahoma, Kansas, Missouri, New Mexico, and Iowa, plunging the many states we thought of as “red” during the past election into an icy deep blue interior in mid-February, down to the Gulf Coast. It was as if the colors marked a national crisis not of its own making for a region that had obsequiously voted Republican, withdrawn from the Paris Accords, and left the warmer temperatures only in the state where Donald Trump now resided at Mar-a-Lago.
As the week of arctic air’s arrival wore on, the newspaper of record glossed it with a color ramp of the low temperatures few residents of southern states expected as they were plunged into subzero surroundings. The color ramp chosen to chart how gelid air poured in, setting off a cascade of events and disasters, nicely demonstrated the cascading effects of climate change on the nation, as the shock of low temperatures sucked national attention away from the border and begged one to come to terms with the challenge of climate emergencies in global terms. The frozen core of the nation was a wake-up call.
The entrance of gelid air from a polar vortex poured across much of the midwest in unrelenting fashion. Plunging subzero temps hit the Texas coast, overloading electric grids and shocking weather maps that seemed out of whack even for mid-February, as even the sunbelt of the southwest turned gelid over a week of subzero temperatures that plunged the arctic neckline down into Texas, and almost across the southwestern border.
The shock of this map is its dissonance, of course, from the weather maps we are used to seeing: the entire nation, in mid-February, almost blanketed by the deep blue cold of subzero temperatures, extending wispy breezes into Utah and Arizona as well as across Georgia, South Carolina, and Virginia, leaving only SoCal and Florida pleasantly warm. The national composite that forecast a deep freeze running right down the center of the United States and spreading to both coasts at northern latitudes gave the nation a frozen core at the end of a hotly contested election. It seemed a wake-up call to attend to the long-term as well as immediate dangers of climate change, but it made it difficult to disentangle the global issues from the existential question facing millions in Texas and other states who, left without heating, faced dangerously cold and unprecedented subzero temperatures without clues about where to keep warm.
The impact of climate change has rarely been so directly placed on the front burner of national security–climate change deniers have preferred to naturalize polar melting by removing it from human agency, attributing shifting temperatures to sunspot activity or invoking longue durée theories of geological time enough to make the noted paleontologist Stephen Jay Gould turn over in his grave. Doing so has stoked a devious confusion between local and global, immediate and long-term, that is bound to be increasingly with us in an era of extreme climate change. The sudden entrance into our borders of such gelid air is an effect of global warming. We are losing our beaches, and cities like Galveston TX, Atlantic City NJ, Miami Beach FL, Key West, and Hilton Head SC are not alone in falling into the sea, destined to lie mostly underwater in a hundred years. As Ron Johnson assured us that “Greenland” derived its name from the green, leafy, bucolic forests of the continent–“There’s a reason Greenland was called ‘Greenland’–it was actually green at one time [even if] it’s a whole lot whiter now”–as if the truth about deep time were concealed by those overly alarmed ice shelves falling into the Atlantic, shifting ocean salinity with a sudden injection of freshwater that may alter the Gulf Stream, we were invited to contemplate the fierce urgency of now.
Perhaps the whole question of a span of time, as much as the theoretical proposition of global warming, was a concern. For we are as a country already looking forward with apprehension at maps of the economic costs of flood damage to residences, amidst the anxiety-charged year of the COVID-19 pandemic, with multiple variants now on the loose, to prepare for the escalating costs of climate change across the country, and not only on the coasts.
If Louisiana and California coastal cities seem destined to stand the greatest risk of damage to residences–both due to the high valuation of California’s coastal properties and the danger of hurricane damage across the Gulf Coast–the increased risk that residences alone face bodes serious economic losses across the United States. Yet as risk rises and brings with it escalated insurance rates, we stand to see a cascade of economic losses of the sort we have not come to terms with in imagining the fanciful image of a time when Greenland enjoyed lush forests in the past–a scenario that never happened, inventive etymologies aside–although it may soon host plant life as it loses its permafrost.
While we are increasingly deadened by data visualizations that track the infectious spread of COVID-19 across the world and country, their logic has often been implicit. As much as tracking real-time data of deaths and “hot-spots” in the world and the nation, we trust the data viz to orient us to the infectious landscape to better understand viral spread. We seek to grasp the nature of the virus’ transmission, and perhaps hope that we can better anticipate its spread. We depend on these daily updates to retain a sense of agency in the chaos, but realize that they are provisional, contingent, and selective snapshots, based on testing, and exist at a time delay from the virus’ actual distribution–without much predictive value. We maddeningly realize they are dependent on testing rates and reporting, and only as good as the datasets they re-present.
On the heels of a 5% statewide positivity rate on December 5, 2020, California was declared in a state of shutdown in all its counties. It almost seems that such graphics have started to fail us, as the spread of the virus overflows the boundaries of the map and permeates its space. The choropleth renders individual counties all but indistinct, the state drowned in widespread infections, with only a few of its less populated regions as refuges. With a flood of purple overflowing the coastal counties, the delta, the Central Valley, and the entire south of the state, was there even any point in mapping the danger of viral spread beyond a state of red alert?
While mapping offers little comfort in an era saturated with heightened risk, the color-coded alerts notify inhabitants of the danger of increased stresses on the public health system–as much as visualization challenges us to translate tools of data aggregation into images of the pandemic, as the rates of December 6 grew by December 19. As we shift to map a decreasingly multi-colored state by moderate, substantial, and widespread tiers of the virus, and widespread cases seem to flood the state, the map offers the security of some sort of monitoring of the pandemic’s spatial spread.
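The tiered color-coding behind such a choropleth is, at bottom, a simple binning of county case rates. A minimal sketch, assuming thresholds that only approximate the state’s published tier framework (the cutoffs and county figures below are illustrative, not official data):

```python
# Hedged sketch: binning county case rates into color-coded tiers like those
# described above ("moderate," "substantial," "widespread"). Thresholds and
# county figures are illustrative stand-ins, not official state values.

def tier(daily_cases_per_100k: float) -> str:
    """Assign a county to a tier by its daily new cases per 100,000 residents."""
    if daily_cases_per_100k < 4:
        return "moderate"      # rendered orange on the state map
    elif daily_cases_per_100k < 7:
        return "substantial"   # rendered red
    else:
        return "widespread"    # rendered purple -- the "sea of purple"

# Illustrative December-surge figures for three hypothetical counties:
counties = {"Alpine": 2.5, "Fresno": 38.0, "Los Angeles": 96.0}
print({name: tier(rate) for name, rate in counties.items()})
```

When nearly every county’s rate sits far above the top threshold, as in the surge the post describes, the binning collapses the map into a single color–which is exactly why the choropleth stops discriminating.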
The sea of purple is like Spinal Tap raising the volume “up to eleven,” a sign that we are in unexplored territory that won’t be accommodated by a simple color ramp–or, indeed, by a familiar cartographic iconography among our current tools of styling space. While we are used to maps embodying meaning, what the colors of the map embody–beyond risk–is unknown. To be sure, at a time when fatalities from the coronavirus in the south of the state have skyrocketed since the middle of the month, hitting records in ways terrible to even contemplate, the field of purple is a deeply human story of loss, as a surge of hospitalizations has flooded the entire healthcare community and stretched facilities of critical care beyond anything we have known, filling half of the intensive care beds in LA County at Christmas 2020–enough to make it hard to feel any relief at the close of a calendar year, as if that unit still held any meaning, and to feel very grim about 2021. And while the CDC allowed that a new, more contagious strain might already be in the nation two days before Christmas, the arrival of that strain in California and Colorado increased alarm before New Year’s.
How to get a handle on the novel coronavirus that we have been pressing against COVID-19 dashboards since March to grasp, and will we be able to do so in 2021?
Whatever sense of agency the maps impart, it is only as good as the compromised sense of agency we expect in an era of geolocation, on which most maps of reported infections depend. Even as we face the rather grim warning that, in the Bay Area, rates of immunization face steep obstacles of vaccine distribution–the pragmatics of required freezer space, the training of extra health care workers, and the monitoring and tracking of the two-stage process of vaccination–we will depend for public sanity on maintaining clear communication in maps. The actual tracking of the novel coronavirus doesn’t translate that well to a state-wide model, or a choropleth, although that is the method for public health advisories that makes most sense: we do not have small-scale public health supervision in most of the nation, although it exists in some counties. The stressed Departments of Public Health in many areas are without resources to manage COVID-19 outbreaks, public health compliance, or penalties for public health violations: the effort to create public health councils to manage compliance and violations of public health orders may be seen by some as an unneeded bureaucracy, but it would give local governments resilience in dealing with an expanding epidemic, at the same time as governmental budgets are stressed and no body of law about COVID violations exists.
Rather than map on a national or state-wide level, we can best gain a sense of how much virus is out there and how it moves through attempts at contact tracing–even if increasing rates of infection may have gone beyond the effectiveness of contact tracing as a methodology, one never quickly adopted by a federal government that prioritized the rush to a vaccine. The basis for such a map in LA county can reveal the broad networks of contagion, often starting in small indoor gatherings across the region and moving along networks of spatial mobility across the city and the San Fernando Valley, and can help embody the disease’s vectors of transmission as we watch mortality tallies on dashboards that give us little sense of agency before rising real-time tolls.
If such ESRI maps suggest a masterful data tracing and compilation project, the data is large, but the format is a glorification of the hand-drawn maps of transmission that led to a better understanding of the progress of Ebola on the ground in 2014, used by rural clinics in West African countries like Liberia and Sierra Leone to stop the infectious disease’s transmission and monitor the progress of contagion to limit it–as well as to involve community members in the response to the virus’ deadly spread.
We may have lost an opportunity for the sort of learning experience that would be most critical to mitigate viral spread in the United States, as no similar public educational outreach was adopted–and schools, which might have provided an important network for diffusing health advisories to families, shifted predominantly to distance learning and providing meals. We became painfully aware of the lack of a health infrastructure across America, as many openly resisted orders to mask or to remain indoors, seeing them as unsubstantiated restrictions of liberty rather than necessary measures.
We are beyond contact tracing, however, and struggling with a level of contagion that has increased dramatically with far more indoor common spaces and geographic mobility. Yet the broad public health alerts that these “news maps” of viral spread offer readers omit, or perhaps ignore, the terrifying mechanics of its spread, and the indoor spaces in which we know the virus is predominantly acquired. The rise of newly infectious mutated strains of the novel coronavirus was in a sense inevitable, but the rising tension of what this means for the geographical distribution of the coronavirus and the danger it poses to our public health system is hard to map, and we take refuge in the mitigation strategies we can follow.
Why were we not more vigilant earlier about the many indoor spaces in which the virus circulates? It bears noting that the spread of the virus in the state was undoubtedly intensified by the over one hundred deaths and 10,000 cases of infection that spread through the density of a carceral network–an archipelago incubating viral infections in the state. We only recently mapped the extent of viral spread across nineteen state prisons by late December 2020, tracked by the Los Angeles Times but often omitted from public health alerts–
–and the density of long-term care centers of assisted living across the state, which were so tragically long centers of danger of viral spread, as the New York Times and Mapbox alerted us, as the extreme vulnerability of elder residents of nursing homes, skilled nursing facilities, retirement homes, assisted-living facilities, and residential care homes who cannot live alone was noted across the world. The spread of infections among those often living with chronic medical conditions–data that the greyed-out states did not provide–was not surprising, epidemiologically, but terrifying in its escalation of the medical vulnerability of already compromised and vulnerable populations, and in the steep challenges that the virus posed.
These facilities–unlike the greyed-out states that fail to release data on deaths linked to COVID-19 infections–congregate on the California coast: while the New York Times depicted point-based data of the over 100,000 COVID-related deaths in nursing homes, a small but significant share of COVID deaths, the exposure of populations with an extraordinarily high probability of multiple co-morbidities probably represents only a fraction of infections.
We strain to find metrics to map the risk-multipliers that might be integrated into our models of infectious spread. It seems telling to try to pin the new wave of infections in a state like California to increased contact after Thanksgiving–a collective failure of letting up on the social distancing in place since March–as the basis for a post-Thanksgiving boom in many regions of the state, using purely the spatial metrics of geolocation that are most easily aggregated from cell phone data in the pointillist tracking of individual infections in aggregate.
Based on cell-phone data of geolocation–a proxy for mobility or social clustering that offered a metric to track Americans’ social proximity and geographical mobility, including at shopping centers, oceanside walks in open spaces, take-out food curbside pickups, outdoor meals, and highway travel–few counties curbed aggregation as one might hope, although the fifty-foot metric accepts the many outdoor congregations that occurred, well within the Cuebiq metric, with or without masks. A magenta California registered pronounced proximity, grosso modo, discounting any mindful innovative strategies in the state.
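The core of such a proximity metric is mechanically simple: count pairs of device pings that come within fifty feet of one another. A minimal sketch, assuming invented sample coordinates and a standard haversine distance (not Cuebiq’s actual pipeline or data):

```python
import math

# Hedged sketch of a fifty-foot proximity metric like the one described:
# count distinct pairs of devices observed within fifty feet of each other.
# The haversine formula is standard; the device ids and pings are invented.

FIFTY_FEET_M = 15.24  # fifty feet expressed in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def close_encounters(pings):
    """Count distinct device pairs observed within fifty feet."""
    pairs = set()
    for i, (id1, lat1, lon1) in enumerate(pings):
        for id2, lat2, lon2 in pings[i + 1:]:
            if id1 != id2 and haversine_m(lat1, lon1, lat2, lon2) <= FIFTY_FEET_M:
                pairs.add(tuple(sorted((id1, id2))))
    return len(pairs)

# Devices "a" and "b" are a few meters apart; "c" is kilometers away.
pings = [("a", 37.7749, -122.4194), ("b", 37.77491, -122.41941), ("c", 37.80, -122.40)]
print(close_encounters(pings))
```

The limits the post notes fall out of the code: the metric sees only coordinates, so a masked outdoor passing and an unmasked indoor gathering register identically if both fall within fifty feet.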
It is stunning to have a national metric for voluntary mobility, rough as it is, to measure the internalization of social distancing protocols and the potential danger of a post-holiday COVID-19 bump. To be sure, we are stunned by geolocation tools that aggregate, but we risk neglecting the deeper infrastructures that undergird transmission, including forced immobility. While geolocation tools offer opportunities for collective aggregation whose appeal has deep historical antecedents in measuring contagion and anticipating viral transmission by vectors of spatial proximity, geospatial tools create a geolocation loop in visualizations which, however “informative,” perpetuate a spatiality that may not clearly overlap with the actual spatiality of viral transmission.
Even in mid- to late March, we demanded maps of where the novel coronavirus had created “hot-spots,” as if processing the enormity of a scale we didn’t know how to map, aggregating data without a sense of scale.
Across the summer, it seemed best to continue to map daily numbers of cases, relying on whatever CDC or hospital data from Hopkins we had, trying to aggregate the effects of a virus spreading across a country whose government seemed to provide little economic or medical plan, in maps that tallied the emergence of new cases as new hotspots appeared across the land, meriting attention that was difficult to direct.
We are plotting infections and mortality with abandon in a steady diet of data visualizations that purport to grasp disease spread–numbers once weighted predominantly to the New York area, even as cases spread throughout the nation by the end of March, remaining in the thousands at that point, a low threshold by which we were nonetheless impressed. The tracking of the local incidence of reported cases seemed to offer a grasp of the meaning of transmission, with a pinpoint accuracy that was reassuring, even if we had no way to understand the contagion and no effective strategy to contain it. But we boasted data visualizations to do so, focussing on the nation as if to contain the spread in antiquatedly national terms for a global pandemic, not mapping networks of infection but almost struggling to process the data itself.
After all, the cholera maps of John Snow are the modern exemplars foregrounded in data visualization courses as game-changing images: a convincingly precise picture of infection progressing from a water pump in a London neighborhood is often seen as a gold standard of the social efficacy of data visualization and information display–the elevation of the pinpoint mapping of cholera mortality in relation to the pump from which the deadly pathogen was transmitted in a nineteenth-century London neighborhood:
The Snow Map so successfully embodies a convincing image of contagion that it has taken on a life of its own in data vis courses, almost fetishized as a triumphant use of the plotting that precisely geolocated mortality statistics allow, which can indeed be transposed onto a map of the land to reveal the clustering of death rates around the infamous Broad St. pump, creating a legible vector of the transmission of disease in the Soho neighborhood–so convincing as to be touted as a monument of the data sciences.
Snow is lauded for having effectively shown, in ways that changed scientific practices of collective observation and public health, that rather than being communicated by miasmatic infections spreading to low-lying London from the Thames, mortality rates could gain a legibility in proximity to a pump that transmitted a water-borne pathogen–often presented as a conceptual leap of Copernican proportions (which it was, when contrasted to a miasma emanating from the Thames to low-lying areas, even if it only anticipated a bacteriological understanding of disease transmission). The association of danger with the water procured on errands from neighborhood pumps replaced the noxious vapors of a polluted river, as in earlier visualizations of the miasma that lifted the fumes of the polluted Thames to unfortunate low-lying urban neighborhoods, condemned by urban topography to concentrations of death of more striking proportions and scale than had been seen in collective memory.
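The logic of Snow’s argument can be restated as a nearest-pump assignment: attribute each recorded death to the closest pump and see where mortality clusters. A minimal sketch, using invented coordinates on an arbitrary grid rather than Snow’s actual survey data:

```python
# Hedged sketch of the reasoning behind Snow's map: assign each death to its
# nearest pump and tally the clusters. Pump positions and death locations are
# invented stand-ins on an arbitrary grid, not Snow's 1854 data.

def nearest_pump(death, pumps):
    """Return the name of the pump closest to a death location."""
    x, y = death
    return min(pumps, key=lambda name: (pumps[name][0] - x) ** 2 + (pumps[name][1] - y) ** 2)

pumps = {"Broad St": (0.0, 0.0), "Rupert St": (5.0, 0.0), "Marlborough": (0.0, 5.0)}
deaths = [(0.2, 0.1), (-0.3, 0.4), (0.1, -0.2), (4.8, 0.3), (0.5, 0.5)]

tally = {}
for d in deaths:
    p = nearest_pump(d, pumps)
    tally[p] = tally.get(p, 0) + 1
print(tally)  # most deaths fall nearest the Broad St pump
```

Partitioning the neighborhood by nearest pump is, in effect, a Voronoi diagram of the pumps, and the lopsided tally around one cell is what made Snow’s visual argument legible.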
Snow made his argument through data visualizations to convince audiences, but he mapped with a theory of contagion. And if Snow’s map works on how the disease was communicated in outdoor spaces–and how a single site of transmission can provide a single focus for the aggregation of mortality cases–COVID-19 is an infection most commonly contracted in indoor spaces, shared airspace, and the recycled unfiltered air of close quarters. The indoor spaces where COVID-19 appears to be most often transmitted stand at odds with contraction in the outdoor common spaces of the street and the service areas of water pumps that create the clear spatial foci of Snow’s map, and of the recent remapping by Leah Meisterlin that seeks to illuminate the urban spaces of the contraction of cholera in a digital visualization as a question of intersecting spatialities.
Current visualization tools compellingly cluster a clear majority of cholera deaths in proximity to a publicly accessible pump where residents drew water contaminated by the cholera bacterium. But we lack, at this point, a similarly convincing theory of the transmission of the novel coronavirus SARS-CoV-2.
But the logic of COVID-19’s communication is nowhere so crisp, and it is difficult to translate to a register that primarily privileges spatial contiguity and proximity–it is not a locally born disease, but a virus that mutates locally across a global space: a pandemic. And although contact-tracing provides a crucial means of trying to track in aggregate who was exposed to infection, we lack any similarly clear theory to metaphorically grasp the contagion–and are increasingly becoming aware of the central role of its mutation into a virus both more infectious and faster-spreading in confronting the expansive waves of infection in the United States–as if an escalated virulence grew globally in the first months of this rapidly globalized pandemic.
Our dashboards adopted new versions of web Mercator, perhaps unhelpfully, to offer some security in relation to the nature of viral spread, which served as a way of affirming its truly global scope–
–but also suggested that the global traffic of the virus demands its own genomic map, as the virus migrates globally outside strictly spatial indices of global coverage, and that spatial indices were perhaps not the best, in the end, for accounting for a virus that had begun to develop clear variants, if not to mutate as scarily as many feared into a more virulent form.
And it may be that the classification of viral strains by genomic variability demands its own map: for as we learn that genomic mutation and variation closely determine and affect etiology, the communication of viral strains offers a yet clearer illustration that globalization articulates any point in terrestrial space to a global network, placing it in increased proximity to arbitrary points not visible on a simple map, and triggering its own world-wide network of markedly different infectiousness or virulence.
From December 4, 2019, indeed, we could track emergent variants of the virus best outside of a spatial scale, even as it reminded us that the very mobility of individuals across space increased the speed and stakes of viral contagion, and the difficulty of containing viral spread, in an interconnected world where viral variation recalled a flight map, a set of trade routes, or a map of the flow of financial traffic or even of arms. Mutations were understood to travel worldwide with a globalism for which a spatial map might be the background but was indeed far removed, as we moved beyond questions of contact tracing to define different scales of genomic mutation and modification that we could trace by mutation, not only by the actual places where the virus had arrived.
Were place and space in fact less important in communicating the nature of COVID-19’s increasing virulence?
The maps of genomic variation traced not only the globalization of the virus, but its shifting character, and perhaps etiology, across some thirty variants by late April, showing both the global spread of the virus and the distinct domination of select strains at certain locations, in ways that let researchers later theorize the ability to “track” mutations with increasing precision. If researchers in Bologna defined six different variants of coronavirus from almost 50,000 genomes that had been mapped globally in laboratory settings–variants whose signatures showed little more variability than strains of the flu in June–variations of signatures seemed a way to map the speed of a coronavirus that had traveled globally by February 2020 to the lungs of the late Franco Orlandi, an eighty-three year old retired truck driver from Nembro, Italy, whose family could not place China on a map when, following diagnostic protocol, attendant physicians in Bergamo asked if Orlandi had, by chance, happened to have traveled to China recently.
Despite the lack, thankfully, of serious mutation, the data science of genomic sequencing has shown that the SARS-CoV-2 genome of just under 30,000 nucleotides has experienced over 353,000 mutation events over time, creating a difficult standard for translating transmission into equivalent hot spots: some hot spots of some mutations are far more “hot” than others. And if we have tried to plot infections and mortality onto race, sex, and age, mortality most strikingly correlates to co-morbidities, even as all co-morbidities are themselves also indicators of mortality risk. While the mutations have suggested transmission networks, has the presence of different levels of mutation also constantly altered the landscape of viral transmission?
It makes sense that the viral variant was tracked in Great Britain, the vanguard of genomic sequencing of the novel coronavirus as a result not only of laboratory practices but of the embedded nature of research in the National Health Service and the monitoring of public health and health care. Enabled by a robust program of testing, England boasts half of the roughly 150,000 coronavirus genomes sequenced globally. Rather than being the site of the mutations, Britain was as a result the site where the first viral variant was recognized and documented, allowing Eric Volz and Neil Ferguson of Imperial College London to examine nearly 2,000 genomes of the variant they judged to be roughly 50% more transmissible than other coronavirus variants, magnifying the danger of contagious spread in ways feared to unroll on our dashboards in the coming months. As teams at the London School of Hygiene and Tropical Medicine studied the variant in late 2020 in southeast England, they found it to be 56% more transmissible than other variants, raising fears of further mutations in ways that rendered any map we had even more unstable.
The virus SARS-CoV-2 can be expected to mutate regularly and often. While England boasts about half of all global genomic data on the virus, of the 17 million cases of SARS-CoV-2 infection in the United States, only 51,000 cases were sequenced–and the failure to prioritize viral sequencing in America has exposed the nation to vulnerabilities. Although California had sequenced 5-10,000 genomes of novel coronavirus samples a day by December, and Houston’s Methodist Hospital has mapped 15,000 sequences as it watches for new viral variants, an American task force on viral variants will only be rolled out early in 2021, as the discovery of viral mutations has spread across five states in the western, eastern, and northwestern United States. While it is not clear that the viral variant or mutations would be less susceptible to polyclonal vaccines, most believe variants will emerge that evade vaccine-induced immunity.
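The disparity can be checked with back-of-envelope arithmetic using only the figures given above: roughly 150,000 genomes sequenced globally, about half of them from England, against 51,000 sequenced out of 17 million known US infections.

```python
# Back-of-envelope check of the sequencing disparity, using only the figures
# cited in the text (~150,000 genomes globally, ~half from England; 51,000
# US genomes sequenced out of 17 million known US infections).

global_genomes = 150_000
uk_genomes = 0.5 * global_genomes        # roughly 75,000 genomes
us_sequenced, us_cases = 51_000, 17_000_000

us_rate = us_sequenced / us_cases        # fraction of known US cases sequenced
print(f"England: ~{uk_genomes:,.0f} genomes")
print(f"US sequencing rate: {us_rate:.2%} of known infections")
```

At that rate, only about three in every thousand confirmed American infections were sequenced, which is why a variant could circulate for weeks before appearing on any genomic map of the country.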
While I was phone banking in Texas, Nevada, and other states in the months before the 2020 election, I fielded a surprising number of questions about access to absentee ballots and mail-in voting, and was assured by many voters that they had refrained from mailing in ballots and were planning to drop their ballots off directly at polling stations, or brave the lines, to ensure their votes counted. I’d like to think they did. (The woman I reached in Texas who had moved from Nevada and was awaiting an absentee ballot two days before the election, past the deadline for registering in Texas, may have not.) Even as we advance through “Trump’s final days of rage and denial,” and charges of fraudulence and the robbery of red states from the Grand Old Party’s self-appointed King haunt public White House pronouncements and social media posts, the electoral map that provides the formal reduction of how votes were tallied is cast as contested ground–questioned on the basis of voting machines, absentee ballots, and socially distanced voting practices, as if these inherently distanced the franchise and undermined democratic practice. Donald Trump invites the nation to squint at the map, examine its mediated nature and instability, and query the resolution of the election as, shockingly, only a handful of congressional Republicans admit he lost a month after voters cast seven million more votes for his opponent–whose victory 88% of Republicans in Congress refuse to acknowledge.
Unlike other elections, for a month after Election Day–November 3, 2020–the nation waited in eerie limbo, uncertain about the legitimacy of the election, so that even by December 2 CNN was still projecting victors in several “swing” states. The New York Times and AP projected the conclusion of the election on paper, announcing late-arriving news of electoral victory almost a full week after Election Day, seeking to invest a sense of conclusion in protracted debates–if oddly channeling “Dewey Defeats Truman.”
The inset map still indicated three states “not called.” But the new President-Elect appeared boosted by the classic alliance of Democratic voters that Donald Trump saw as unlikely, and that had failed to align in 2016.
Months after Election Day, CNN was still “projecting” Biden’s surpassing the electoral-vote threshold of 270, shifting two midwestern states and one southern state, along with Arizona, to the Democratic column: verification was delayed by a range of legal gambits still being pursued by the Trump campaign, which raised over $170 million through the Save America PAC to press its case for recounts and investigations into allegations of voter fraud, stubbornly refusing to admit the validity of the electoral map, and even repeating, into December, hopes that an opening for a Trump victory might materialize if one state selected electors, as if solving a puzzle that would reassemble the swath of red that flooded the national map back in 2016: “If we win Georgia, everything falls in place!” The electoral map was something of an idol of the Republican Party that, even as Donald Trump’s hopes for electoral victory faded, refused to recede into mid-December.
Weeks after Election Day, we entered a weirdly protracted attempt to game the electoral map, long after the initial tallying of votes had ceased. A range of recounts, hand-counts, investigations of absentee ballots, and even querying of the legitimacy of voting machines were launched to challenge the representational validity of the electoral map, in ways that should give us pause for how they aimed to undermine the representational value of voting practices. In querying the functions of the map as representation–by querying the tabulation of votes that comprise the electoral map–Trump stoked tensions in representational democracy. With unsettling abandon, he refused to acknowledge he did not win the election, as if determined to break with Presidential decorum for a final time and to leave a legacy of disruption in his wake.
To be sure, gaming the electoral college emerged as a recognized campaign strategy in 2020, increasingly distancing the franchise of the nation, as campaigns focussed with assiduity not on “swing state” voters, as in the past, but on flipping or holding a slate of states, leaving the electoral map rendered as a sort of jigsaw puzzle that would add up to 270 votes in the electoral college, as the Wall Street Journal reminded us by mapping the Republican “game plan” Donald Trump long knew he faced for holding onto the states where often slim majorities had put him in office, as Democrats aimed to flip states to their column. The rhetoric of “gaming” the map to create the victorious outcome was echoed in the news cycle–and not only in the Journal–in ways that seem to have dictated the distribution of the public rallies Donald Trump held long before announcing his candidacy officially, almost as soon as he entered office, in an attempt to solidify with his political charisma the bonds of the red expanse he celebrated as America’s heartland.
Trump may have wished he had not taken the southern states so much for granted, but he had targeted Pennsylvania, Florida, and Montana–as well as Arizona and Nevada–by staging rallies in those pre-COVID years, as if to shore up his support by investing in the electoral votes of 2020.
If that map from National Public Radio, based on Cook’s Political Report and the White House, only takes us through 2019, the campaign stops of Biden and Trump show a density in Pennsylvania, Michigan, Wisconsin, Florida, and North Carolina that suggests the depth of commitment to the gaming of the electoral map, and a deep battle in Arizona between the population centers in Phoenix and its suburbs and more rural regions.
The metaphor of “gaming” the map was hard to stop, and its logic seems to have led inevitably to the endless endgame that may result in clogging the nation’s courts with suits about the circumstances of mail-in voting in multiple states. Trump’s insistence on claiming the election was not “over,” as if unfamiliar with someone else setting the parameters for television attention, speechless at the unfolding of a narrative shattering his conviction that he could not lose–that “in the end, I always win“–reflects more than the deepest reluctance to admit losing.
The logic of the gaming of the electoral map clearly has him and his campaign in its sway. The deeply personal sense of the election as a referendum on him and his family may have been rooted in a sense of the legal difficulties that his loss might pose: among the many emails sent to his base, pleading for campaign donations to the “Save America” PAC, which seemed the last line of defense to “Make America Great Again,” supporters were begged to do their part in “DEFENDING THE ELECTION” and warned lest they had “ignored Team Trump, Eric, Lara, Don, the Vice President AND you’ve even ignored the President of the United States,” given how much was on the line. The sense of impending alarm reminds us of the confidence that Trump lodged in preserving the red electoral map of 2016, a confidence that seemed almost born of his ability to game the electoral map yet again and overcome the polls, even after pollsters had tried to recalibrate their predictive strategies and demographic parsing of the body politic.
1. The very close voting margins suggest we narrowly escaped an alternative history of a second Trump term, and can explain the tenacious grip that Trump seems to have had on an alternative outcome–an outcome that he tried to game in multiple ways, in strategies that eerily echo the gaming of the electoral map that seems to have occurred through the orchestration of telling postal delays, delayed returns of absentee ballots, and the strategic gaming of the distribution of a distanced franchise. It forces us to contemplate the counterfactual history of the far darker reality of a scenario where his expectations came true. Indeed, it should make us consider how close we came to overturning democracy. It was as if the timestamped electoral map of Saturday, November 7 that ran as an inset in the Times only encouraged resistance to admitting Trump’s failure to preserve the “red swath” of 2016 across what coastal elites long bracketed as “flyover country,” where the effects of economic recession had never stopped.
It had almost happened. In Trump’s White House, a boisterous watch party was underway, crowded with FOX anchors, the audience watching FOX’s results on the big screen, anticipating the reality of a second Trump term. But all of a sudden, Trump was so incredulous that he refused to admit seeing Arizona called at 11:20 as a Biden victory, shouting to no one in particular, “Get that result changed!” Hoping to calm her triggered boss, who must have been catapulted into alternate scenarios of having to leave the White House where he had expected to encamp, former FOX employee Hope Hicks fretted about the newsfeed.
Could the map be changed? Trump was frustrated at his inability to manipulate the news, and already apprehensive of what endgame was in store. At this point, it seems, Trump’s ever-ready servile son-in-law, Jared Kushner, hurriedly placed a direct call to Rupert Murdoch to rectify the call, assuring him that better data would arrive from Arizona’s COVID-denying governor, Doug Ducey (R), to restore the state’s redness on the electoral map, in desperate hopes of jury-rigging his electoral fortunes. Back in 2016, Trump had won Arizona only by the narrowest of margins–about half the margin by which Romney won in 2012–and only the popularity of third-party candidates concealed that Democrats had boosted margins of victory in more precincts than Republicans, flipping seventy precincts to their column–perhaps because Maricopa County featured a PAC that attracted millions of dollars to defeat Sheriff Joe Arpaio’s bid to consolidate an anti-immigrant agenda.
Trump quickly recognized the danger a flipped state posed to hopes for another red swath: the contestation over a state he had hoped to keep from the Democratic map was a poor omen for the election, and needed to be stayed.
In 2017, Trump was so enamored of the expanse of his electoral victory that he gave paper copies to White House visitors–until he framed a version for the West Wing, five months after the election. And if the state is visibly fragmented in an identical mosaic in the map that Trump framed in the White House, the brilliant red of nearby Nevada and the bright red diagonal suggest the state was more firmly in Republican hands than we might remember. After hoping that The Washington Post might celebrate his hundredth day in office by featuring the “impressive” electoral map on its front page, his pride led him to frame the map in the West Wing, as a reporter from One America News Network obligingly showed.
This alternate world of electoral victory created what must have been a prominent counterfactual map that dominated the Trump team’s plans for victory in 2020. The White House watch party must have been haunted by the very same map of which Trump was so proud.
The entry of data visualizations into the pitched narrative of the Presidential election is not new. If it is thought to begin in the collective unfolding of the election-night drama on television screens, as the casting of ballots–long understood as a collective action of union–has prompted a narrative of division, CNN now offers a model to intervene personally on one’s iPhone or Android, as if to ramp up agency on social media, inviting users to tap on their personal screens to build their own electoral map, perhaps to assuage heightened anxiety, granting the illusion of entering one’s own alternative future. Echoing the algorithmic thinking of tallying “pathways to victory” we had been following with increased desperation to exit the Trump Era, courtesy of FiveThirtyEight and others, we imagined scenarios of the electoral constellation that might prepare for the dawning of something like a new age. As different campaigns used maps to assert “multiple pathways to victory,” the statistical likelihood of a victory seemed to suspend agency, in ways that would come to haunt the nation in the aftermath of the election, as the tally of the vote was questioned in multiple ways, undermining the accuracy of the tally of individual votes and injecting, in the name of ‘transparency,’ a degree of suspicion dangerous to democracy.
The standard map of the United States became a model for the President’s personal lawyer to present “evidence” by appealing the vote, long after the votes were tabulated and a winner declared–a new form of aftermath for an election that we had never experienced. If the security of paper ballots was put into question by the “hanging chads” that demanded hand counts with observers back in 2000–a weeks-long battle that suspended any announcement of a victor in a divided nation, that demanded “optical evidence” of the will of voters by scrutinizing some 537 votes out of the entire nation in order to determine the victor of the electoral college, and that forestalled the celebrations of Democrats across the nation who expected that victory was at hand–the aftermath in 2020 became distilled in the contestation of an electoral map, the map that had come to mediate the election, as the President’s lawyer, looking like Frankenstein’s monster returned from the dead, declared the continued existence of “multiple pathways to victory”–the very phrase that Joseph R. Biden’s circle had used in predicting his victory. The announcement was not only post-truth but a dumbed-down version of the vote, delivered before multiple American flags, presenting the states the map labeled “red” that had voted for Biden as at base “red states,” inevitably destined to fall into President Trump’s column. The news conference, held at Republican National Committee headquarters on November 19, almost three weeks after the election, seemed to reclaim states’ electors as if they were enemy territory, as Trump’s legal team insisted that a spate of “irregularities in the voting system” had created numerous bases for serious fraudulence in the tallies of the voting process.
The made-for-TV moment, designed to circulate online as an iconic image, crystallized the post-truth debates about the actual results of the election–a basis for the myth of a “stolen victory” that would continue until the tragically violent insurrectionary invasion of the U.S. Capitol building on January 6, 2021. It was a readily recognized power play of seizing the electoral map from the networks, denying the media or television networks any role in making a prediction or declaring the victor, and gesturing to a selective distortion of the electoral map as if it were evidence of the true “map” of the election, as the image of five “battleground” electoral states that the Trump campaign announced as the basis of its campaign to Keep America Great, or Make It Great Again, focussed, in a new use of Cold War rhetoric, on removing an “outrageous iron curtain of censorship.”
The results of the Presidential election in these states were not particularly close, and did not recall the nail-biter of 2000, twenty years earlier, when the inspection of paper ballots by impartial judges provided an unplanned basis for showcasing the legal efforts that moved the election to the United States Supreme Court as a final arbiter.
But if it was hard to argue that the votes in either Michigan or Pennsylvania were not conclusive–without either a legal theory or a strategy to discard the existing tallies of the election, and without disenfranchising hundreds of thousands–the post-truth campaign posited a systematic lack of vigilance that let Democrats play rough and tumble with registered voters and enshrined voting practices, arguing that the norms of voting were systematically violated in the voting machines themselves–especially paperless touch-screen voting machines argued to be open to manipulation–as well as in the farming of ballots and the unreliability of mail-in voting practices. The proliferating instability posited for the tabulation of votes–the foundation of the democratic process–was argued to be inherently imperfect and corrupted at its root, suggesting the election was stolen. The argument made by a small Texas company–“Allied Security Operations Group”–demonized all software used in Smartmatic voting machines as designed by a corporation with ties to Venezuelan founders: the basis among staunchly conservative activists for pushing a sense of widespread voting fraud–perpetuated on Newsmax around Dominion voting machines–was launched not by experts but by a myth of fraudulence the Washington Post has tied to Texas businessman Russell J. Ramsland, Jr., which Trump advocates would adopt to discredit the outcomes of voting tallies already tabulated in battleground states.
The story of deep skepticism about the outcome of the election was in many ways nourished by the relative indeterminacy of possible outcomes for 2020, all of them hinging on battleground states that would push the electoral college one way or another. If the process seemed to remove the voting systems from the voters, the unfounded conspiracy theory Ramsland endorsed suggested the shaky foundations of the democratic institutions that were trotted out with readiness to defend the outcome of the election they sought–and seemed to find consolation in an iconic map that painted these “swing” states a uniform red. The accusations seem to have found particularly fertile ground on one side of a digital divide, increasingly skeptical of “irregularities” in the tabulation of votes without a paper trail, and ready to treat any questioning of faith in the electoral process as itself an erosion of democracy–a questioning that led to the belief that the hacking of voting machines lay at the basis of a pernicious electoral fraud which would overturn the election that a third of Americans still consider fraudulent. The cartographic affidavit was presented with the air of an undertaker after all courts had officially thrown out Trump’s case for voter fraud–even as new lawsuits proliferated, originating from Texas and filed by Texas Attorney General Ken Paxton (R) with members of the Trump campaign and seventeen Republican attorneys general, who seemed to sanction a new standard of electoral authenticity. The figure of Giuliani, unfortunately and unintentionally channeling the image of the Don Corleone characters of the five mafia families he had prosecuted when he worked for the Southern District of New York, offered an icon of Republican attempts to strong-arm an electoral map into submission.
We have rarely had so divisive a President as Donald Trump, who has sought to divide the country by race, region, religion, and income, and the hopes for emerging with a new vision of the union are slim–making the weight and meaning that rest on the map appear greater than ever. How it would spin out was unclear, but the red block that Trump had pulled together, to the considerable surprise of all political pundits, was promised to be chipped away at in multiple ways, sketched by so many algorithmic story maps as alternate “paths to victory.”
The array of paths each candidate faced–though we focussed on Biden’s range of options and winced at those of Trump–could be organized in what seems a rehearsal for the glossing of possible eventualities, as multiple data visualizations led to alternative futures like so many forking roads out of a dark, dark wood.
The hope to find coherence in the map seems greater than ever, as if it might finally purge the divides of the last four to six years. There was a grim sense of being defeated by the electoral map during the 2016 and 2020 elections, with the skewing of electoral votes to low-density rural states–skewed further by the increasing distance at which those local problems appear from Washington, DC. The configuration of the electors, like the configuration of federal representative government, is compromised by giving more pull to residents of many rural states and creating a red block that one can only hope to chip away at in the age of coronavirus by online donations, phone-banking, or, at this late stage, by imagining alternative futures and playing around with the map to see how the post-election endgame will play out.
This election, sequestered behind our walls, often having already cast our ballots, we may find relief in the parlor game of playing with CNN’s interactive graphics, which offer a model for adjusting and tweaking the electoral map and playing out alternative scenarios. By indulging in their conclusions and potential endgames, we can to an extent confront our fears in this most anxiety-producing of elections, using a tentative set of color choices, more familiar from polls than television, to suggest possible outcomes as we try to assemble the final tabulations of the vote and anticipate the disputes that may arise in each locality about margins of victory this time round, hoping to heal the abrupt chromatic divide still hurting from 2016, using polls’ take on “battleground” states to game outcomes of potential electoral maps.
The above (imagined) electoral map would be the narrowest of Democratic victories, and would affirm some deep divides across the nation from 2016, but might be arrived at only after recounts and disputes. The fantasy map suggests the open-ended nature of the vote this year, when the large number of absentee ballots tabulated during the pandemic poses problems of tabulation exacerbated by local restrictions on when the tally of votes is able to begin.
But cognitively trained as we were over the previous months–conditioned?–to entertain multiple contingencies of electoral paths “to victory” in the ecosystem of data visualizations, schooled by the acumen of Nate Silver’s “paths to victory,” the CNN maps offered not only a parlor game but a rehearsal for glossing the electoral configurations that might emerge on November 3, 2020, should we be forced to entertain multiple “pathways to victory” that might emerge–or, as it happened, remain–as the evening proceeded. They cued possible narrative scripts.
In retrospect, of course, we could barely imagine an electoral map so delicately balanced on tenterhooks. The dramatic unfolding of multiple “roads to 270” suggested a possibility to reclaim the dominant metaphors from sports, pace Silver, for a narrative of democracy. Although some petulantly suggested that the mail-in ballot was more than a bummer but a trap, presenting more possibilities of limiting votes and discarding ballots by making us more dependent on mail delivery and USPS, the expectations for vote-counting that were byproducts of the COVID era may well have furthered democratic discourse and the focus of the voting drive, as well as affirming the democratic centrality of the mail: as much as provide a route for the current joyless hack of a Postmaster General to intervene in the expression of public will, the narrative of tabulating every vote and creating a true paper record was an unexpected reform of the voting process, as the unexpected drama of the slowing of the tabulation of votes and their arrival in the electoral map foregrounded political participation and provided an unexpected lesson in democracy.
We expected little conclusiveness in the electoral map on election night, even into the wee hours, unlike the intense drama of earlier years. The election will continue even after the counts are finalized in each state, as it is bound to be contested in perhaps ongoing and painful ways, proceeding not only to polling places but up through the federal courts, as the Republican Party poses new complaints about the validity of votes. The hope to restrict the franchise in any way possible plays to fears not only of aliens exercising a vote, but of a new array of restrictions on the franchise.
And we could fear an endgame destined to subvert the narrative drama once located only on the electoral map, its narrative unhinged from the map, pursued in cases that debate the ways votes were tallied, compiled, and tabulated beyond November 3. Nate Silver’s map was not purely prognostic. If it reinforces a deeply divided nation fractured on broad-based faults of terrifyingly portentous contiguity, it suggests a painful endgame narrative, as court cases were pressed, recounts demanded, and charges of illegal voting launched in the face of attempts to aggregate votes from mail-in ballots in states predicted to “go blue.” The possibility of such an “I can’t go on, I’ll go on” was not at all appealing.
Even if static, the alternative electoral maps staged a sort of drama of hypotheticals that anticipated the dangers of deep dissatisfaction across the nation. There is a deep fear that if no southern state “flips blue,” even a truly “tenuous win” might be almost pyrrhic. The narrative is grim, even if its end result may have positive elements. Its biggest impact lies not only in delivering a President–the outcome of the electoral system–for this year it is also a map of the painful endgame of litigating the vote, even as the nation is haunted by a Mason-Dixon latitudinal divide among electors that most of the nation valiantly hoped we might soon put behind us.
The narrative is displaced from the election. While Nate Silver notoriously went wrong in prognosticating 2016, he reminds us, in case we forgot, “Trump didn’t win the last election by that much.” This year the truly terrifying story may well be the aftermath–the difficulty of calling the election and what this means for the nation–a narrative that one may only gloss from the map, which threatens not to materialize in any trustworthy way until all the votes are counted and all legal battles around their tabulation are, hopefully, resolved. But the most despicable sort of battles over VOTER ID, and the deeply divisive questions of the legitimacy of who could cast a ballot, immersed in the heady waters of debates about immigration, seemed game for inclusion, as eighteen states now require VOTER ID, in ways that pose broad risks of disenfranchisement that the local administration of elections threatens to perpetuate, after the refusal to amend the historic Voting Rights Act, whose teeth were removed.
As other nations puzzle over the arcane methods of an electoral college that dilutes the actual popular vote by distributing it among apparently aristocratic holdovers of electors–in fact far closer to an ideal model citizenry of those honorable enough to place nation over sectarian interests–the passionate intensity of division made such ideals seem destined for planned obsolescence, for reasons maybe not far removed from media technologies.
The liberating nature we find in designing our own DIY electoral maps on our peripherals offers more than a fun exercise in the alternative realities of a national compact; playing with the maps is far more effective and engaged than most other forms of narcotics for assuaging anxiety, and does lower blood pressure. There was some pleasant chutzpah in seeing Philippe Reines put out his own prediction, on November 2, 2020, of an overwhelming Biden electoral victory that kept Trump below 200 electors, with a prescience that reveals the narrative was indeed there to be unpacked.
There was a sense of liberation in the ability to easily enter alternate futures of greater national harmony, thanks to the CNN graphics team and your smartphone–even if the possibility of harmony seems in many places pretty illusory or lost, across the red dust bowl of the arid Great Plains, echoing John Wesley Powell’s “lands of the arid region,” only now starting to be imagined as rendered other than red, and across Appalachia. This alienated “forgotten” America persists even in the DIY electoral map that is not based on tabulations of votes. But such a map seems telling: tapping states to flip their votes invests a sense of agency in our ability to make predictions, even more important than the vote. We have ingested so many polls in news maps that there is something liberating in playing with the electoral map ourselves, gaming multiple scenarios, fidgeting with the map as an outlet for nervous energy as we wonder how those polls will translate to an electoral map–
–and how those states will add up to produce the only numeric legend that will, in the end, really count.
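The arithmetic those DIY maps perform is simple enough to sketch. In the toy model below, the electoral-vote counts are the figures in force for 2020 (under the post-2010 apportionment); the “safe-state” base totals are my own simplifying assumption, chosen only so the seven battlegrounds shown account for the remainder of the 538, and the scenario is an illustration rather than a tally of actual returns.

```python
# Toy sketch of the "build-your-own electoral map" arithmetic:
# flip battleground states between columns and check who clears 270.
# Electoral-vote counts are the 2020 (post-2010 census) figures.
EV = {"PA": 20, "MI": 16, "WI": 10, "AZ": 11, "GA": 16, "NC": 15, "FL": 29}

# Hypothetical "safe" bases, chosen so all 538 electors are accounted for
# once the seven battlegrounds above are assigned.
DEM_BASE, GOP_BASE = 233, 188

def tally(assignments):
    """Sum each column from its safe base plus the assigned battlegrounds."""
    dem = DEM_BASE + sum(EV[s] for s, side in assignments.items() if side == "D")
    gop = GOP_BASE + sum(EV[s] for s, side in assignments.items() if side == "R")
    return dem, gop

# One tap-by-tap scenario; this particular assignment happens to reproduce
# the actual 2020 outcome of 306 to 232.
scenario = {"PA": "D", "MI": "D", "WI": "D", "AZ": "D", "GA": "D",
            "NC": "R", "FL": "R"}
dem, gop = tally(scenario)
print(dem, gop, "D wins" if dem >= 270 else "R wins")  # → 306 232 D wins
```

Flipping any single entry in `scenario` and re-running the tally is exactly the fidgeting the interactive maps invite.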
We once relied on television pundits to explain the translation of the “raw” popular vote–and the possibility of an electoral victory without a popular-vote victory, then deeply doubted as an eventuality–describing the contest for “the percentage of the republican vote” as an obscure statistical construct. Even in the 1980 election, pundits bemoaned the “long electoral season” and its “magic map.”
The tracking of local air quality this Fire Season both documents the atmospheric effects of the fire siege of 2020 and provides an eerily contemporaneous way to track the spread of particulate matter from clusters of fires across the western seaboard, ignited at the end of a long, dry summer in late August. We were not really struck unawares by the dry lightning, but had left forests languishing, not beneath electricity lines–as last year, around this time–but under a hot sun and high temperatures that we hardly registered as changing the ecosystem and forest floor. This year, the sun turning red like a traffic light in the middle of the afternoon, we were forced to assess the air quality as the blue sky filled with black carbon plumes that left a grittiness in our eyes as well as in the skies.
Confronted with a red sun through pyrocumulus haze, we followed real-time surveys of air quality with renewed attentiveness as orange pyrocumulus clouds blanketed the usually blue skies of the Bay Area, obscuring the sun’s light and suffusing the atmosphere with a weirdly apocalyptic muted light. These clouds were hardly only incidental casualties of the raging fires that destroyed houses, property, and natural habitat–for they revealed the lack of sustainability of our warming global environment.
The soot and fog that permeated “clean cities” like Portland and San Francisco came as a sudden spike relative to the black carbon loads that rose in plumes from the fires, as if the payload of the first bombs set by climate change. The shifting demand for information, as we sought better bearings in the new maps of fires that had become an undeniable part of our landscape, was reflected in the skill with which maps plotted the sites of the dry lightning strikes that hit dried-out brush and forest floors, the growing perimeters of fires and evacuation zones across the west coast, and the plumes of atmospheric smoke of black carbon that would leave a permanent trace upon the land–likened by Mike Davis to the after-effects of holocausts created by atom bombs. The measurement of wind carrying airborne smoke emerged as a layer of meaning we were beginning to grasp: a ghostly after-effect of the fields of flames that began from sites of lightning hitting the earth in a Mapbox wildfire map of fields of fire across the states, radiating resonant waves akin to earthquake aftershocks, a lamination on hex bins of the fires that seemed a new aspect indicating their presence in the anthropocene.
The suitably charcoal-grey base map of the state integrates approximate origins of fires, fire spread, and the greatest intensity of hotspots from satellite imagery courtesy of Descartes Labs and NOAA, while air pollution data integrates the fires’ spread into our picture of the state. While human-reviewed and human-sourced, the satellite data embodies the ravages of fire across the state in ways echoed by its black charcoal base map, and reflects the need to develop new visual tools to process their devastation.
While we began measuring air quality to meet new needs to track ground-level ozone, acid rain, air toxins, and ozone depletion at an atmospheric level, and the increased tracking of more common air pollutants since 1990 came to include airborne particulate matter (PM10 and PM2.5), carbon monoxide (CO), and ozone (O3), we now track the effects of wildfire smoke by hourly levels of each at local points, parlaying sensors into newsfeeds as wildfires rage. Even if stocked with labels for each chromatic layer, these real-time updates lack not only legends but the temporal graph that would clarify the shifting data feeds that give us the illusion of purchase on the lay of the land we are trying to acknowledge this fire season.
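For readers curious how those hourly sensor readings become the familiar color-coded index, the AQI is a piecewise-linear rescaling of a pollutant concentration between fixed breakpoints. The sketch below uses the EPA breakpoints for PM2.5 that were in force during the 2020 fire season (they were revised in 2024); it is an illustration of the published formula, not any agency’s code.

```python
# Hedged sketch: EPA's piecewise-linear AQI formula applied to PM2.5,
# using the breakpoint table in force during the 2020 fire season.
BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi)
    (0.0,    12.0,    0,  50),  # Good
    (12.1,   35.4,   51, 100),  # Moderate
    (35.5,   55.4,  101, 150),  # Unhealthy for Sensitive Groups
    (55.5,  150.4,  151, 200),  # Unhealthy
    (150.5, 250.4,  201, 300),  # Very Unhealthy
    (250.5, 350.4,  301, 400),  # Hazardous
    (350.5, 500.4,  401, 500),  # Hazardous
]

def pm25_to_aqi(conc):
    """Convert a 24-hour PM2.5 concentration (µg/m³) to an AQI value."""
    c = int(conc * 10) / 10  # EPA truncates PM2.5 to 0.1 µg/m³
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c_lo <= c <= c_hi:
            # Linear interpolation within the matching breakpoint band.
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    return 500  # beyond the top of the index

print(pm25_to_aqi(35.0))   # → 99  (Moderate, near its upper edge)
print(pm25_to_aqi(180.0))  # → 230 (Very Unhealthy)
```

The steep change in slope above 55.4 µg/m³ is why smoke days can vault a reading from “Moderate” into “Hazardous” territory so quickly.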
Watching slightly longer-term shifts in the quality of the air we breathe in the Bay Area, we can see striking spikes reaching a maximum just after the lightning siege began across much of the state on August 19, 2020, as air quality, tracked by PM2.5 concentrations, decisively entered the hazardous zone, registering four of the fifteen worst air days since 1999, when the Bay Area Air Quality Management District began reporting the levels of fire smoke in inhabited areas.
We measure fires by acreage, but the sudden collapses of air quality, while not exceeding the smoke that funneled into the Bay Area during the North Bay Fires of 2017, when the Tubbs and Atlas Fires devastated much of the Wine Country, created a run of high-smoke days, followed by a set of sudden spikes in the atmospheric presence of particulate matter that we tried to track by isochromes based on real-time sensor readings, but that emerge with better clarity only in retrospect.
It is true that the AQI maps that offer crisp snapshots of unhealthy air might serve as an alarm to close windows, remain indoors, and call off school as particulate matter spread across the region’s atmosphere. We are used to weather maps and microclimates in the Bay Area, but the real-time map of particulate matter, we immediately feared, did not only describe a condition that would quickly change but marked the start of a fire season.
In recent days the sustained levels of bad air suggested an apocalyptic layer that blotted out the sun and sky, making one feel as if one were indeed living on another planet where the sun was masked, a sense heightened by the red suns piercing through grey smoke-cover that had seamlessly combined with fog. Although the new landscapes of these AQI maps generate immediate existential panic, we should be more panicked that while we call these fires wild, they release unprecedented levels of toxins once imagined to be detected only as industrial pollutants. The seemingly sudden ways that black carbon soot blanketed the Bay Area, resting on our car hoods, porches, windowsills, and garbage bins, were not only an instant record of climate emergency but the recoil of overly dry woods and parched forests and lands, overdue payback for a far drier than normal winter and a contracted rainy season that had long ago pushed the entire state into record territory. The lack of soil moisture has brought a huge increase in wildfire risk, not easily following the maps of previous fire history, and a persistence of “abnormally dry” conditions across a third of California, focussed in the Sierra and Central Valley–the areas whose forests’ fuel loads arrive carbonized in particulate form.
Local monitors of air quality suggest the uneven nature of these isochromes as maps–they are reconstructions of what can only be sensed locally, and do not exist in any tangible way we can perceive–but they presented what we needed to see in a tiling that made differences pop, highlighting what mattered, in ways that let cities fall beneath the new colors that blanketed the state, as local sensors somehow revealed what really mattered on August 20: if the “map” is only a snapshot of one moment, it showed the state awash in ozone and particulate matter.
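Those smooth color fields are indeed reconstructions: a continuous surface inferred from scattered point sensors. One common way such a surface is built is inverse-distance weighting, sketched minimally below; the coordinates and readings are hypothetical, not actual BAAQMD data, and real AQI tilers may use other interpolation schemes.

```python
import math

# Hypothetical sensor sites (x, y in arbitrary map units) and PM2.5
# readings (µg/m³) -- illustrative values only, not real sensor data.
sensors = [((0.0, 0.0), 12.0), ((10.0, 0.0), 160.0), ((5.0, 8.0), 80.0)]

def idw(x: float, y: float, power: float = 2.0) -> float:
    """Inverse-distance-weighted estimate at (x, y): nearby sensors
    dominate, so the 'surface' between stations is pure inference."""
    num = den = 0.0
    for (sx, sy), value in sensors:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:          # exactly on a sensor: return its reading
            return value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# A grid of points evaluated this way is what the map tiles color:
print(idw(0.0, 0.0))             # on a sensor → 12.0
print(round(idw(5.0, 0.0), 1))   # midway between two sensors → 85.0
```

The estimate midway between stations is an average no instrument ever recorded, which is exactly the sense in which these maps "do not exist in any tangible way we can perceive."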
We were in a sort of existential unfolding in relation to these maps, even if we could also read them as reminders of what might be called “deep history”: deep history was introduced by the Annalistes to trace climatic shifts, the deep “undersea” shifts of time on which events lie as flotsam, moved by deep currents that ripple across the economy of agrarian societies, suggesting changes from which modern society is in some sense free. “Deep history” has to some extent been reborn via the neurosciences, as a history of the evolution of the mind and of cognition, a sort of master-narrative of the changes of human cognition and perception that makes much else seem epiphenomenal. If the real-time map below was time-stamped, it suggested a deep history of climate of a more specific variety: it was a map of one moment, but one perched atop a year of parched forests, lack of groundwater, and increased surface temperatures across the west: Sacramento had not received rain since February in an extremely dry winter; its winter was 46% drier than normal, and February in Fresno was 45% drier. These are, in other words, both real-time and deep maps, and they demand that we toggle between them as the true “layers” of an ecological map on which we might gain purchase.
The levels of desiccation of course didn’t follow the clear boundaries we trace on maps. But at some existential level, these flows of particulate matter were not only snapshots but the culmination and confirmation of deep trends. We have to grasp these trends to position ourselves in an adequate relation to their content. For the deep picture was grim: most of California had received barely half of usual precipitation after a very dry winter: Sacramento had seen barely half of its usual rainfall as of August 20 (51%); the Bay Area, 51%; parts of the Sierra, just 24%. And when we measure smoke, we see the consequences of persistent aridity.
These are the layers, however, that the maps should make visible. And while the shifts of particulate matter that arrived in the Bay Area were invisible to most, they were not imperceptible: the waves of smoke arrived with a local visibility that almost blotted out the sun. Perhaps there was greater tolerance earlier for what seemed tantamount to an eclipse. Perhaps this seemed almost a breaking point.
For almost a month after the first fires broke out, a sequence of bad-air days and spare-the-air alerts marked our collective entrance into a new era of climate and fire seasons, as fine soot blanketed the state at hazardous levels, leaving the sense that there was nowhere left to go to escape.
We had of course entered the “Very Unhealthy” zone. If real-time maps condense an immense amount of information, the snapshot-like fashion in which they synthesize local readings is somewhat hard to process, unless one reads them with something like the circumscribed historical perspective that tracking PM2.5 levels over time provides. In maps that are data maps, not land maps, we need a new legend, as it were, an explanation of the data being tracked, lest it be overwhelmed in colors that muddy the issues, and also a table that will put information on the table, lest the map layers be reduced to eye candy of shock value, and we be left to struggle with the inability to process the new scale of fires, so unprecedented and so different from the past, as we try to gain bearings on our relation to them.
Of course, the real-time manner in which we consume the “news” today
militates against that, with feeds dominating over context, and fire maps increasingly resembling weather maps, as if to suggest we all have the skills to read them and that they present the most pressing reality of the moment. But while weather maps suggest a record of the present, these maps are not only of the current moment they register. Looking at them with regularity, one feels the loss of the data trends they fail to incorporate, trends that are really the basis of the point-based maps processed for us to meet the demand for information in the moment; we are stunned at the images’ commanding power of attention, which makes us look at their fluid bounds but leaves us at sea in regard to our relation to what is traced by the contour lines of those isochrones.
We can, in the Bay Area, finally breathe. But the larger point about data visualizations, perhaps a symptom of our inflow of newsfeeds, lies in those very tracking maps–and apps–that focus on foregrounding current conditions, and do so to the exclusion of the deeper trends that underlie them, and that, despite all our knowledge otherwise, threaten to take our eyes off of them. The FOX newscaster Tucker Carlson cunningly elided the ties of spreading wildfires to the macro-processes of climate change, calling them “liberal talking points,” in terms resonating with his dismissal of recent calls by social justice movements to end systematic racism in the country: “you can’t see it, but rest assured, it’s everywhere, it’s deadly. . . . and it’s your fault,” he intoned, as climate change morphed into but a “partisan talking point,” akin to “systematic racism in the sky.”
While the deep nature of the underlying mechanics by which climate change has prepared a drier and more combustible terrain in California is hard to map onto the spread of fires on satellite maps, climate denialism is twinned with the casting of calls for reparations of social injustice or gun control as self-serving narratives pursuing agendas of greater governmental control to circumscribe liberties, befitting a rant of nationalist rage: explanations resting on “our” lifestyles and increased carbon emissions become only pretenses to restrict choices we are entitled to make. Carlson was right about the depths at which both climate change and systematic racism offer liberal “lies”–especially if we squint at tracking maps at a remove from deep histories, and cast them as concealing sinister political interests and agendas, the truly dark forces of governmental over-reach in local affairs.
“Structural racism” is indeed akin to the deep structure of climate change, even if the cunning analogy Tucker Carlson powerfully crafted for viewers did not capture the extent of their similarities. For both manifest deep casualties created by our society, both depart from normalcy, and both stand to hurt the very whites who find them most offensive. The inequalities of systematic racism are as present in our day-to-day life as the drying-out landscape. And the scope of climate change is most clearly registered in trends of diminished precipitation, groundwater reserves, or temperature change that create environmental inequalities, too often obscured by maps of local air quality or of the social protests that respond to deep-lying trends.
To be sure, the tracking of environmental pollutants underlay the national Pollution Prevention Act of 1990, and led to a number of executive orders aimed at setting standards of environmental justice for minority communities who long bore the brunt of industrial pollutants, from lead paint to polluted waters to hazardous waste incinerators. And we are surrounded by racial inequalities visible in systematic inequalities before the law, which have lowered life expectancies of non-whites in America by 3.5 years, increasing rates of hypertension, cancer, and the systematic disenfranchisement of blacks–extensive inequalities that hurt whites, and hurt society. As Ibram X. Kendi perceptively noted, White Supremacists affirm the very racist policies that undercut the interests of White people; they “claim to be pro-White but refuse to acknowledge that climate change is having a disastrous impact on the earth White people inhabit.” Is there a degree of self-hatred among Carlson’s viewers that informs his frontal attack on climate change and structural racism as myths, more content to blame non-Whites for structural inequalities?
But these inequalities are evident in the differences in air quality that climate change creates. For if the AQI maps tell us anything, it is the absence of any preparedness for the interconnections of fire, smoke, and the long stretches of low precipitation that have created abnormally dry conditions–indeed, drought–across the state.
The intensity of severe drought across the conifer-dense Sierras raises pressing questions of federal land management: the moderate to severe drought of forested lands intersects with the over 15 million acres of public lands that the USDA Forest Service and other federal agencies manage or steward–
–drought that crosses many of the dried-out wildlands and rangelands forested with conifers and dense brush, a majority of which–19 million acres, or 57%–are managed by federal agencies, while only 9 million acres are privately owned, and which with climate change grow drier and drier.
Yet the reduction of Wildland Fire Management by 43.98% from FY2020 to FY2021 in President Trump’s budget continued the systematic erosion of funding for the United States Forest Service. As California weathered longer and longer fire seasons on Donald Trump’s watch, Trump cut $948 million from the Forest Service budget for fiscal year 2020, after defunding the US Forest Service by reducing fire-risk mitigation by $300 million from FY2017 to FY2019, cutting $20.7 million from wildlife habitat management and $18 million from vegetation management–a rampage beginning with cuts to USFS research funding of 10% and to Wildland Fire Management of 12% in FY2018! While blaming states for not clearing brush in forests, this sustained hampering of federal land management rendered the West far less prepared for climate change. As the costs of containing wildfires rise, the reduction of the Forest Service budget has provoked panic by zeroing out funding for Land and Water conservation–alleged goals of the Trump Presidency–and cutting grants to state wildfire plans by a sixth as fire suppression looms ever larger.
By defunding forest management, rangeland research, and habitat management, such budgetary measures pose pressing questions about our preparedness for the growing fire seasons of future years; the stars that denote public land management might be targets for future dry lightning.
We think of earth, wind, and fire as elements. Or we used to. For the possibility of separating them is called into question in the Bay Area: as wind sweeps the smoke of five to seven fires, or fire complexes, across the skies, we are increasingly likely to see them as layers, which interact in a puzzle we have trouble figuring out. Indeed, the weirdly haunting daily and hourly maps of air quality, mapping the atmospheric presence of particulate matter by isochrones, brought late-summer blues to the Bay Area. Blue skies of the Bay Area were colored grey, burnt orange, and grey again as cartoon plumes of soot flooded the skies in a new sort of pyrocumulus cloud that turned the sun red, offering a disembodied traffic sign telling us to stop.
Fire season began by remapping the town in a terrifying red that registered “unhealthful,” almost verging on the “hazardous” level of brown, based on local sensors’ monitoring of ozone, but also registering a deeper history defined by the absence of rain, the lack of groundwater, the hotter temperatures of the region, and the dry air. The map is both existential and ephemeral, but also the substrate of deep climate trends.
Is fire an element we had never before tracked so attentively in maps? We did not think it could travel, or had feet. But wildfire smoke blanketed the region in ways not nearly as visible as they would become, and the real-time map registers it at the sort of pace to which we have grown accustomed in the real-time fire maps we consult with regularity to track the containment and perimeters of fires now spreading faster than they ever have in previous years. And soon after we worried increasingly about the risks of airborne transmission of COVID-19, this fire season the intensity of particulate pollutants in the atmosphere added intense panic to the tangibility of mapping the pyrocumulus plumes that made their way over the Bay Area in late August. As the danger of droplets four micrometers in diameter remaining airborne seemed a factor in large-scale clusters, the waves of black carbon mapped in the Bay Area became a second sort of airborne pathogen, made acutely material in the layers of real-time Air Quality charts.
The boundaries of fire risk charts, and indeed fire perimeters, seemed suddenly far more fluid than we had been accustomed to. When we make our fire maps with clear edges, however, it is striking how we almost stop registering the built environment, or inhabited world. As if by the magic of cartographical selectivity, we bracket the city–the sprawling agglomeration of the Bay Area–from the maps tracking the destructiveness and progress of what we call advancing wildfires, and from the isochronal variations of air quality that we can watch reflecting wind patterns and air movements in accelerated animated maps, showing the bad air that migrates and pools over the area where I live. In the even more ephemeral nature of these maps–they record but one instant, and are outdated as soon as they are produced, in ways that fit the ecosystem of the Internet as much as the extremes of the new ecosystem of global warming–the isochrones seem somewhat fatalistic, as they are removed from human agency–as we found out in the weeks after the Lightning Siege of 2020, which seemed a spectacle of the natural world that rivaled the art of Walter De Maria in its grandiosity of time-lapse photography–
–the horizontal line of artificial light from Santa Cruz, unlike the images that De Maria created of The Lightning Field, reminds us of the overlap between inhabited spaces and the dry wildlands where conflagrations spread as the fires struck, in ways far more difficult to aestheticize than The Lightning Field, set in a desert removed from human population and built as an isolated field for stop-time photography.
The CZU complex brought widespread devastation across areas of extra-urban expansion in the Santa Cruz Mountains, almost a map registering the expansion of residences to the very borders of forests. We have never faced the problem of maintaining and clearing land in weather this dry, even if we have mapped the clustering of fires in the wildland-urban interface: the strikes ignited underbrush lain like kindling on the boundaries of the raging fire complexes. The burning of underbrush by fire-mitigation squads seeks to create fire lines in the mountainous landscape, new perimeters to forestall the advance of major fires, working along a new fire line even as what is still called wildfire smoke travels across the nation, far beyond the Bay Area.
We watch the movement of fires into inhabited areas like shifting jigsaw pieces that destroy the landscape across which they move. These marked the start of megafires that spread across state and county boundaries but are parsed by state authorities and jurisdictions, even if, as Jay Inslee noted, this is a multi-state crisis of a climate change that had rendered the forests fuel by 2017–for combined drought and higher temperatures set “bombs, waiting to go off” in our forests, in ways unable to be measured by a fire risk that continues to be assessed in pointillist terms by “fuel load” and the past history of fires known as the “fire rotation frequency.” When these bombs go off, it is hard to say what state boundary lines mean.
If San Francisco famously lies close to natural beauty, the Bay Area, where I live, lies amidst a high-risk zone, where daily updates on fire risk are displayed prominently and with regularity in all regional parks. These maps, made over a decade ago, set fire standards for building construction in a time of massive extra-urban expansion. But risk has recently been something we struggled to calculate as we followed the real-time updates of the spread of fires, smoke, and ash on tenterhooks, with a readiness and a high sense of contingency, anxiety already elevated by rates of coronavirus that depended on good numbers: fire risk was seen as an objective calculation fifteen years ago, but was now not easy to determine or to rank so crisply in three different shades.
When thunderstorms brought the meteorological curiosity of nearly 12,000 dry lightning strikes from mid- to late August 2020, they hit desiccated forests with a shock. The strikes became a siege as they set three hundred and fifty-seven fires across the state, which were rapidly communicated into expansive “complexes” of brush fires.
We map these fires by state jurisdictions, and have cast them as such in policy, but the borders or perimeters we hope to contain barely grasp the consequences of how three quarters of a million acres burned up suddenly, as smoke from the cluster of fires rose in columns that spread across state boundary lines as far as Nebraska, and how fire complexes spreading across three million acres would soon create a layer of soot across the west, eerily materialized in the layers of ESRI GIS maps of environmental pollutants, while toxic particulate matter released in plumes of black carbon covered the state, rendering the sun opaque where I live, in the Bay Area, now a Pompeii by the Bay as smoke at toxic levels blanketed much of the state.
The even more serious map, to be sure, was of fire spread: but the maps of air quality set the entire western seaboard apart from the nation, as if threatening to have it fall into the ocean and split off from the United States–even if the burning of its open lands was as much a portent of things to come as a historical anomaly lying outside the record of fire burns or air quality, the poor air traced the origin of the black carbon columns of smoke that would rise into the nation’s atmosphere.
We read more maps than ever before, and rely on maps to process and embody information that seems increasingly intangible by nature. But we grant coherence to maps all too readily, without the skepticism that might be offered by an ethics of reading the maps we so readily consult and devour. Paradoxically, the map, long established as a centering means to understand geographical information, has come to be regarded uncritically. As we rely on maps to organize our changing relation to space, do we need to be more conscious of how they present information? While it is meant to be entertaining, this blog examines the construction of the map as an argument, and a proposition, to explore what an ethics of mapping might be. It's a labor of love; any support readers can offer is appreciated!