Category Archives: Global Warming

Loopy Maps to Rationalize Random Shut-Offs?

As haze again settles over San Francisco and blankets the Bay Area, fire season has begun again in Northern California. The densely populated Bay Area is ringed by elevated levels of particulate matter from fires in the Sierra Nevada, which have sent flakes of ash as far as San Francisco, as alerts urge those in Santa Rosa, Livermore, and San Jose who are sensitive to air quality to avoid extended outdoor activities and reduce their time out of doors.

The new strategies of fire containment adopted by PG&E came, however, after the private company used its privileges to declare power shut-offs in ways that did not seem “surgical” at all–leading Bay Area State Senator Jerry Hill to remind the corporation that all safety shut-offs “must be a surgical, last resort measure,” not a knee-jerk method of containment–although what a “surgical” shut-off of electricity would look like was unclear, as it would presuppose an assessment of transmission poles, the clearing of nearby trees, and tools for pinpointing high winds. In the East Bay and in Oakland, we have been looking at fire warnings in public service announcements posted beside park entrances and highways for over ten years.

The emergency warnings, however, extended a red flag alert of extreme fire danger to the entire Bay Area on October 8.

The increased gustiness of winds generalized a sense of extreme fire danger, reminding us of the broad state of fire in a drying state, where scant rainfall has fed a rash of fires. Obi Kaufmann has compiled the fearsome image of fires across the state in a rather fierce watercolor map, which belies its medium, and which pictures the spread of fires across the state’s counties more concretely than the best remote sensing allows–and indeed how fires have increasingly shaped the topography of the state–raising deep, primeval fears of fire as a plague of contiguous burning regions, as if the state’s entire surface were lit by fires that bled into one another, moving across space in record time.

Obi Kaufmann, “Fire Density”

The number of residents displaced by fires in California over the last five years stands as a human rights crisis, particularly acute among lower-income residents of the state, whose loss of homes often leaves them without options for relocation. Because most communities in California wait an average of five years for homes to be rebuilt after fires subside, the increasing occurrence of “wild” fires significantly deepens the state’s income divides, diminishing the available housing stock across much of the state and increasing the cost of rentals after the fires have been extinguished.

The map of the occurrence of fires over the past five years is also a map of displacement, of pressures on public housing, and of the increased salience of a public-private divide afflicting the local economy and all residents of the state, one standing to disrupt the common good more precipitously than property loss can map: the occurrence of fires around the so-called urban-wildlands periphery–where many seeking housing are also pushed, outside of Sacramento, San Francisco, Los Angeles, and San Diego–creates an apocalyptic scenario in which many suburban residential areas are consumed by raging fires, displacing residents who will find it even harder to find a home.

The mosaic of the fires seems an impossible challenge to process or parse. But it suggests the impossibility of separating the range of fires that have occurred in the last five years from the cities where California’s populations are concentrated, and indeed confirms the surreal nature of how a sustained absence of precipitation has ringed the capital with hugely destructive fires, which, despite their individual names, together suggest the new landscape that the state’s tightening resources are compelled to address, in both prevention and mitigation, even as the costly task of controlling an almost year-round fire season creates budgetary costs few have prepared for:

Is Kaufmann’s map the true map underlying the panic at the possibility of new firestorms? It helps us take stock of the dissonance of the maps that PG&E presented of electric shut-offs, and the contorted syntax of its public explication of the potential shut-offs of electricity. We were warned of the very wind conditions that in 2017 helped spread the North Bay Fires, raising immediate danger signs throughout the Oakland Hills, where memories of the Oakland Firestorm of 1991 were still raw, inscribed in the landscape–the largest suburban conflagration, which spread from the Oakland Hills over a weekend as the Tunnel Fire was fanned by Diablo winds in late October, destroying hillsides of homes and shocking residents in ways that had barely receded from the public imaginary by the time the North Bay Fires were fanned a quarter of a century later. The mosaic of the regions consumed by fire in the past five years in California reveals something of a patchwork quilt.  But the dramatic expanse of the Camp Fire that consumed the city of Paradise, CA suggested a new era of fire regimes, as it flattened the city in ten minutes, creating refugees of climate change on our own soil: it destroyed 14,000 family dwellings and displaced 50,000 people, scattering refugees across Yuba City, Chico, Sacramento, and Butte County.  Nearby, the devastating fire was witnessed differently than other wildfires: its smoke entered our lungs in Berkeley, Davis, and Sacramento, and we watched the sun go red and the skies grey.  While the fire was removed, it was immediate, inhabiting multiple spaces at once as it burned across a wooded landscape and consumed huge carbon reserves.

The NWS warned about this danger of fire, which couldn’t have been clearer in anyone’s mind–or to PG&E’s new board of directors, as they were forced to decide upon a plan of action with the growth of Red Flag warnings across much of the northern state:

If the dystopia of Fire Season lay last year in raging firestorms that consumed homes, tract housing, and high-carbon forests, the safety measures adopted after the North Bay Fires of 2017 to curtail corporate responsibility have become a new dystopia of intermittent power supply, immediately inconveniencing the poor, elderly, and infirm, as a temporary safety tactic was extended to thirty-four of fifty-eight counties–over half–as if this were a new strategy of fire prevention or containment.

California’s largest publicly traded utility, Pacific Gas & Electric, faces charges that it is unable to maintain, respond to, or oversee state-wide safety issues beyond its control, even as it has entered bankruptcy. The utility, widely blamed for failing to maintain the safety of its transformers, has just openly recognized that its equipment was the “point of ignition” of the recent tragic fires–exposing itself to $11.5 billion in damages for what was the deadliest fire in California history, which spread across over 150,000 acres, and, when that admission of a site of ignition is combined with previous charges for which its equipment was blamed, to perhaps $30 billion in damages.

California’s legislature had tried to extend support to the utility in the wake of previous fires, to spare it from earlier wildfire liabilities; the reluctance to extend any similar security to PG&E has turned to anger in the face of the current destruction, however, reflecting deep discontent with its record of forest management, outrage at the scope of the devastation of local neighborhoods, and the scale of the destruction of over 18,000 buildings in the Camp Fire. Even as the current scare has retreated, allowing air quality to improve for now around much of the Bay Area,

there remain steep charges that PG&E improperly maintains forested areas and trees in proximity to electrical wires–especially dangerous in a parched landscape that has lacked rainfall over multiple years. After the rapid spread of the Camp Fire, and after multiple combustive wildfires scorched the North Bay in October 2017, killing forty-four, it is hard to single out poor maintenance as the sole issue at stake in the fires’ spread. Yet the corporate entity has been blamed for inadequate repairs, poor maintenance, and poor record keeping, as if such bad practices alone left state residents increasingly vulnerable. Even if most dismissed President Trump’s wilder claim that the spread of fires in Northern and Southern California was due to “gross mismanagement of the forests,” the notion of blaming the utility for its negligence seemed more credible. But building beside grasslands and desiccated forested land has created a new geography of fire and of fires’ spread–revealed in satellite measurements of their fearsome advance.

The liabilities that the power company was assessed for the destruction of the recent fires undercut its credibility and increased a sense of its negligence, removed from the extreme weather the state has faced and for which few containment strategies exist. To be sure, the demand to define a felon seems to have overridden the danger of placing transmission lines near forested areas starved of rain. The aftermath of the fires–perhaps critically the time since Paradise, as it were–has created a sense of nervous breakdown in assessing, monitoring, and mitigating fire danger, it almost seems, as we rush to individuate and assign clear blame for a changing climate and for a lack of response to a rarely experienced decrease in precipitation, in the need to identify a clear cause or culprit for the massive and persistent disequilibria that undermined the public good and well-being in the apocalyptic fires northern and southern California faced, and in the terrifying alarm before maps of the fires’ destruction and accelerated spread, perhaps without pausing to consider the inter-relations that create a radically new firescape. And the outage maps that PG&E issued to its clients as the gustiness of winds grew suggested a similar remove from an over-dried landscape, long lacking rain.

And the maps that were issued to make the case for the outages weren’t that convincing; they seemed quite improvised, even if they did alert customers to the impending danger of power outages as a response to contain future fires. The loopy maps recently issued by PG&E, California’s very own home-grown for-profit energy company, to its customers or “consumers” seemed a weak public wager of confidence, made after hemming and hawing, apparently, over how to respond to the forecast of high winds returning to northern California, a region that over recent decades has been haunted by fires arriving ever earlier each autumn. Whether they actually prevented fires–as PG&E insists, leaving us all to sigh with relief–or not, the hodge-podge constellation of overlapping parcels indicating potential electrical shut-offs was disorienting–

–and didn’t create more clarity by drilling deeper down into Bay Area neighborhoods.

Were the maps a warm-up for the difficulties of managing the threat of fire in a new fire season? Our “fire season” now stretches past winter, if not lasting all year, and its length has created such problems of management that PG&E elected to announce last Tuesday evening that power was being turned off preventively to over 800,000 of its customers, affecting what might be upwards of 2.5 million individuals. It did so by exploiting an ability it had gained after inadequately struggling to respond to past fires that had caused damage across the state–most intensely the loss of homes and property in Northern California after the Camp Fire, almost a year ago in 2018. Facing combined problems of infrastructural management and a climate starved of rain that it was unable to address–problems only compounded after it had declared bankruptcy as a private corporation–it this time shut off the power preemptively as the gusty winds began to blow.

This being California, it was striking that the turquoise rings seemed an electrified version of the iconography of radiating earthquake tremors, and caused comparable alarm.

The maps were hardly grounds for confidence, born as the board of directors hunkered down in preparation for those dry, warm winds of winter to sweep across Northern California. They were announced as the result of the sudden arrival of “unprecedented fire risk,” even though the highest temperature anomalies recorded in the state had been published online since early summer, as dry autumn winds of increasingly high velocity returned to raise fears across the state, setting off alarms for a new geography of firestorm risk.

For “risk” in this case meant winds, and fire danger was understood as an intersection, as in a Venn diagram, between combustible undergrowth and winds that could carry dangerous embers from downed power lines–the apparent paradigm of recent outbreaks of what the media still calls–with PG&E–“wildfires,” as if they were due to a failure to thin the woods.
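
That intersection logic is simple enough to sketch. The minimal example below–assuming hypothetical gridded wind gusts and fuel-dryness values, with illustrative thresholds that are not PG&E’s actual criteria–shows how a “both conditions at once” mask singles out cells of presumed danger, and how little of the underlying infrastructure or landscape such a mask actually captures.

```python
import numpy as np

# Hypothetical gridded inputs (e.g., one value per forecast cell): sustained
# wind gusts in mph and a dryness index for surface fuels on a 0-1 scale.
wind_gust_mph = np.array([[18.0, 42.0, 55.0],
                          [30.0, 47.0, 61.0],
                          [12.0, 25.0, 38.0]])
fuel_dryness = np.array([[0.35, 0.80, 0.90],
                         [0.55, 0.85, 0.95],
                         [0.20, 0.40, 0.75]])

# Illustrative thresholds only -- not PG&E's actual criteria, which the company
# describes in terms of forecast gusts, humidity, and fuel moisture, among others.
GUST_THRESHOLD_MPH = 45.0
DRYNESS_THRESHOLD = 0.7

# The Venn-diagram logic: a cell is flagged only where both conditions hold.
high_wind = wind_gust_mph >= GUST_THRESHOLD_MPH
dry_fuel = fuel_dryness >= DRYNESS_THRESHOLD
shutoff_candidate = high_wind & dry_fuel

print(shutoff_candidate)
# [[False False  True]
#  [False  True  True]
#  [False False False]]
```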

This may be the best optic through which to understand the awfully loopy maps that the company released, in an attempt to suggest that this time it really had the situation under control, or at least to dodge the danger of not seeming to have responded to verifiable risk. The responsibility PG&E exercised over the electrical infrastructure of the state had led to its ability to monitor weather privately to determine the urgency of shutting off residents’ power, allowing customers to consult a weather webpage to ensure some sense of transparency–and security–until it crashed under broad demand for answers. The project of educating customers about the possibility of unprecedented outages raised questions of responsibility and agency that many felt the private company had not been delegated the authority to adjudicate.

The prediction of gusts of wind from the Great Basin over areas that still bear traces of firestorms created a broad sense of alarm, however, and with good reason. High gusts of dry air had spread firestorms of huge destruction, crumpling the steel lattice structures of aging transmission towers and sending power lines crashing into dry underbrush at Pulga, as low precipitation, low atmospheric moisture, and sparks led to particularly extreme consequences: debates about what “good forest management” means have led to concern about the strength and durability of elevated electric wires knocked down by the force of high winds, pushing brush fires that became firestorms over vaster areas than firefighters had been used to managing–or had even recently imagined. It led many to contemplate the stubborn fact that an expanded network of elevated electrical wires, bearing charge, now runs more prominently than ever before across the wildlands-urban interface, with terrifying consequences.

Despite the complexity of the expertise required to explain the fires’ spread, and indeed the problems of forest management, the maps that PG&E issued to explain the outages were not clear. Only earlier this year, Judge William Alsup called PG&E “the single most culpable entity in the mix” of the current “crisis that California faces on these wildfires.” The several attorneys who had sued the company for the huge damages of the Camp and North Bay wildfires listened eagerly, happy that some accountability did seem to be in the air. But as PG&E had let its budget for reducing trees near transmission lines wither, the company seemed to illustrate utter irresponsibility in neglecting “a large number of trees that should have been removed, and that appears to be the single biggest factor in the 2017 and 2018 fires”; Alsup demanded that the energy corporation’s fire mitigation plan follow state law by trimming trees that might contact power lines in high Fire Season winds, which were only expected to grow, and placed the corporation on probation. The limited sensitivity of CEO Bill Johnson to the question of trimming may have been hard to dislodge from how his salary was pegged to safety performance, as much as to responsible maintenance.

Judge Alsup had ominously warned the corporation that it would fare badly in the docket come December, when the number of wildfires PG&E had started would be reckoned–he hoped it would be none. That expectation was balanced by the significant bonuses Johnson stood to gain if he created conditions for the utility’s stock to bounce back to the heights it enjoyed in 2017. And the power of PG&E to shut off power temporarily to 500,000 and then 800,000 Californians provoked indignation: the right to announce a “public safety power shut off” with little warning had been won without anyone imagining it would be used in as aggressive and proactive a fashion as Johnson and his Board seem to have done, raising questions about the current corporate culture of PG&E, given its commitment to “only consider proactively turning off power when the benefits of de-energization outweigh potential public safety risks.” The benefits in question concerned not only safety, but the dangers of being found negligent in managing the energy infrastructure.

The problems of fire management that this mapping of fires revealed were both deep and difficult to respond to: PG&E asked–or demanded?–the ability to shut off power in high-velocity winds in response. The consequences were not only personal inconveniences. They may well have compromised valuable cancer research on temperature-controlled cultures within a week of conclusion, costing $500K and countless hours of lab work, as controlled experiments were rushed to San Francisco. Even if a warning on October 9 let folks know that the private energy agency “continues to warn of a power outage,” the absence of any expectation of such a shut-off, or even of plans to deal with the web traffic, meant that the “inconsistent” and at times “incorrect” information on the PG&E website–which continually crashed under the high volume of folks seeking information urgently–led to a broad swath of apologies that nonetheless refused to admit any underlying fault. “I do apologize for the hardship this has caused but I think we made the right call on safety,” said CEO Johnson, who arrived at PG&E only recently from the Tennessee Valley Authority and installed a new Board: Johnson is a proud long-time Grateful Dead fan, but may be more conscious that his salary is tied to safety performance than of California topography or corporate ethics, and expected that the state would rapidly accept a declaration of fire emergency.

Perhaps the arrival of winds didn’t inaugurate a “fire season,” however, but revealed a new aspect of the electrical infrastructure for which no clear working process had been devised or imagined. If such winds inaugurated the “fire season” in California, this was no longer an issue to be addressed by thinning forests or environmental management alone–the large carbon loads of forests had gained an increased vulnerability to high-charge electric sparks, in ways that had exponentially expanded the vulnerability of the landscape to fire, and the risk of firestorms–

–of the very sort that had generated ever more terrifying images of statewide fire risk around the time of the Camp Fire; winds that created conditions for the fire fanned the flames further, at higher velocities, to create firestorms and the new management challenge of fire whirls, or vortices of flame of extreme heat intensity.

Did it all start with dry winds, or with carbon loads, or with live electric lines suspended in what were revealed to be dangerously unstable ways? Heat maps of air quality became seared into northern California residents’ minds.

Mapbox Heat Map of Smoke Emissions of Camp Fire

With high gusts running through trees that had received far less water and snowpack than in previous years, the deep worries among PG&E administrators–who balanced their oversight of the electrical infrastructure against their public responsibility–were mirrored in the haunted mood of Northern California residents, who had watched increasing numbers of firefighters stream into the areas near Chico, Sacramento, and the North Bay, or the Sierras and Yosemite, each Fire Season like clockwork in previous years–in ways oddly dissonant with bucolic green surroundings.

This time round, the temperature anomaly alone had offered some guide for predicting the eventuality of future fires, but provided few guideposts to the possibility of managing the situation at hand. The arrival of winds was terrifying, and not only to PG&E. The gustiness of local winds seems a new index that will mediate all future responses to what Daniel Swain called the ongoing firestorm of California, and is destined to filter our collective reactions to future fires’ spread.

While the maps of outages seemed to convey a weird, spectral sort of precision, they suggested an architecture of emergency response that blankets all areas of the now legendary woodlands-urban periphery with darkness; the attempt to rationalize the impending outages, paralleled with offers to set up cell phone charging stations, was hardly reassuring. Promising lavishly to set up a designated communications network with its clients and customers seemed a bit like not letting the news media into the game, preferring texts, emails, and robocalls as a way of staying in touch and broadcasting emergency to folks it assumed would have fully charged phones–because who lets their phone run low?–though it wasn’t clear communications could continue if outages lasted the full “five days or longer” as announced. The company had its own private team of meteorologists, to prevent any redundancy from designated communication with the NWS or NOAA. But trust us, PG&E seemed to say–as Sumeet Singh, PG&E vice president of the Community Wildfire Safety Program, put it, “some of our customers may experience a power shutoff even though the weather conditions in their specific location are not extreme.”

The memories of the Camp Fire created a sense of disorientation before the impending arrival of high winds, which triggered a memory of massive destruction of property and loss of lives even as much of the nation was distracted by other media stories, and indeed by greater problems of irresponsibility. With memories of the Camp Fire and the North Bay Fires fixed in our heads–from the toxic air they spread across the city to the streams of refugees who lost their homes–we had only just started to face the first evidence of a new fire season in the smoke from the first fires, which only last year had made Northern California the site of the worst air quality in the world, as we studied maps of the progress firefighters were making in containing the deadly Camp Fire. As we tracked the fear of fires growing not only near Los Angeles, but near Chico,

we faced a new beast in the unprecedented power shut-offs that became a possibility for hundreds of thousands of the utility’s customers over the last week–a highly controversial act that enabled power to be preventively cut to regions of California that we tracked through loopy maps of potential shutoffs, shown in alienating map layers that seemed to glow with radioactivity. The authority the utility company had assumed in public life in California, elevated after a series of mismanagements or mishaps, meant that the combat of ever-present fire threats now extended to the disruptive dangers posed by power shut-offs, perhaps strategic ones.

But the maps offered no clear logic to their arrival or creation: they seemed to follow global curvature, akin to a Bowie net, though the GPS loops overlapped one another so often and derived from power lines at a local level. The threat of interruptions of electricity seemed an issue of global importance, altering access to electricity and shifting many to local generators–even closing the University of California for a few days until wind levels died down.

The large, privately owned public utility elected, with winds forecast from Wednesday through Thursday to reach velocities as high as sixty to seventy mph at higher elevations of the state, to adopt what was called “PG&E’s state-mandated wildfire mitigation plan, which aims to cut down on the ignition of wildfires during high-risk periods”–presumably because it was in bankruptcy, as much as or more than as a form of public protection, at a time when PG&E has its back up against the wall. The widespread “voluntary” withdrawal of the power supply was due to a failure of risk management, and to the dangers of firestorms that could not be contained. The threat of electrical transmission lines malfunctioning, collapsing, or igniting brush–as had begun the conflagration known as the Camp Fire of 2018, and the Tubbs Fire and the North Bay Fires of 2017–led Berkeley residents to be asked to evacuate the Hills, an area where evacuation would be difficult in an actual emergency. The extensive elderly population dependent on life-sustaining electrical medical equipment–oxygenators, dialysis machines, ventilators–faced disastrous health risks from the loss of power. Even the announcement of power loss could trigger panic, sending customers into a tailspin.

When the potential outage map on the PG&E website first appeared, and before it crashed, significant panic grew, as the need to follow a careful protocol seemed absent from the energy corporation’s plans. The maps revealed, in complex if simple fashion, a blanketing of many areas of the Berkeley Hills–based on what must have been the newly launched website PG&E had devised to help customers prepare for wildfire readiness, and indeed had presented to the Berkeley City Council this past July regarding the potential for emergency “public safety power shutoffs,” and presumably also the need to develop a strategic response, given that residents who relied on electricity for medical devices, cell towers, and city reservoirs would all be thrown into disarray in the eventuality of a “public safety shutoff” that the energy company had recently acquired the right–or been forced–to adopt. The sort of outage maps that PG&E distributed were low on actual information, rich with apparent looniness, and difficult to read–

–the image of state-wide potential outages being even more generic at such a small scale. The map is more suggestive of a targeting of the sites of past fires created by sparks from wind-downed transmission lines, or by the sudden collapse of aging electrical infrastructure during high winds. All state residents were alerted to update their contact information to receive more local bulletins at pge.com/mywildfirealerts, a dedicated communications infrastructure over which PG&E held all the strings; any possibility of confused communications was to be contained–as was any examination of the protocol for delivering information.



Filed under climate change, fires, Global Warming, PG&E, Wildfire, wildfire risk

Freezing Time, Seaweed, and the Biologic Imaginary

We can lose sight of the central role that seaweed plays in the coastal habitat of Northern California. For while often present before our eyes, the problem of mapping often-submerged seaweed forests with any fixity is mirrored by the threatened disappearance of offshore kelp beds in an amazingly rapid timeframe: the difficulty of creating an image capture able to register the extent of kelp forests is sadly echoed in the diminishing kelp beds off the California coast.

Our mapping of such loss has relied predominantly on the passive registration of location–the remote registration of onshore sites by satellite, familiar from the harrowing images of the spread of fires. We are reminded of it by maps showing the rapid advance of the burn perimeters of the Yosemite wildfires of 2013, the North Bay Fires, or the disastrous Camp Fire of 2018. The rapid pace of the loss of these forested lands seems eerily echoed in the shrinking of coastal kelp beds along Northern California, and correlates with the advance of warming climes.

If we have developed tools to map the continuity, intensity, and growth of forest fires by satellite and drone, the problem of passively registering the loss of kelp forests, and its relation to the advance of urchin beds, removes from view a part of coastal environments we are in need of mapping. Maps of the destruction of seaweed beds on the California coast are less rooted in real time, but the destruction has advanced in striking fashion over ten years, even if its ravages for now remain undersea. And we are less skilled at communicating kelp’s crucial place in offshore environments.

The nutrient-rich cold waters of coastal California, with their rocky seafloor, afford a perfect environment for lush kelp forests that extend up into British Columbia and Alaska. But as waters warm with astounding rapidity, we need to ensure kelp beds are mapped, although many are often off the map and difficult to register, even as their extent has come to be threatened by global warming and climate change, in ways that eerily parallel the loss of, or threats to, irreplaceable forested environments. While the decline of seaweed is not linked directly to warming waters, the shifting ecosystems that climate change has created have caused a drastic and rapid decline of seaweed’s offshore presence that we have yet to fully map. For the passive registration of kelp beds, traced by GPS or captured through aerial photography, is far less hands-on than examination of their extent would warrant, and low-flying satellites of remote sensing offer few possibilities for accurately mapping that extent. The rapidity of the disappearance of kelp–beds whose boundaries are shifting in time with a rapidity, and on a scale of destruction, comparable to the advance of forest fires–poses problems of global dimensions, pointing to the immediacy of the loss of kelp: over 13,000 species whose biodiversity and creation of oxygen by photosynthesis feed much of the world and drive our own ecosystem.
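
In practice, where kelp canopy does break the surface, its passive registration often reduces to a band-ratio calculation on multispectral imagery. The sketch below assumes a hypothetical scene whose near-infrared and red reflectance bands are already loaded as arrays, and uses an illustrative (not calibrated) threshold; it shows how coarse such a canopy mask is compared to the forests it stands in for.

```python
import numpy as np

# Hypothetical surface-reflectance bands from a single satellite scene (0-1).
# Floating kelp canopy reflects strongly in the near-infrared, like land plants,
# while open water absorbs it, so an NDVI-style band ratio separates the two.
nir = np.array([[0.05, 0.30, 0.42],
                [0.04, 0.28, 0.06],
                [0.03, 0.05, 0.04]])
red = np.array([[0.04, 0.06, 0.05],
                [0.05, 0.07, 0.05],
                [0.04, 0.04, 0.03]])

ndvi = (nir - red) / (nir + red + 1e-6)  # small epsilon avoids division by zero

# Illustrative cutoff: pixels well above open-water NDVI are flagged as canopy.
KELP_NDVI_THRESHOLD = 0.3
kelp_canopy = ndvi > KELP_NDVI_THRESHOLD

# A crude areal estimate, assuming 30 m pixels (Landsat-scale imagery); it sees
# only canopy at the surface, not the submerged forest below.
pixel_area_m2 = 30 * 30
print("estimated canopy area (m^2):", int(kelp_canopy.sum()) * pixel_area_m2)
```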

Paul Horn, Inside Climate News/Source: Wernberg and Staub, Explaining Ocean Warming (IUCN Report, 2016)

Even a map of globally threatened areas cannot properly emphasize the extent to which the Pacific coastline provides a site of cold waters, of ocean upwelling that supplies rich mineral nutrients, and of sunlight that makes it an especially abundant site of kelp forests–a true megaregion of coastal ecology whose catastrophic loss is impossible to imagine. Even a map of kelp’s local abundance fails to map its ecosystemic centrality in adequate ways–ways that the diversity of kelp speciation also fails to capture, despite its clear scientific value for surveying the ocean populations most at risk.

The Nature Conservancy

Miller, Lafferty, Lamy, Kui, Rassweller and Reid (2018)/Royal Society Publishing

The particular vulnerability of kelp biomass seems to have grown in unexpected ways, not only due to climate warming but due to its vulnerability to seafloor grazers like purple sea urchins, which are now believed to be more likely to eat the phytoplankton and microalgae in the kelp understory than kelp itself. Whatever the role of urchins in diminishing kelp forests–which themselves feed exclusively on sunlight–the combination of a lack of upwelling due to climate change, which diminished the urchins’ food supply, and the inhospitable nature of warming waters to kelp may increase the vulnerability of kelp in coastal oceans.



Filed under climate change, climate emergency, data visualization, Global Warming, seaweed

Saturated Shores in Southeastern Texas

There is almost no trace of the human, or of the extreme overurbanization of the Texas coast, in most of the maps that were created of the extreme flooding and intense rains that hit Galveston and Houston, TX with the landfall of Hurricane Harvey.  While maps serve to orient humans to the world–and orient us to human processes and events in a “human world,” as J.B. Harley and David Woodward put it–the confused relation between the human and natural worlds is increasingly in danger of being mismapped.  Data visualizations of extreme weather that erase the modification of coastal environments provide a particularly challenging means of orientation, as news maps are suspended between registering the shock of actual events–and trying to contain the natural emergencies that events of extreme weather create–and the demand for graphics that register natural calamities, raising the ethics of showing such calamities as “natural,” or even of what the category of the natural is in coastal regions so heavily modified as to modify actual weather events.

The ethics of orienting viewers to the rainfall levels that fell on Houston after the landfall of Hurricane Harvey pose huge difficulties. Part of the difficulty lies in adequately orienting viewers in ways that register a changing natural world–whether we are mapping rainfall, for example, or the approach of hurricanes, or rather mapping the new relation of rain to built surfaces and landcover change that lacks permeability for water, facilitating flooding by storms whose potency is changed by the greater atmospheric moisture of a warming Gulf of Mexico, water that the ground cover of Houston, Galveston, and the Texas shore is less able to absorb and return to the Gulf. The area is itself something of an epicenter of the increased number of hemispheric tropical cyclones–which demand warm water temperatures above 80°F / 27°C, a cooling atmosphere, and low wind shear–that are so often led to the Gulf coast.

NASA Earth Observatory/Tropical Cyclones through 2006

–those that come ashore at Galveston hit a seashore that is eminently unprepared to accommodate an influx of water, its ground rendered all but impermeable by paved surfaces. If the problem of cyclones that can become hurricanes is truly global–

NASA Earth Observatory/150 years of Tropical Cyclones

–the intersection between cyclones and areas of paved ground cover is especially problematic for the Gulf Coast states, and most of all for Texas, Louisiana, and Florida, where water absorption has been anthropogenically reduced in recent decades. At the same time, few other areas of the inhabited world are so widely “tracked” as the destination of tropical cyclone formation.

NWS JetStream Online School

The problem is partly evident in the choice of new color ramps that transcend the rainbow spectrum for measuring the intensity of rainfall after the recent arrival, or landfall, of Hurricane Harvey–a choice that condenses the great difficulty of using old cartographical categories and conventions to capture or communicate increasingly extreme weather conditions in an era of climate change.  But the cartographic problem goes farther: it lies in the difficulty of registering changes in how fallen rain meets the ground–in mapping the relations between the complex processes of warming, the atmospheric warmth that leads to greater humidity across the Gulf region, and the ground cover permeability that leaves regions increasingly exposed to flooding.
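
The color-ramp problem itself is easy to make concrete. The sketch below–assuming matplotlib and an invented rainfall grid, with a 30-inch cap and a magenta “over” color that are purely illustrative, not NWS or AccuWeather specifications–shows how a ramp that simply tops out lets every record total beyond the scale collapse into the same final hue, unless an explicit out-of-range color is added.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors

# Invented storm-total rainfall (inches) on a small grid; a few cells exceed
# the top of a conventional ramp, much as Harvey's totals exceeded standard scales.
rain_inches = np.array([[ 4.0, 12.0, 22.0],
                        [18.0, 31.0, 48.0],
                        [ 9.0, 27.0, 60.0]])

norm = colors.Normalize(vmin=0, vmax=30)  # a ramp that tops out at 30 inches

# Left panel: the capped ramp -- 31, 48, and 60 inches all collapse into one hue.
capped = plt.get_cmap("viridis").copy()

# Right panel: the same ramp with an explicit out-of-range color, so off-scale
# totals are flagged rather than silently merged with the top of the scale.
extended = plt.get_cmap("viridis").copy()
extended.set_over("magenta")

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
for ax, cmap, title in [(axes[0], capped, "capped at 30 in"),
                        (axes[1], extended, "explicit over-color")]:
    im = ax.imshow(rain_inches, cmap=cmap, norm=norm)
    fig.colorbar(im, ax=ax, extend="max", label="rainfall (in)")
    ax.set_title(title)
plt.tight_layout()
plt.show()
```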

The relentless logic of data visualizations based on, and deriving primarily from, remote sensing is striking for rendering less of a human world than the threat of allegedly “natural” processes to that world.  Perhaps because of the recent season of extreme weather we have experienced, weather maps may be among the most widely consulted visualizations in our over-mediated world, if they were already widely viewed as essential forms of orientation.  But the pointillist logic of weather maps may fail to orient us well to extreme events such as the hurricane that dumped a huge amount of water on overbuilt areas: their failure to include the human–or the human world–seems a tacit denial of the role of humans in the complex phenomena of global warming that have, with the warming waters of the Gulf of Mexico and ever-increasing ozone over much of the overbuilt southeastern Texas shore, created a perfect storm for the hurricane’s arrival.

This failure haunts the limited content of the weather map; including the role of humans in maps of extreme weather events, along with the human experience of the disasters we still call natural, indeed remains among the most important challenges of weather maps and data visualization.  And although the subject is daunting, in the spirit of this blog we will look both at the communicative dilemmas and deceptiveness of eye-catching color ramps, and at the poetic difficulties of orienting oneself to shores.  For as depressing as the disaster of Harvey is, it compels raising questions of orientation to the shifting shore around the national epicenter of Galveston, where the landfall of Hurricane Harvey focussed our attention on August 27, 2017–

WPC five-day precipitation forecast, 0Z, 8/28/2017

–and questions of the meaning of place on a saturated shoreline, where the sea is somehow part of the land, and the land-sea divide blurs with a specificity that may well become increasingly true in an approaching era of climate change.  And as we depend on the ready generation of maps based on remote sensing, whose relentless logic is based on points, we risk losing sight of the role of place in determining the relations of rainfall to shoreline in maps of coastal flooding that remove remote observations from the built environment that flooding so drastically changes, challenges, and affects, in ways that may elide the specificities of place.

At a time when we have, and are destined to have, increasing problems orienting ourselves to our shores through digital maps of rainfall, the unclear shorelines of Galveston sent me to the bearings that a poet of an earlier age took on the mapped shorelines of the place where she had been born–how she was struck by a bathymetric map as a way to gauge her personal relation to place, and saw place in how the changing shorelines of the northern Atlantic were mapped in the Maritimes, in a retrograde form of print mapping in a time of war.  For the mapped shore became a means by which Elizabeth Bishop gained bearings through a printed map of coastal bathymetry, accessing the spatiality of the shore–how “land lies in water,” and the blurred relation of land and water that the bathymetric map charts–in an age when the materiality of the map was changing with the introduction of aerial composite maps in the early 1930s, whose rise removed the hand of the mapmaker from the map in an early instance of remote sensing–


Cartography Associates/David Rumsey Historical Map Collection: Composite of 164 Aerial Views of San Francisco by Harrison Ryker/Oakland, 1938, 1:2000

–in a medium of aerial photography that focussed on land to the exclusion of water, and that all but erased the relation between water and shore just a few years after Bishop quickly wrote her poem, at Christmas 1935, about the coastal “edges” of land and sea.  Ryker, who developed techniques of aerial photography used in the mapping of the shores of Puerto Rico for the Fairchild Aerial Camera Company, as well as photographs of the devastating Berkeley Fire of 1923, went into business in 1938–the year of his map–as a map publisher, with a patent for the stereoscope used to interpret aerial imagery, and must have performed the massively detailed mapping of San Francisco in one hundred and sixty-four images taken from airplanes in 1937-38 as a sort of calling card, soon after Bishop wrote her poem, before manufacturing a range of pocket and desktop stereoscopes for military ends that were widely used in World War II by the US Army.

Before war broke out, but in ways that anticipated the coming war, the printed bathymetric map must have resonated as a new reflection on the impersonality of the aerial view; Bishop was suddenly struck when she encountered the materiality of a print map at Christmas, as the art of cartography was already changing, responding to the drawn map of the Atlantic under glass as a way to recuperate the personal impact of place.  Her poem powerfully examined a logic of drawn maps utterly absent from the digitized space of rainfall maps of a flood plain, which derive from data at the cost of the human inhabitation of place–and which, in envisioning data to come to terms with the catastrophic event of flooding, distance or remove the craft of mapmaking from the observer in dangerously deceptive ways.  And so after wrestling with the problems of cartographic representation using remote sensing, while recognizing the value of these readily produced maps of rainfall and the disasters they chart,

1.  For weather maps are also among the most misleading ways to orient oneself to global warming and climate change, as they privilege the individual moment, removed from a broader context of long-term change or the human alteration of the landscape.  They provide endless fascination by synthesizing an encapsulated view of weather conditions, but also suggest a confounding form of media for orienting audiences to long-term change, or to the cascading relations of the complex phenomenon of climate change and our relation to the environment, as they privilege a moment in isolation from any broader context, and a sense of nature removed from either landscape modification or human intervention in the environment, in an area where atmospheric warming has shifted sea-surface temperatures.  The effects on the coast are presented in data visualizations that trace the hurricane’s “impact” as if its arrival were quite isolated from external events, and from the effects of human habitation on the coast.  The image of extreme flooding is recorded as a layer atop a map, removing the catastrophic effects of the flooding from the overpaved land of the megacities of southeastern Texas, and from the rapid paving over of the local landcover of its shores.



Such visualizations preserve a clear line between land and sea, but treat the arrival of the rains on land as isolated from the warming waters of the Gulf of Mexico.  Consuming such events of global warming in color-spectrum maps that translate rainfall data into somewhat goofy designs represents a deep alienation from the environment, distancing viewers in dangerous ways from the very complexity of global warming that the Gulf coast states encountered.

Such data visualizations seem dangerously removed from any notion of how we have changed our own environment, describing a “nature” that is immediately legible, as if it were removed from any human trace or any impact of the modification of the land, and imaging events in isolation from one another–often showing a background in terrain view as if it had no relation to the events that the map describes.  Although weather maps and AccuWeather forecasts are sources of continual fascination, and indeed orientation, they are also among the most confounding media for orienting viewers to the world’s rapidly changing climate–and perhaps among the most compromised.  For they imply a remove of the viewer from space–and from the man-made nature of the environment, or the effects of human activity on the weather systems whose changes we increasingly register.  By reifying weather data as a record of an actuality removed from human presence at one place in time, they present a status quo from which it is necessary to try to peel off layers and excavate a deeper dynamic–indeed, to excavate the effects of human presence in the landscape or geography that is shown in the map.  We are drawn to track and interpret visualizations of data from satellite feeds in such weather maps–or by what is known as “remote sensing”–placed at an increased remove from the human habitation of a region, and indeed in a dangerously disembodied manner.

Visualizations resulting from remote observation demand to be taken as a starting point and related to the human remaking of a region’s landscape, which has often left many sites increasingly vulnerable to climate change.  But the abstract rendering of their data in isolation from a global picture–or from on-the-ground knowledge of place–may render them quite critically incomplete.  The remove of such maps may even suggest a deep sense of alienation from the environment, so removed is the content of the data visualization from human presence, and perhaps from any sense of the ability to change weather-related events, or to perceive the devastating nature of their effects on human inhabitants: their stories are about weather, removed from human lives, as they create realities that gain their own identity in images, separate from a man-made world, at a time when weather increasingly intersects with and is changed by human presence.  While throwing into relief the areas hit by flooding near the southeastern Texas shore at multiple scales, based on highly accurate geospatial data, much of which can be put to useful humanitarian ends–


Dartmouth Flood Observatory/University of Colorado at Boulder, August 29, 2017


Maps of the World

–the reduction of floods to data points creates a distorted image of space that renders their occurrence distant from the perspective of folks on the ground, and places their content at a considerable remove from the complex causality of a warming Gulf of Mexico, or the problems of flood drainage by which Galveston and Houston were beset.  Indeed, the multiple images that report rainfall as an overlay in a rainbow spectrum–at a remove from the reasons for Houston’s vulnerability to flooding and the limits the region faces in flood control–in broadcast AccuWeather images of total rainfall in inches advance a metric that conceals the cognitive remove from the dangers of flooding, or a human relation to the landscape that the hurricane so catastrophically affected.  Can we peel under the layers of the data visualization, and create better images that appreciate the human level on which the landscape stands to be devastated by hurricane rains, as much as track the intensity of the growth of rainfall over time?


AccuWeather, Rainfall levels by Thursday


AccuWeather, Friday to Monday

Such layers of green, meant to suggest the intensity of rainfall that fell over land, reveal the concentration of water in areas closest to the Gulf of Mexico.  Even the most precise geographical records chart the dangers of flooding in the floodplain of southeastern Texas with little reference to the historical modification of the region by its inhabitants–


Dartmouth Flood Observatory at University of Colorado, Boulder/August 29, 2017

–and conceal the extent to which the landscape’s limited ground cover permeability has left the region far more susceptible to flooding, and elevated the risks of the emergency.  The problem of reading any signs of human presence into these “images” of precipitation provokes the further problem of disentangling remote sensing data from knowledge of the region’s recent urban growth and the consequent shift in local landcover.

The perspective of our relation to these events is often as fleeting as it is existential, as they flood us with data that we viewers have little perspective or tools to process fully.  The onrush of recent remote sensing maps batters us with an array of data, so much so as to lead many to throw up their hands at their coherence.  Even as we are still trying to calculate the intensity of damages in Puerto Rico–where electricity is returning so slowly that, even after four months, almost a third of its 1.5 million electricity customers still lack power–and the cost of fires in southern California, we look at maps, hoping to piece together evidence of the extensive collateral damage of global warming.  Yet we have still to come to terms with the intensity of the rainstorms that hit southeastern Texas–deluging the coast with rainfall that surpassed the standard meteorological chromatic scale, which so misleadingly seems to offer a transparent record of the catastrophe but omits and masks the experiences of people on the ground, digesting swaths of remotely sensed data that take the place of their perception and experience, and offering little critical perspective on the hurricane’s origin.

The rapidity with which rain overwhelmed ground cover permeability poses a challenge for mapping, as a symptom of both global warming and landscape modification: the mapping of “natural” levels of rainfall blurs the pressing problem of how shifting landcover has created an impermeability to heightened rains, and indeed how new patterns of habitation challenge the ability of the Gulf coast to absorb the prospect of increased rain in the face of decreasing groundcover permeability, and the extreme modification of the coastline that increasingly impedes run-off to the Gulf.
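
The permeability problem can be made concrete with the rational method, a textbook peak-runoff estimate (Q = C·i·A). The sketch below uses invented values for a one-square-mile catchment and generic runoff coefficients of the kind found in drainage manuals; it only illustrates how the coefficient C swings the result as pavement replaces prairie, not any calibrated model of Houston’s watersheds.

```python
# Rational-method peak runoff, Q = C * i * A (US customary units: Q in cubic
# feet per second, i in inches/hour, A in acres; the ~1.008 unit factor is
# ignored). All values are invented for illustration, not measurements of any
# actual Texas watershed.

CATCHMENT_ACRES = 640.0          # one square mile of drainage area
RAIN_INTENSITY_IN_PER_HR = 3.0   # a sustained heavy-rain intensity

# Representative runoff coefficients of the kind listed in drainage manuals.
SURFACES = {
    "undeveloped prairie / lawn": 0.15,
    "suburban residential":       0.40,
    "dense urban / paved":        0.85,
}

for surface, c in SURFACES.items():
    peak_cfs = c * RAIN_INTENSITY_IN_PER_HR * CATCHMENT_ACRES
    print(f"{surface:28s} C={c:.2f}  peak runoff ~ {peak_cfs:6.0f} cfs")

# The same storm over the same square mile yields roughly 288, 768, and 1632 cfs:
# paving the catchment multiplies the water the drainage system must carry.
```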

2.  Across much of southeastern Texas, a region whose growth was fed by hopes of employment in extractive industries, real estate demand and over-paving have unfortunately intersected with extreme weather in ways that data visualizations have had trouble exposing, but which raise the curtain on a coming crisis of the failure to accommodate increased levels of rainfall.  If the lack of precedent for the intense rainfall in Galveston Bay generated debate about introducing a new color that went beyond the rainbow scale employed in weather charts, what seemed a problem of the cartographic color-spectrum suggested a problem of governability, and indeed of government response to extreme weather conditions.  How to register the dangers of rainfall that goes off the scale or standards of measurement?

One must consider how to orient viewers to the intensity of the consequent flooding and its consequences, and how to better prepare ourselves for the arrival of deluging rains without falling back on the over-freighted metaphor of rains of biblical scope.  How many more hurricanes of increasing intensity can continue to pound the shores, whipping precipitation from increasingly warming waters and humid air?  The cumulative pounding of tropical cyclones in the Gulf stands to create a significantly larger proportion of lands lying underwater–temporarily submerged lands–with radically reduced possibilities of drainage, as hurricanes carry increased amounts of evaporated water from the humid air of the warming gulf across its increasingly overbuilt shores–in ways that mean the many tropical cyclones that have crossed the land-sea threshold since NOAA began tracking their transit in 1851 pose a new threat to the southeastern coast of Texas, and will force us to map the shifting relation between land and water not only in terms of the arrival of hurricanes, or cyclonic storms–

–but in terms of the ability of an increasingly overbuilt landscape to lie underwater as the quantity of Gulf coast rainfall stands to grow, overwhelming the overbuilt nature of the coast.

Most maps that chart the arrival and impact of hurricanes seem a form of climate denial as much as an accounting of climate change, locating the hurricanes as aggressive forces outside the climate, against a staid backdrop of blue seas, as if they were disconnected from it.  Months after the hurricane season ended, the damage the hurricanes caused has hardly been assessed, in what has been one of the years of greatest and most costly storm damage in the United States since 1980–including the year of Hurricane Katrina–and we have only begun to sense the damage that extreme weather stands to bring to the national infrastructure.  The costs of storm damage in previous years were not even close.

But distracted by the immediacy of data visualizations, and impressed by the urgency of the immediate, we risk being increasingly unable to synthesize the broader patterns of increased sea surface temperatures and hurricane generation, or the relations between extremely destructive weather events–overwhelmed by the excessive destruction of each, and distracted from raising questions about the extremely poor preparation of most overbuilt regions for their arrival, and indeed the extent to which regional over-building that did not take the possibility of extreme weather into account–paving large areas without adequate drainage structures or any areas of arable land–left inhabitants more vulnerable to intense rains.  For in expanding the image of the city without bounds, elasticity, or margins for sea-level rise, the increasingly brittle cityscapes of Galveston and much of the southeastern Texas shoreline were left incredibly unprepared for the arrival of hurricanes or intense rains.  Despite the buzz about an increased density of hurricanes having hit the region,

Major hurricanes, 1851-2013

the questions of how to absorb the hurricanes of the future, and the increased probability of rainfall from hurricanes in the Gulf of Mexico and along its shores, suggest questions of risk, danger, and preparation that we have no ability to map.  What, indeed, occurs when hurricanes themselves destroy the very means of transmitting on-the-ground information and sensing weather, and we rely exclusively on remote sensing?

Destroyed satellite dishes after Hurricane Maria hit Humacao, Puerto Rico  REUTERS/Alvin Baez

To characterize or bracket these phenomena as “natural” is, of course, to overlook the complex interaction between extreme weather patterns and our increasingly overbuilt environments, which have transformed the nature of the southeastern Texas coast, made the region an area of huge economic growth over time, and paved over much of the floodplain–as well as elevated the potential risks associated with coastal flooding on the Gulf Coast.  To be sure, any discussion of the Gulf of Mexico must begin from the increasingly unclear nature of much of our infrastructure across land and sea, evident in the range of gas and oil pipelines that snake along a once more clearly defined shore; as charted by ProPublica in 2012, they reveal the scope of a manmade environment that has both changed the relation of coastal communities to the Gulf of Mexico and been a huge spur to ground cover change.

The expansive armature of lines that snake from the region across the nation–


ProPublica, Pipeline Safety Tracker/Hazardous liquid pipelines are noted in red; gas in blue

–and whose tangle of oil pipelines extending from the very site of Galveston to the Louisiana coast can hardly be defined as “offshore” save as a fiction, so highly constructed are much of the national waters and submerged lands of the Gulf of Mexico–


ProPublica, Pipeline Safety Tracker/Hazardous liquid pipelines are noted in red

They indeed seem something of an extension of the land, and a redefinition of the shore, and reveal a huge investment by the offshore extractive industries that stands to change much of the risk that hurricanes pose to the region, as well as the complex relation of our energy industries to the warming seas.  Yet weather maps, ostensibly made for the public good, rarely reveal the overbuilt nature of these submerged lands or of the Gulf’s waters.

Despite the dangers posed by such an extensive network of hazardous liquid lines along the Gulf of Mexico, the confusion between mapping a defined line between land and water and visualizing the relations of extreme weather disturbances such as hurricanes in the Gulf of Mexico to local infrastructure haunts the extremely thin nature of the data visualizations that are generated about the dangers of hurricanes and their landfall in the region.  For all too often, they presume a stable land/sea divide, removed from the experience of the region’s inhabitants and from how we have remade the shore.

3.  How can we better integrate both a human perspective on weather changes and the role of human-caused conditions in maps of extreme weather?  How can we do better by going beneath the data visualizations of record-breaking rainfall, to map the human impact of such storms?  How could we better chart the infrastructural stresses and the extent to which we are ill-prepared for extreme weather systems whose impact multiplies because of the increased impermeability of the land, unable to absorb excessive rainfall, and because of beds of lakes and reservoirs that cannot accommodate the increased accumulation of rainfall that stands to become the new normal?  The current spate of news maps that provoke panic by visualizing the extremes of individual cases may only inspire a sort of data vis-induced ADD, distracting us from infrastructural inadequacies in the face of the effects of global warming–and leaving us at a loss to guarantee the best structures of governability and environmental readiness.

Indeed, the failure to map accurately the relations between land cover, storm intensity, rainfall, flooding, and drainage increases the dangers of poor governance.  There is hardly need of a reminder of how quickly the inadequate mapping of coastal disasters turns into an emblem of bad governance.  There is the danger that, overwhelmed by the existential relation to each storm, we fail to put them together with one another; compelled to follow patterns of extreme weather, we risk being distracted not only from the costs but from the human-generated nature of such shifts between extremes of hot and cold.  For as we focus on each event, we fail to integrate a more persuasive image of how rising temperatures stand to create an ever-shifting relation between water and land.
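One way to make that relation concrete (if only as a thought experiment, not any agency’s actual workflow) is to overlay gridded rainfall with land-cover imperviousness, so that runoff rather than rainfall alone flags where flooding pressure concentrates.  The sketch below uses synthetic arrays as stand-ins for the rasters one might pull from sources like NOAA precipitation grids and the National Land Cover Database; the grid values, the crude runoff proxy, and the top-decile threshold are all illustrative assumptions.

```python
# A minimal sketch, assuming synthetic stand-ins for real rasters: combine
# gridded storm-total rainfall with land-cover imperviousness to flag cells
# where runoff, not rainfall alone, is likely to drive flooding.
import numpy as np

rng = np.random.default_rng(0)
rainfall_in = rng.gamma(shape=2.0, scale=5.0, size=(100, 100))   # storm-total rainfall, inches (synthetic)
impervious_frac = rng.uniform(0.0, 1.0, size=(100, 100))         # fraction of paved or roofed ground (synthetic)

# Crude runoff proxy: rain falling on impervious ground cannot infiltrate.
runoff_proxy = rainfall_in * impervious_frac

# Flag the top decile of the runoff proxy as candidate hot spots.
threshold = np.quantile(runoff_proxy, 0.9)
hotspots = runoff_proxy >= threshold

print(f"{hotspots.sum()} of {hotspots.size} cells flagged as runoff hot spots")
```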

Provoked by the rhetoric of emergency, we may need to learn to distance ourselves from the aerial views that synthesize intense precipitation, tally hurricane impacts, or chart snowfall levels, and view them less as individual “strikes” or events, orienting ourselves instead to a broader picture that puts us in a less existential relation to extreme weather.


The Weather Channel

We surely need to establish distance to process syntheses of data in staggering aerial views of cloud swirl, intense precipitation, and snowfall, and work to peel back their striking colors and bright rainbow spectra, forcing ourselves to focus not only on their human costs, or their costs in human life, but on their relation to a warming planet and the role of extreme weather in a rapidly changing global climate–rather than merely tracking the “direct strikes” of individually named hurricanes, as if they were marauders of our shores.  Their creation is tied as much to the changing nature of our shores and warming sea-surface temperatures; in trying to create a striking visualization, we deprive ourselves of detecting broader patterns that offer better purchase on weather changes.


The Weather Channel

If the patterns of weather maps epitomized by Accuweather forecasts and projections suggest an exhilaratingly Apollonian view of global and regional weather patterns, they also shift attention from a broader human perspective in quite deeply pernicious ways.  Such maps provided the only format for grasping the impact of what happened as the hurricane made landfall, but offered little sense of the scale of inundations that shifted, blurred and threatened the coast of the Gulf of Mexico.  They provide a format for viewing floods that is disjoined from victims, and seem to naturalize the quite unnatural occurrence of extreme weather systems.  Given the huge interest in grasping the transformation of Hurricane Harvey from a tropical storm to a Category Four hurricane, and the huge impact a spate of Category Four hurricanes have created in the Gulf of Mexico, it’s no surprise that the adequacy of the maps of Hurricane Harvey has been interrogated as if they were hieroglyphs or runes of a huge weather change:  we sift through them for a human story that is often left opaque behind bright neon overlays, whose intensity offers only an inkling of a personal perspective on the space or scale of their destruction on the ground.  While data maps provide a snapshot of the intensity of rain levels or wind strength at specific sites, it is difficult but important to remember that their concentration on sites provides a limited picture of causation or complexity.

All too often, such maps fail to offer an adequately coherent image of disasters and their consequences, and indeed to parse the human contributions to their occurrence.  This post might be divided into multiple subsections.  The first sections suggest the problems of mapping hurricanes in the Gulf of Mexico in relation to flooding, in data visualizations of the weather and of the overbuilt region; the middle of the post turns to an earlier poetic model for considering the relation between land and sea that visualizations all too easily obscure, and the meaning that the poet Elizabeth Bishop found in viewing relations between land and sea in a printed map of the Atlantic; after returning to the question of how the overbuilt shore compounds the problems of visualizing the Texas coast, the final section, perhaps its most provocative, returns to Bishop’s reading of a map of the Atlantic coast.

What such new weather maps would look like is a huge concern.  Indeed, as we depend on weather maps to orient ourselves within the inter-relations of climate change, sea level, surface temperatures, and rain, weather maps cease to orient us to place alone; when best constructed, they help describe the changing texture of weather patterns in ways that can familiarize us not only with weather conditions but with the responses climate change demands.  For three months after the hurricanes of the Gulf of Mexico caused such destruction and panic on the ground, it is striking not only that few funds have arrived to cover the costs of rebuilding or insurance claims, but that any judgement or understanding of the chances of future flooding has almost left our radar–perhaps pushed rightly aside by the firestorms of northern and southern California, but in ways that troublingly fail to assess the extent of floods and groundwater impermeability along the Texas and Louisiana coast.  Preparation for future coastal hurricanes off the Gulf of Mexico raises problems of hurricane control and disaster response that seem linked to problems of mapping their arrival–and of framing the response to the increasing rains dumped along the entire Gulf Coast.

Indeed, the chromatic foregrounding of place in such rainbow color ramps based on GPS obscures other maps.  Satellite data of rainfall are removed from local conditions, and serve to erase complex relations between land and water and the experience of flooding on the ground–by suggesting a clear border between land and sea, and indeed mapping the Gulf of Mexico as a surface unrelated to the increased flooding around Houston in maps prepared from satellite imagery, despite the uneasy echoes of anthropogenic causes for the arrival of ten hurricanes in ten weeks, which suggest how warming waters contributed to the extreme inundation of the Gulf Coast.  Despite NOAA predictions of a 45% likelihood of ‘above-normal’ activity for the 2017 Atlantic hurricane season, with a 70% likelihood of storms that could transform into hurricanes, the images of inundated lands seem both apocalyptic and carefully removed from the anthropogenic changes, to either the ocean or the land, that so dramatically intensified their occurrence on the ground.

Dartmouth Flood Observatory Flooding Harvey

 Dartmouth Flood Observatory


Dartmouth Flood Observatory/August 29, 2017

Is it possible to recuperate the loss of individual experience in such data maps, or at least acknowledge their limitations as records of the complexity of a changing climate and the consequences of more frequent storm surges and such inundations of rainfall?  As we sought to understand the disaster relief efforts through real-time maps of the effects of Hurricane Harvey as it moved inland from the Gulf of Mexico, shifting from a tropical storm to a Category 4 hurricane, and tried to grasp levels of rainfall spun out of 115-mile-an-hour winds across southeastern Texas that damaged crops, flooded fields, ruined houses, and submerged cars, we scanned stories in hope of clues to assess our position in relation to the increasingly dangerous weather systems whose occurrence they may well forebode.  At a time when attention to extreme weather has long been growing, the gross negligence of climate change denial is increasingly evident:  it recalls the earlier denial of any relation between hurricanes and climate change, when increased hurricanes were cast as “the cycle of nature,” rather than as consequences whose effects have in fact been broadly intensified by human activity.

Current attempts to map the toll of record-smashing hurricanes, focused almost exclusively on point-based data, view rainstorms largely as land-based records; even as they intend to monitor the effects of Harvey’s landfall by microwave sensors, they risk isolating real-time rainfall levels from the mechanics of the warmer air and sea-surface temperatures that result from human-caused global warming, failing to relate increased storm surges or inundations to changes in coastal environments or to climate change.  To render such changes as natural–or only land-based–is irresponsible in an age of reckless levels of climate denial.  Indeed, faced with the proliferation of data visualizations, part of the journalistic difficulty or quandary is to integrate humanistic or individual perspectives on the arrival of storms.  Rendered in stark colors in the increasingly curtailed ecosystems of newsrooms, simplified visualizations of satellite data on the disaster fail to note the human contributions to travails that are often reserved for photographs, which increasingly afford opportunities for disaster tourism in the news, emphasizing the spectator’s position before disasters, and underscore the difficulties of processing or interpreting the proliferation of data from MODIS satellite feeds:  we can show the ability to measure the arrival of torrential rains, but we offer few legends, save the date and scale, and few keys to interpret the scale of the disaster.
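A small illustration of what a more legible key might look like, purely a sketch and not a proposal for any newsroom’s house style, is to bin storm-total rainfall into classes that carry human meaning rather than letting a continuous rainbow ramp speak for itself.  The grid below is synthetic, and the class boundaries are assumptions chosen only for legibility, not an official flood classification.

```python
# A sketch of a legend keyed to human-relevant classes rather than a smooth
# rainbow ramp; the rainfall grid is synthetic and the class boundaries are
# illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import BoundaryNorm

rng = np.random.default_rng(1)
rain = rng.gamma(shape=2.0, scale=8.0, size=(60, 60))   # synthetic storm totals, inches

bounds = [0, 5, 10, 20, 30, 50]          # inches; chosen for legibility only
cmap = plt.cm.Blues
norm = BoundaryNorm(bounds, cmap.N)      # map each bin to a distinct band of blues

fig, ax = plt.subplots()
im = ax.imshow(rain, cmap=cmap, norm=norm)
cbar = fig.colorbar(im, ax=ax, ticks=bounds)
cbar.set_label("storm-total rainfall (inches)")
ax.set_title("Binned rainfall with an interpretable key (synthetic data)")
plt.show()
```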

The looming portent of human-made climate change, however, underlies the poor predictions NOAA offered of perhaps 2-4 major hurricanes this Spring, and the lack of a new director for NOAA–on which local and state agencies depend to monitor the nation’s shores and fisheries–from June to September left states on their own to make decisions and plan for disaster mitigation programs and better flood maps.  (The danger of the newly nominated director, Barry Myers, a strong supporter of the privatization of weather maps and an executive at the private Accuweather mapping service, suggests the difficulty of determining the public-private divide in an era of neoliberalism, and of a free market in weather maps that were once seen as central to national security and standards of safety.)  The hidden scales on which we read these opaque maps of global warming, globalization, and local inundation are triply frustrating.  For all the precision and data richness of such point-maps of largely land-based rainfall, local temperature, or flooding, the biases of such instantaneous measurements seem to fit our current governing atmosphere of climate change denial, and are dangerous in erasing how such storms are informed by the long-term consequences of man-made climate change.  (As the mapping tools of coastal weather seem destined to change, what sort of change in direction for NOAA coastal maps do we want?  The appointment suggests the terrifying possibility of a return to the Bush-era proposal, which nominee Myers supported, that would have prohibited the agency from producing any maps already available in the private sector, threatening to let federal weather lines go dark–lest they literally compete with the ad-supported websites of private providers–and to shift federal information offline.)

The future readability of weather maps may well be at stake in critically important ways.  The 2005 proposal that Myers backed, which would have eliminated the National Weather Service even while exempting those forecasts needed to preserve “life and property,” would in essence have returned the weather services to a pre-internet era, even as the most active hurricane season on record–fifteen hurricanes and twenty-eight named storms, including the infamous Hurricane Katrina–began on the Gulf Coast.  The proposed bill would have prevented NOAA from posting open data, readily available to researchers and policymakers in ad-free formats, free of popup screens, that lets them make their own maps on the fly; ending such good practices of posting climate data would work quite dangerously to prevent the development of tools of data visualization outside commercial models that render storms and hurricanes as if environmentally isolated.


A deeper problem with providing such limited weather maps of tropical storms may be the subtexts they convey about the relation of human causes to weather, and the absence of a greater narrative of the transformation of a global ecology or of the ecology of the Gulf Coast.  The curtailed images of “nature” they present–symbolizing rains, winds, floods, or submerged regions in appealing hues as natural–raise questions about the odd simplicity of the absent storylines:  cheery colors erase or bracket complex questions of climate change, the human contribution to extreme weather events, or the human experience of suffering on the ground.  Rita, Cindy, Katrina, Dennis, and Wilma seem not part of the environment but epiphenomenal interlopers moving across a static deep blue sea, in an apparent dumbing down of the mechanics of hurricane or storm formation into a rainbow spectrum removed from a human-made environment.



Filed under anthropogenic change, climate change, coastlines, ecological disasters, gulf coast

The Terror of Climate Change: Uncorking Bombs of Streaming Snow in the Atlantic

Even in an era when waking up to weather bulletins provides a basic way of orienting oneself to the world, the arrival of the bomb cyclone in the early morning of January 4, 2018, along the east coast of the United States, commanded a certain degree of surprise.  For those without alerts on their devices, the howling winds that streamed through the streets and rose from rivers provided an atmospheric alert of the arrival of streams of arctic air and snow, creating “white-outs” on highways along much of the eastern seaboard that paralyzed traffic and reminded us of the delicate balance on which much of our infrastructure rests.

The arrival of a low pressure system in the western Atlantic created effects that cascaded across the nation, setting temperatures plummeting and winds spewing snowfall as the extratropical cyclone was displaced off New England, propelling snow over the east coast from what was an offshore weather disturbance.  While the “bomb cyclone” sounded portentous, its actual explosiveness was perhaps not felt at its eye over the Atlantic–

 


 

–as the bomb-like burst of pressure scattered snows through howling winds across much of the coast, but rather in the unbalanced distribution of snowfall across the nation that it so quickly created.  The cyclonic winds of the “weather bomb” could not be localized:  their effect was to set off a burst of precipitation, chilled by arctic airs, reminding us of the delicate relation between land and sea in an era of climate change, when we are apt to feel the effects of colliding air masses across the country, as far as Tennessee or Ohio.
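The “bomb” in “bomb cyclone” does have a conventional meteorological meaning:  explosive cyclogenesis is usually defined as a fall in central pressure of roughly 24 hPa in 24 hours (the formal definition adjusts for latitude, which this sketch skips).  The pressure series below is invented for illustration, not the observed January 2018 track.

```python
# A minimal check of the "bomb" criterion: the largest central-pressure fall
# over any 24-hour window, compared with the ~24 hPa threshold.  The series
# below is invented for illustration (one reading every three hours).
pressure_hpa = [1002, 998, 993, 987, 980, 972, 965, 958, 952]

def max_24h_drop(pressures, hours_per_step=3):
    """Largest pressure fall over any 24-hour window in the series."""
    steps = 24 // hours_per_step
    drops = [pressures[i] - pressures[i + steps] for i in range(len(pressures) - steps)]
    return max(drops) if drops else 0

drop = max_24h_drop(pressure_hpa)
print(f"largest 24-hour deepening: {drop} hPa -> "
      f"{'explosive cyclogenesis' if drop >= 24 else 'not a bomb'}")
```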

The bomb created a deep oceanic disturbance in the dissonance of sea-surface and air temperatures, and triggered the increasing imbalances of the distribution of snow across the nation, as if inaugurating an era of the increasingly unequal levels of snowfall, as a bomb that seemed to burst over the Atlantic sent snowfall flying across the east coast–

 


 

–in ways that led to deep disparities in snow and ice levels across much of the country, leaving much of the nation’s western states surprisingly free of snow, which grew increasingly rare save in several spots.

 

January-18-Snow-Analysis

 

 

 

 

The bomb cyclone spread across a broad surface of the eastern seaboard and Gulf of Mexico, as the areas that stand to be opened to gas and oil speculation suddenly took a far greater hit than was expected, raising questions about the arrival of extreme weather systems as sea-surface temperatures grow:  the kink in the Gulf Stream created a swirl that sucked in arctic air and spread clouds of snowfall across the eastern seaboard as the seas became incredibly stormy, driven by hurricane-force winds.  The bomb cyclone wasn’t a major disaster, but it seems a wake-up call for the charting of minerals stored in the seabed of offshore areas of the Outer Continental Shelf off the United States–the “federal lands” that the government, since it administers them directly, has decided it may as well start to lease.

 

Precipitation Column Rising from Offshore Winds, January 2-4/Ryan Maue



Filed under energy independence, environmental change, Global Warming, oceans, remotely sensed maps

The Cognitive Clouding of Global Warming: Paris and Pittsburgh; Creditors and Debtors

The argument of America First seems to have been extended to its logical conclusion as the apparently selected President of the United States has single-handedly subtracted the nation from a map of climate change.  By denying the place of the United States in the Paris Climate Accords, President Trump seems, in the most charitable interpretation, to have acted on his own instincts for what was the benefit that accrued to the country in the very short term, and after looking at the balance books of the United States government for what might have been the first time, decided that America had no real part in the map of the future of a warming world.  Rather than outright denying global warming or climate change, Trump decided that the conventions established to contain it by the world’s nations had no immediate advantage for the United States.  The result wasn’t really to subtract the United States from the ecumene, but from the phenomenon or at least the collective reaction of the world to climate change, and openly declare the supremacy of his own personal opinion–as if by executive fiat–on the matter.

The position he advanced was so personal, perhaps, as to be presented in terms of his own clouded thinking on the matter, or at least seized on to create what he saw as a wedge between national constituencies, and to use wildly incommensurate forms of data to create the impression of his own expertise on the issue–and to mislead the nation.  Donald Trump took advantage of the Presidential podium to diss the Paris Accords in a torrent of alliteration, as resting on a “cornucopia of dystopian, dishonest and discredited data.”  Even if one wants to admire the mesmerizingly deceptive alliteration, the notion of rooting an initial response to planetary climate change in the perspective of one nation–the United States of America, which produced the lion’s share of greenhouse gasses–is only designed to distort.  By pretending to unmask the Paris Accords as in fact a bum economic deal for the United States, as if they were solely designed to “handicap” one national economy, Trump set a sad standard for the values of public office.  With mock-rage he dismissed data on climate change as discredited, and vowed that the entire affair had been designed by foreign groups who had already “collectively cost America trillions of dollars through tough trade practices” and desired to continue to inflict similar damage.

But by focusing on trade imbalances–which he treated as the bottom line–he staged a spectacle of being aggrieved that seemed to take on the problems of the nation, with little sense of what was at stake.  Trump’s televised live speech was preeminently designed to distract from the data on which the Accords had been based.  And even as Trump sought to pound his chest by describing the Accord as a “bad deal for Americans” that worked in truth “to the exclusive benefit of other countries,” he turned attention to an America First perspective on global warming, seeking to replace the international scope of the challenge–and the intent of the much-negotiated Climate Accords–by suggesting that it obscured American interests, even if it only took America’s good will for granted.  Explaining to his televised audience that the agreement only “disadvantages the United States in relation to other countries,” with the result of “leaving American workers–who [sic] I love–. . . to absorb the cost in terms of lost jobs [and] lower wages,” he concealed the actual economics of withdrawing from the Accords beneath boasts to have secured “350 billion of military and economic development for the US” and to help American businesses, workers, taxpayers, and citizens.  In dismissing out of hand the data about the expanded production of greenhouse gasses, Trump ridiculed the true target of the nearly universally approved Accords, scoffing at their ability to reduce global temperatures; instead, he concentrated on broad figures of lost jobs in manufacturing and industries that are in fact small sectors of the national economy, and incommensurable with the dangers of ignoring global warming and climate change, or the exigencies of taking steps to counter its recent growth.

 


Increased likelihood of temperature rising above previous records by 2050 and 2080

 

Sea Surface Temperatures compared to historical baseline of a century ago

 

As if years of accumulated earth observation data could be dismissed as deceptive out of hand by executive authority, independent of any accurate judgement of its measurement, Trump dismissed expert opinion with the air of a true populist whose heart lay in the defense of the American people and their well-being–as if they could be abstracted and prioritized above the world’s.  Trump’s largely rambling if gravely delivered comments in the Rose Garden press conference, painting himself as daily fighting for the country, cemented the alliance of populism and a war on science by its odd substitution of bad economic data for good scientific data.  The switch is one in which his administration has specialized.  His address certainly culminated in an outright dismissal of scientific conclusions based on a distorted America First picture of the world, where a stolid declaration that “the United States will withdraw from the Paris Climate Accords” made sense as a form of national defense–despite the potential global catastrophe that rising global temperatures and sea surface temperatures threaten.

The catastrophes were minimized by being argued to rest on “discredited data,” in a bizarre flourish designed to dismiss scientific consensus.  Trump conspicuously faulted not only the “discredited” but the distracting nature of data in the speech he gave in the Rose Garden on June 1, 2017, that supposedly justified his announcement of withdrawing from the 2015 Paris Climate Accords to limit the heat-trapping emissions of carbon fuels that have been tied to observed climate change.  Rather than foreground the international nature of accords agreed upon by almost 200 nations, Trump advanced the need to heed local interests, and even more perversely argued that the Accords resulted from disinformation.  He spoke to the world to chastise its recognition of scientific observations, in so doing destabilizing global alliances and undermining a long-negotiated climate policy, pulling the rug out from the long-accepted consensus not only of climate scientists but of a role of national leadership that sought to remedy the failure of the Kyoto Protocol of 1997.  Trump turned his back on the Climate Accords on curbing greenhouse gas emissions by proclaiming their unfairness to American interests, and by attacking unwanted constraints on American industry, through his own deployment of data that was even more discredited, as an excuse to walk away from the prospect of a greener world.

 

Al Drago/New York Times

 

If Trump steered the nation away from green energy and into darkness, Vladimir Putin seemed to mock Trump’s rationale for the withdrawal when he mused, jokingly but ever so darkly, that “maybe the current [U.S.] president thinks they are not fully thought-through,” making open fun of Donald Trump’s image of global leadership in ways that echoed the absurdity of Trump’s defense of the local in place of the global.  “We don’t feel here that the temperature is going hotter here, . . . I hear they are saying it snowed in Moscow today and it’s raining here, very cold,” Putin noted, as if relishing undermining long-established trends in climate data by invoking a populist championing of local knowledge, as if it trumped the advantages of earth observation that satellites have long provided.  Populism trumped expertise, and Putin laughed at the possibility that the Accords might soon fail as a result.

Given the longstanding desire of Moscow to be released from constraints on exploring the billions of tons of Arctic oil on which Russia has chosen to gamble, Trump’s almost purposive blindness to the changing environmental politics of the global economy astounds for its parochialism, and for its championing of place to dismiss the undeniable effects of climate change that seem closely tied to carbon emissions.  For with a false populism that championed the limited perspective of one place in the world–or of one’s own personal experience–Trump dismissed the maps and projections of climate change on the basis that the “deal” was simply “BAD.”  And as a man who views everything as yet another deal, he pronounced himself ready to “renegotiate” an accord he sought to cast as a failure of President Obama to represent America’s interests; the rebuke fell flat, as the accord was never designed to be renegotiable.

Putin’s remarks were met by scattered laughter of recognition, and by some smirks at the decision of the American president to withdraw from a long-negotiated set of accords, to the collective dismay of our military and environmental allies, and at its implicit endorsement of deniers of climate change.  The potential “axis of mass destruction” France’s climate minister has cautioned against might indeed be one of mass distraction.  For in dismissing and indeed disdaining the historic accords to limit carbon emissions, Trump sought a soundbite sufficient to stoke suspicions of the climate treaty, casting it as yet another deeply rigged system of the sort he had taken to compulsively warning Americans about.  Such a metaphor of bounty–the “cornucopia” of discredited data–was jarring to reconcile with the onerous economic burdens cited as the prime motivations for rejecting the Paris Accords on Climate Change; the jarring cognitive coinage seemed to connote its negative by a disorienting litotes.  But perhaps the most striking element of the entire news conference was that Trump offered no data to back up his own pronouncements and his appearance of steadfast, or only obstinate, personal resolve.

Before the coherence of the embodiment of climate change in maps, Trump jarringly juxtaposed radically different sorts of statistics to snow the nation–and the world–by disorienting his audience, turning to a litany of complaints and perceived offenses striking for providing no data of any sort, save several bits of false data.  Trump betrayed an uneven command over the data on climate change, embedding discrete numbers in unclear fashion as if they supported a self-evident argument, and as if they addressed one of the most carefully documented changes in the atmosphere of the world.  He juxtaposed a threat that the Accords “could cost Americans as much as 2.7 million lost jobs by 2025“–a number described as extreme but decontextualized to exaggerate its effect, framed by the dismissive statement “Believe me, this is not what we need!“–with a projected small temperature decrease of two tenths of a degree Celsius–“Think of that!  This much”–as if to indicate the minuscule return that the “deal” offered the United States that would have made it worth accepting its costs–

 


 

The gesture seemed designed to juxtapose the honesty of direct communication with the deceit of the experts.  Trump’s notion of direct communication concealed the surreal enjambment of disproportionate numbers, more striking for the difference of their scale than their meaning.  It is of a piece with his citation of partial statistics that exaggerate his points, from the “95 Million not in the U.S. labor force”–as if to imply they are all unsuccessfully looking for work–to the targeting of some 8 million immigrants as “illegal aliens” ready for deportation, or the claim that immigrants cost American taxpayers “billions of dollars a year.”  Such large figures deploy discredited data difficult to process in order to conjure fears by overwhelming the audience, distracting from specific problems with large numbers that communicate an illusion of expertise, or even overwhelming judgment with talking points disseminated in deeply questionable media sources.

If the power of this juxtaposition of unrelated numbers gained its effectiveness from a lack of numeracy–Trump’s claim of 100 million social media followers lumps together his followers on Twitter, Facebook and Instagram, many of whom may be the same people, and other fake personae–the numbers seem to exist for their rhetorical effect alone, as if to awe by their size and to dismiss by the minuscule benefits they might provide.  The point of contrasting such large and small statistics was to suggest the poor priorities of the previous administration, and to distract from the consensus reached on the modeling of climate change.  To be sure, the Trump administration also barters in fake facts on Fox News Sunday, inflating the number of jobs in coal industries in ways that convey a misleading sense of the government’s relation to the national economy and generating a range of falsehoods that disable fact-checking, obscuring the fact that the global marketplace increasingly gives preference to cleaner energy, and that clean energy jobs grow more quickly than other sectors of our national economy beyond the energy industries.  The ties of Trump’s administration to fossil fuels–from the Secretary of State to the Secretary of Energy to the Secretary of the Interior on down–employ the obfuscating tactics of fossil fuel industries to obscure the benefits of low-carbon fuels.  Indeed, the inability to “renegotiate” a deal in which each nation set its own levels of energy usage rendered Trump’s promise of renegotiation meaningless and unclear, even if it was intended to make him sound reasonable and amiable enough on the nightly television news.

 

Cheriss May/Sipa via AP Images

 

Another point of the citation of false data was to evoke a sense of false populism, by asking how the Accords could ever add up.  In isolating foregrounded statistics great and small, tightly juxtaposed for rhetorical effect, the intent seems consciously to bombard the audience to disorienting effect.  We know Trump has disdain for expertise, and indeed the intersection of a sense of populism with the disdain or rejection of science may be endemic in formulating responses to a global question like climate change, with which he has had no familiarity save in terms of margins of profit and regulations.  Rather than consulting experts, the President has prepared for public statements by consulting sympathetic media figures like Kimberly Guilfoyle who endorse climate conspiracy–and not experts–and who use data as obscuring foils, suggesting an ecology of information originating from pro-fossil fuel industry groups.

But as much as he adopts talking points from other media, Trump uses data to frame overstatements of unclear relation to actualities–such as the distorting and meaningless promise to drop power plant climate rules, clean water rules and other regulations to “help American workers, increasing wages by more than $30 billion over the next seven years”–a figure drawn from a fossil fuel industry nonprofit, which offered little grounds for such a claim, and which was a cherry-picked large number offered without any contextualization–or consideration that $30 billion would not fill the pockets of 300 million.  The point of allowing workers to continue to fire coal without hoping to meet any guidelines for carbon emissions did secure the roughly 50,000 jobs in coal mining in the US, but it seems out of synch with the decline of demand for coal world-wide.
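It is worth doing the arithmetic the speech never did:  spread over the population and the seven years of the promise, the headline figure shrinks to pocket change.  The division below uses only the numbers quoted above, with a rough 300 million standing in for the U.S. population.

```python
# The quoted figures, and nothing more: $30 billion in claimed wage gains,
# spread over roughly 300 million people and the seven years of the promise.
total_claim_usd = 30e9
people = 300e6          # rough U.S. population, as invoked above
years = 7

per_person_per_year = total_claim_usd / people / years
print(f"${per_person_per_year:.2f} per person per year")   # roughly $14
```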

 

 

Such numbers offer a scaffolding for many of Trump’s claims, but as talking points they serve to disorient as much as instruct, pulling attention away from a global perspective, and they became the basis for pushing the groundless withdrawal from the Paris Accords.  Many of the talking points that migrate from right-wing news sites into Trump’s public speeches are culled from the unsourced ecosystem of the internet and may be drawn from a special dossier that arrives on his desk, as Shane Goldmacher has suggested:  circulated in the White House to feed Trump’s personal appetite for media consumption, many are dislodged from their original contexts, some neither substantiated nor fact-checked, and they are printed and placed on his desk in the Oval Office, effectively introducing dissembling as much as dissenting information into Trump’s significantly reduced three-page Presidential Daily Briefing.

Such a new information economy that defines the Oval Office in the Age of Trump makes it less of a nexus of information-sharing from scientific communities.  It rather serves to introduce information designed to swamp existing facts–such as the eight-inch rise in sea levels since 1880, the catastrophic floods on course to double by 2030, the economic disparities of the global footprints of different parts of the world, and the only recently recognized ecological debts that patterns of consumption generate globally.

 

Eco deficits

creditors and debtors

 

It is almost difficult to tell whether the jarring incommensurability of great and small numbers that Trump cited in his Rose Garden press conference was intentional–a strategy designed to mystify, as some have cautioned–or a sort of cognitive dissonance between his ingrained skepticism before data and his belief in his own powers to resolve a problem of any size.  It may well be a combination of both:  but the history of long-term measurement of climate change suggests a perfect storm between his own doubting of data and persuasive skills, his outsized sense of his abilities to resolve an issue of such magnitude, and his inability to acknowledge that the United States had any need to recognize a debt it owed anyone.

The very overflow and abundance of data on global warming and climate change, in this context, threw down a gauntlet and raised a challenge to be dismissed and negotiated around in ways that did not depend on scientific observations, but would reflect his own ability to get a better deal for the United States alone–a perverse impulse to isolationism in response to one of the greatest consequences and challenges of globalization, climate change, and to the particular problems faced by developing countries and by nations defined as biocapacity debtors.  Indeed, in separating the nation from a pact between developing and developed countries on energy use and fossil fuel emissions, the withdrawal of the largest developed nation from the Accords–under the pretense that its interests were not respected enough–unsettles the very prospect of a global compact, leaving the United States alongside only the one other nation that had held out for stricter emissions guidelines.

 

Developed and Undeveloped Nations Signed onto Paris Climate Accords/Washington Post



Filed under Donald J. Trump, Global Warming, globalism, globalization, statistics

Mapping Our Shrinking Shores

Coasts have provided the primary cartographical invention for understanding the risks that erosion poses to property:  the coastline is the boundary of the known land, and determines the outer bound of real estate.  But the coastal fixation of the landlubber privileges the illusion of the fixity of the shore.  More than ever, assumptions about the fixity of shorelines must fall away.  Perhaps the most haunting takeaway from the Surging Seas web-based map of global shorelines is that it forces us to take into account the inevitable mutability that must be accepted with the rising ocean levels associated with climate change.

The web map presents itself as a set of tools of analysis, as much as cartographical techniques, by which sea level–which has already risen globally some eight inches since 1880–stands to accelerate, emphasizing the alternate scenarios that the acceleration of sea-level rise stands to bring over the next hundred years and introducing a new concept of risk due to coastal flooding.  The availability of accurate GPS measurements of the elevations of homes has made it possible to sketch scenarios of sea-level rise as readily zoomable maps of elevated ocean levels that confront us with at least the image of the options we still theoretically have.  The contrasting futures created in this cartographical comparison shock viewers with a salutary sort of operational paranoia, only increased as one fiddles with a slider bar to grant greater specificity to the disastrous local consequences of rising sea levels world-wide.
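The core logic behind such sliders is simpler than the interface suggests:  a so-called bathtub model reclassifies every cell of an elevation grid at or below the chosen water level as inundated.  Real tools like Surging Seas add tidal datums, levees, and hydrological connectivity; the sketch below, run on a synthetic elevation grid, shows only that core idea.

```python
# A "bathtub" sketch on a synthetic elevation grid: every currently dry cell
# at or below the chosen sea-level rise counts as inundated.  Real tools add
# tidal datums, levees, and hydrological connectivity.
import numpy as np

rng = np.random.default_rng(42)
elevation_ft = rng.uniform(-2, 30, size=(200, 200))   # synthetic coastal terrain, feet above current sea level

def inundated_fraction(dem_ft, sea_level_rise_ft):
    """Fraction of currently dry land at or below the raised sea level."""
    dry_now = dem_ft > 0
    flooded = dem_ft <= sea_level_rise_ft
    return np.logical_and(dry_now, flooded).sum() / dry_now.sum()

for rise in (2, 4, 7):   # the two-to-seven-foot range discussed in this post
    print(f"{rise} ft of sea-level rise floods {inundated_fraction(elevation_ft, rise):.1%} of dry land")
```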

Shanghai

In ways quite unlike the wonderfully detailed old NOAA Topographic Surveys, or T-Sheets, which map shorelines at regular transects, recording the high waterline of tides across 95,000 coastal miles and 3.4 million square miles of open sea, the coastline is less the subject of these web maps than levels of potential inundation.  In a negative mapping of the possibilities of human habitation, blue hues invade the landscape in a monitory metric emphasizing the regions at risk of being underwater in a century.  Whereas scanned T-Sheets can now be viewed with a historical time-bar slider, the fixity of space or time is less relevant to the web maps than the gradients of possible sea-level rise that carbon emissions might force us to confront.

Surging Seas forces us to confront the possibilities of a future underwater world.  The infiltration of a deep shade of blue commands the eye by its intensity, deeper shades signifying greater depth, in ways that eerily underscore the deep connection to the sea of all the land we are apt to turn our backs on in most land maps, showing the extent to which a changing world will have to familiarize itself with water-level rise in the not-distant future.  It’s almost paradoxical that the national frontiers we have inscribed on maps have until recently effectively made such a global view impossible; but the attraction of imagining the somewhat apocalyptic possibility of sea-level rise seems almost to map a forbidden future we are not usually allowed to see.  It has a weirdly pleasurable (if also terrifying) aspect:  viewing the extensive consequences of what might be, with a stunning level of specific and zoomable local detail we would not otherwise be able to imagine, in what almost seems a fantasia of the possibilities of mapping an otherwise unforeseen loss, not to speak of the apparent lack of coherence of a post-modern world.

For the variety of potential consequences of disastrous scenarios of sea-level rise can be readily compared with surprisingly effective and accurate degrees of precision, in maps that illustrate the depths at which specific regions stand to be submerged should sea-level rise continue or accelerate:  zooming into neighborhoods one knows, or cities with which one is familiar, the rapid alteration of two to seven feet in sea level can be imagined–as can the fates of the some 5 million people worldwide who live less than four feet above sea level.  For if the shores have long been among the most crowded and popular sites of human habitation–from New York to London to Hong Kong to Mumbai to Jakarta to Venice–the increasing rapidity of polar melting due to climate change stands to produce up to a seven-foot rise in sea level if current rates of carbon emissions continue, and a four degree centigrade rise in global temperature stands to put the homes of over 450 million people underwater–a number that even the most aggressive cuts in carbon emissions might lower only to 130 million, if warming is limited to but 2°C.  (If things continue as they stand, the homes of some 145 million people who currently dwell on land in China alone are threatened with inundation.)

The recent review of the disastrous consequences of a rise of two degrees Centigrade on the land-sea boundary of the United States led Climate Central to plot the effects of a sea-level rise of at least 20 feet on the country–and to foreground those regions most at risk.  The webmap serves as something like a window into the possible futures of climate change, whose slider allows us to create the elevations in sea level that the ongoing melting of the polar ice cap seems poised to create.  As much as offering compare-and-contrast catastrophes, the immediacy of recognizing the degree to which places of particular familiarity may soon lie underwater performs a neat trick:  whereas a map might be said to bring closer the regions from which one is spatially removed, making present the far-off by allowing one to navigate its spatial disposition in systematic fashion, the opacity of those light blue layers of rising seas obscures and subtracts potentially once-familiar sites of settlement, effectively removing land from one’s ken as it is subtracted from the content of the map, charting land losses as much as allowing their observation.

The result is dependably eerie.  The encroachment of the oceans consequent to rising sea levels proposes a future worthy of disaster films.  But the risks can be viewed in more measured ways in the maps of sea level on the shores of the United States calculated and mapped by Stamen design in the Surging Seas project, which allows us to imagine different scenarios of sea-level rise in actual neighborhoods.  The set of interactive maps, now aptly retitled Mapping Choices, will not only cause us to rethink different scenarios of shifting shorelines by revisiting our favorite low-lying regions, but also allow us to create our own videos of Google Earth flyovers of different areas of the world.  Mapping Choices provides a way to view the risks and vulnerabilities of climate change made particularly graphic in low-lying centers of population, where they testify to the clarity with which web maps can create a vision of imagined experience as we imagine the actual losses that global warming is poised to create.  And although the map has recently been expanded into a global research report, allowing us to examine possible global futures that are otherwise difficult to comprehend–the potential risks posed by the inundation of low-lying inhabited regions by as much as thirty meters–the potential risk of inundation is perhaps most metaphorically powerful for the region one knows best, where the efficacy of a simple side-by-side juxtaposition of alternate potential realities has the unexpected effect of hitting one in the gut:  debates about the possibilities of climate change suddenly gain a specificity that commands a level of attention one can only wonder why one never before confronted as an actual reality.

Alternate Scenarios

Maps are rarely seen as surrogates for observation, and web maps often offer something like a set of directions, or wayfinding tools.  But the predicted scenarios of sea-level rise allow one to grasp local levels of inundation with a specificity that lets risk be seen in terms of actual buildings–block by block–and to wrestle with the risks that climate change portends.  Populations in many regions that lack defenses are definitely also at great risk, but to envision the loss of property and known space seems oddly more affecting in such an iconic map of Manhattan–and somewhat more poetic as an illustration of the fungibility of its hypertrophied real estate and property values.

Of course, the data of Climate Central allows a terrifying view of the future of four degrees centigrade of warming on low-lying Boston and the shores of the Charles, as the city is reduced to the rump of an archipelago–

Boston

or the disastrous scenarios for the populations in the lower lying areas of Jakarta–

Jakarta

or, indeed, in Mumbai–

Mumbai

Viewers are encouraged to imagine the risks of the possible alternate futures of just two degrees with an immediacy that worms into one’s mind.  The possibilities that GPS offers of instantaneous calculations of shoreline position and elevations allow one to view a potential reality where one can focus on individual streets with inspirational urgency.

But such scenarios seem particularly graphic illustrations of risk for those regions where there has been a huge investment of human capital, such as New York City, where it seems credible enough to map that they are poised not to melt into air but to vanish beneath ocean waves.  For if Marx predicted with spirited apocalypticism at the very start of the Communist Manifesto that capitalism would destroy value as it expanded into future markets, as market forces abstracted all things into money–and “all that is solid melts into air”–the twentieth-century expansion of the possibilities of environmental and human destruction has lent the image unprecedented urgency.  While for Marx the melting of inherent value was the product of the capitalist system, that system now bodes a strikingly similar image of sinking into the seas.  For huge expanses of the old industrial city–the piers and the old manufacturing zones, most all of the Jersey shore and the area around Newark, Long Island City and the Gowanus Canal–seem to sink apart from the shoreline in the future New York that Surging Seas creates, in ways that seem the consequence of industrial production and of carbon surging far beyond 400 parts per million (ppm), with the addition of some 2 ppm per year, in ways that make it a challenge to return to the levels deemed healthy–let alone the levels of 275 ppm the planet long held through the mid-eighteenth century.
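The arithmetic behind those concentrations is blunt.  Using only the figures cited above (roughly 400 ppm today, climbing about 2 ppm a year, against the 350 ppm level often cited as healthy and a pre-industrial level near 275 ppm), each further decade of business as usual widens the gap by another 20 ppm or so:

```python
# Back-of-envelope arithmetic with the concentrations cited above.
current_ppm = 400
growth_ppm_per_year = 2
target_ppm = 350
preindustrial_ppm = 275

print(f"excess over the 350 ppm level today: {current_ppm - target_ppm} ppm")
print(f"excess after ten more years at current growth: "
      f"{current_ppm + 10 * growth_ppm_per_year - target_ppm} ppm")
print(f"total rise above the pre-industrial level: {current_ppm - preindustrial_ppm} ppm")
```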

That drought, hurricanes, the disappearance of arctic ice (up to 80% in summertime) and rising sea levels are tied to the growth of greenhouse gasses hints at how closely global capital might be linked to this sinking into the seas; the surpassing of a tipping point of climate change that is the counterpart to melting into air might be viewed, in New York City’s economic geography, as offering a poetic reflection of the migration of capital into the financial centers of the city downtown, away from its piers and areas of industry–

NY:NJ

–although half-hearted joking references to Marxist maxims (or geographers) are hardly the topic of this post, and the island of high finance that would be created in downtown Manhattan would hardly have ever been planned as an island.

Lower Manhattan Island?

What one might someday see as the lopping off of much of lower Manhattan might be far better tied to the runaway markets of a free-trade economy, rather than rational planning, and has no clear correspondence to property values.

lopped off lower Manhattan

Indeed, the prospective loss of those residential parts of the city “where poor people dwell” (as do minorities) is undeniable if one looks at the 2010 American Community Survey, whether regarding the city’s distribution of ethnic groups or its households earning below $30,000, who remain the most vulnerable to the danger of rising ocean levels.


Income under $30,000/American Community Survey (2010)/New York Times

But the disappearance of the Eastern Parkway and the Jersey shore is a blunt reminder of the extreme vulnerability of the built environment that lies close to sea level–

Eastern Parkway and Atlantic Avenue above the seas

–and a not-too-apocalyptic reminder at that, but a mapping of the consequences of the man-made change that goes under the rubric of the anthropocene, most apparent in the increasing quotient of carbon dioxide in the atmosphere and the warming it may bring.  For if sea levels are estimated to have already risen some eight inches since 1880, the unprecedented acceleration of that rate–which will increase the dangers of floods from storms and place many of the some three thousand coastal towns at risk–is likely to continue as sea level rises from two to over seven feet during the new century.

350ppm-chart-300_fixed

The distribution is by no means uniform:  more industrialized countries, like the United States, have produced far more carbon dioxide–although the United States has been overtaken by China since 2007–and sit beneath atmospheres above 380 ppm in the spring, making them more responsible for rising temperatures, even though the lower-lying lands below the equator may be most vulnerable to the consequences of climate change.

Vox – A visual tour of the world’s CO2 emissions

Surging Seas attempts to map the local consequences of these increasing emissions more concretely.

The changes extend, in a dramatic detail, to the Central Park Meer rejoining the East River, with the predicted inundation of much of the posh residential area of Manhattan’s East Side, all the way to Fifth Avenue.

Truncated NJ and absent upper East side

It is difficult not to compare the scenarios sketched in the Surging Seas maps to some of the maps of future islands of New York that Mapbox and others allowed Sarah Levine to create from open data on building heights, after the pioneering maps of Bill Rankin’s 2006 “Building Heights.”  When Rankin remapped Manhattan by taking building height as an indirect index of land value, he saw the island as clustered in distinct islands of elevation above 600 feet:

manhattan-heights

Radical Cartography (2006)

Levine used similar data to chart the fruits of Mammon in buildings above sixty stories.  Maps of skyscrapers set beside the gloom of Surging Seas suggest those towers able to withstand the rising seas brought by global temperatures jumping by just two degrees Centigrade.  One can move from the map of the bulk of the lowest sections of lower Manhattan–

Two Inches in Lower Manhattan

–to Levine’s brilliantly colored carmine mapping of the highest buildings in the Big Apple, above forty-seven or fifty-nine stories, which one imagines might provide the best vantage points rising above the waves, especially when located on the island’s shores.

Mapping NYC by Sarah

Sarah Levine Maps Manhattan

There’s a mashup begging to be made, in which the tallest buildings of over fifty stories at the tip of the island peek up above the cresting waves, and the rump of buildings in lower Manhattan offers contrasting vistas of the city’s contracting shores.  The buildings that create the canyons of urban life, those surpassing sixty stories, might suggest the true islands of Manhattan’s future, as much as the points that punctuate its skyline.
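As a rough sketch of how such a mashup might be assembled, with made-up buildings and columns standing in for what could in practice come from New York’s open data on building footprints, one could simply flood everything whose ground elevation falls below a chosen sea-level rise and keep the towers of fifty stories or more that would still peek above the water:

```python
# A sketch of the imagined mashup, with made-up buildings: flood everything
# whose ground elevation sits below the chosen rise, keep the towers of fifty
# stories or more that would still stand above the water.
import pandas as pd

buildings = pd.DataFrame({
    "name": ["Tower A", "Tower B", "Loft C", "Warehouse D"],
    "floors": [72, 55, 12, 3],
    "ground_elev_ft": [6.0, 9.0, 4.0, 3.0],   # feet above current sea level (invented)
})

sea_level_rise_ft = 7.0   # upper end of the range discussed in this post

flooded_at_grade = buildings["ground_elev_ft"] <= sea_level_rise_ft
tall_enough = buildings["floors"] >= 50

# The imagined layer: inundated at street level, but rising well above the waves.
islands_of_the_future = buildings[flooded_at_grade & tall_enough]
print(islands_of_the_future[["name", "floors", "ground_elev_ft"]])
```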

Sarah's Lower Manhattan

The realization of this possible apocalypse of property made present in these maps offers the ability to visit the impending disasters that await our shorelines and coasts, and to imagine the consuming of property long considered the most valuable on the shore–as rising seas threaten to render the shoreline a wispy trace of the past, lying under some six meters of rising seas.  The prospect of this curtailing of the ecumene, even if it would bring an expansion of our nation’s estuaries, presents an image of the shrinking of the shores that suggests, with the authority of a map, just how far underwater we soon stand to be.

Eastern USA/Surging Seas: sea level rise after 2 degrees centigrade warming

All actual maps, including Levine’s, provide authoritative reporting of accurate measures with a promise of minimal distortion.  But visualizations such as Surging Seas offer frightening windows into a future not yet arrived, using spatial modeling to predict the effects of a rise in sea level of just five feet, and the potentially disastrous scale such a limited change would bring:  the coasts are accurate, but their inundation is a conservative guess, on the lower end of the spectrum of possibilities.  For in a country in which 2.6 million homes sit less than four feet above current sea level, the spectral outlines of chilly blue former coastlines that peek at a future world are still terrifying and seem all too possible, as much as a potential cautionary tale.  The concretization of likely scenarios of climate change reminds us, however much we really don’t want to get there, how potentially destructive a several-degree rise in ocean temperatures would be.


Filed under Climate Change, coastal flooding, data visualization, Global Warming