
Loopy Maps to Rationalize Random Shut-Offs?

As the haze again settles over San Francisco and blankets the Bay Area, fire season has begun again in Northern California. The densely populated Bay Area is ringed by elevated levels of particulate matter from fires based in the Sierra Nevada, which have sent flakes of ash as far as San Francisco, as alerts advise those in Santa Rosa, Livermore, and San Jose who are sensitive to air quality to avoid extended outdoor activities and reduce their time out of doors.

The new strategies of fire containment adopted by PG&E, however, came after the private company used its privileges to declare power shut-offs in ways that did not seem “surgical” at all–leading Bay Area State Senator Jerry Hill to remind the corporation that all safety shut-offs “must be a surgical, last resort measure,” not a knee-jerk method of containment–although what a “surgical” shut-off of electricity would look like was unclear, as it would presuppose an assessment of transmission poles, the clearing of nearby trees, and tools for pinpointing high winds. In the East Bay and in Oakland, we have been looking at fire warnings in public service announcements posted beside park entrances and highways for over ten years;

on October 8, emergency warnings extended a red flag alert of extreme fire danger to the entire Bay Area.

The increased gustiness of winds generalized a sense of extreme fire danger, reminding us of the broad state of fire in a drying state, where scant rainfall has created a rash of fires. Obi Kaufmann has compiled the fearsome image of fires across the state in a rather fierce watercolor map, which belies its medium, and which pictures the spread of fires across the state’s counties more concretely than the best remote sensing allows–and indeed shows how fires have increasingly shaped the topography of the state–raising deep, primeval fears of fire as a plague of contiguous burning regions, as if the state’s entire surface were lit by fires that bled into one another, moving across space in record time.

Obi Kaufmann, “Fire Density”

The number of residents displaced by fires over the last five years in California stands as a human rights crisis, particularly acute among lower-income residents of the state, often displaced by the loss of homes that leaves them without options for relocation. Because most communities in California wait an average of five years for homes to be rebuilt after fires subside, the increasing occurrence of “wild” fires in the state significantly deepens income divides, diminishing the available housing stock across much of the state and increasing the cost of rentals after fires have been extinguished.

The map of the occurrence of fires in the past five years must also be read as a map of displacement, of pressures on public housing, and of the increased salience of a public-private divide afflicting the local economy and all residents of the state–one standing to disrupt the common good more precipitously than property loss can map. The occurrence of fires around the so-called urban-wildlands periphery, where many seeking housing are also pushed–outside of Sacramento, San Francisco, Los Angeles, and San Diego–creates an apocalyptic scenario in which many suburban residential areas are consumed by raging fires, displacing residents who will find it even harder to find a home.

The mosaic of the fires seems an impossible challenge to process or parse. But it suggests the impossibility of separating the range of fires that have occurred in the last five years from the cities where California’s populations are concentrated, and indeed confirms the surreal nature of how a sustained absence of precipitation has ringed the capital with hugely destructive fires. Whatever their individual names, these fires suggest the new landscape that the state’s tightening resources are compelled to address, in both prevention and mitigation, even as the costly nature of controlling an almost year-round fire season creates budgetary costs few have prepared for:

Is Kaufmann’s map the true map underlying the panic at the possibility of new firestorms? It helps us take stock of the dissonance of the maps that PG&E presented of electric shut-offs, and of the contorted syntax of the public explication of those potential shut-offs of electricity. We were warned that these could be the very wind conditions that in 2017 helped spread the North Bay Fires, raising immediate danger signs throughout the Oakland Hills, where memories of the Oakland Firestorm of 1991 were still raw, inscribed in the landscape: the largest suburban conflagration had spread from the Oakland Hills over a weekend, as the Tunnel Fire was fanned by Diablo winds in late October, destroying hillsides of homes and shocking residents in ways that had barely receded from the public imaginary by the time the North Bay Fires were fanned a quarter of a century later. The mosaic of the regions consumed by fire in the past five years in California reveals something of a patchwork quilt.  But the dramatic expanse of the Camp Fire that consumed the city of Paradise, CA suggested a new era of fire regimes, as it flattened the city with terrifying speed, creating refugees of climate change on our own soil as it destroyed 14,000 family dwellings and displaced 50,000, scattering refugees across Yuba City, Chico, Sacramento, and Butte County.  Nearby, the devastating fire was witnessed differently than other wildfires:  the smoke entered our lungs in Berkeley, Davis, and Sacramento, and we watched the sun go red and the skies grey.  While the fire was removed, it was immediate, inhabiting multiple spaces at once, even as it burned across a wooded landscape and consumed huge carbon reserves.

The NWS warned about this danger of fire, which could not have been clearer in anyone’s mind–or to PG&E’s new board of directors, as they were forced to decide upon a plan of action with the growth of Red Flag warnings across much of the northern state:

If the dystopia of Fire Season lay last year in raging firestorms that consumed homes, tract housing, and high-carbon forests, the safety measures adopted after the North Bay Fires of 2017 to curtail corporate responsibility have become a new dystopia of intermittent power supply, immediately inconveniencing the poor, elderly, and infirm, as a temporary safety tactic was extended to thirty-four of fifty-eight counties–over half–as if it were a new strategy of fire prevention or containment.

California’s largest publicly traded utility, Pacific Gas & Electric, faces charges that it is unable to maintain, respond to, or oversee state-wide safety issues beyond its control, even as it has entered bankruptcy. The utility, widely blamed for failing to maintain the safety of its transformers, has just openly recognized that its equipment was the “point of ignition” of the recent tragic fires–exposing itself to $11.5 billion in damages for the most deadly fire in California history, which spread across 150,000 acres, if not $30 billion once earlier fires for which its equipment was blamed are included.

California’s legislature had tried to extend support to the utility company in the wake of previous fires, to spare it from earlier wildfire liabilities; the reluctance to extend any similar security to PG&E has turned to anger in the face of the current destruction, however, reflecting huge discontent with continued failures of forest management and outrage at the scope of the devastation of local neighborhoods and the scale of the destruction of over 18,000 buildings in the Camp Fire. Even as the current scare has retreated, allowing air quality to improve for now around much of the Bay Area,

there are steep charges that PG&E improperly maintains forested areas and trees in proximity to electrical wires–especially dangerous in a parched landscape that has lacked rainfall over multiple years. After the rapid spread of the Camp Fire, and after multiple combustive wildfires scorched the North Bay in October 2017 and killed forty-four, it is hard to single out poor maintenance as the only issue at stake in the fires’ spread. Yet the corporate entity has been blamed for inadequate repairs, poor maintenance, and poor record keeping, as if such bad practices alone left state residents increasingly vulnerable. Even if most dismissed President Trump’s wilder claim that the spread of fires in Northern and Southern California was due to “gross mismanagement of the forests,” the notion of blaming the utility company for its negligence seemed more credible. But building beside grasslands and desiccated forested land has created a new geography of fire and of fires’ spread–revealed in satellite measurements of their fearsome advance.

The liabilities the power company was assessed for the destruction of the recent fires undercut its credibility and increased a sense of its negligence, removed from the extreme weather the state has faced and for which few containment strategies exist. To be sure, the demand to identify a felon seems to have overridden the danger of placing transmission lines near rainless forested areas. The aftermath of the fires–perhaps critically, the time since Paradise–has created something like a nervous breakdown in assessing, monitoring, and mitigating fire danger, as we rush to individuate clear blame for a changing climate and for a lack of response to a decrease in precipitation rarely experienced: we need to identify a clear cause or victim for the massive and persistent disequilibria that undermined the public good and well-being in the apocalyptic fires northern and southern California faced, and for the terrifying alarm before maps of the fires’ destruction and accelerated spread, perhaps without pausing to consider the inter-relations that create a radically new firescape. And the outage maps that PG&E issued to its clients as the gustiness of winds grew suggested a similar remove from an overdried landscape, long lacking rain.

And the maps that were issued to make the case for the outages weren’t that convincing; they seemed quite improvised, even if they did alert customers to the impending danger of power outages as a response to contain future fires. The loopy maps recently issued by PG&E, California’s very own home-grown for-profit energy company, to its customers or “consumers” seemed a weak public wager of confidence, produced after apparent hemming and hawing about how to respond to the forecast of high winds returning to northern California–a region haunted, over recent decades, by fires every autumn. Whether they actually prevented fires–as PG&E insists, leaving us all to sigh with relief–or not, the hodge-podge constellation of overlapping parcels indicating potential electrical shut-offs was disorienting–

–and didn’t create more clarity by drilling deeper down into Bay Area neighborhoods.

Were the maps a warm-up for the difficulties of managing the threat of fire in a new fire season? Our “fire season” now stretches past winter, if it does not last all year, and its length has created such problems of management that PG&E elected to announce last Tuesday evening that power was being preventively turned off to over 800,000 of its customers, affecting what might be upwards of 2.5 million individuals. It did so by exploiting an ability it had gained after inadequately struggling to respond to past fires that had caused damage across the state–and intense loss of homes and property in Northern California after the Camp Fire almost a year ago, in 2018. Facing combined problems of infrastructural management and a climate lacking rain that it was unable to address–management placed in triple jeopardy after it declared bankruptcy as a private corporation–it this time shut the power preemptively as the gusty winds began to blow.

This being California, it was striking that the turquoise rings seemed an electrified version of the iconography of radiating earthquake tremors, and caused a comparable alarm.

The maps were hardly grounds for confidence, born as the board of directors hunkered down in preparation for those dry, warm winds of winter to sweep across Northern California. They were announced as the result of the sudden arrival of “unprecedented fire risk,” even though the highest temperature anomalies recorded in the state had been published online since early summer, as dry autumn winds of increasingly high velocity returned to raise fears across the state, setting off alarms for a new geography of firestorm risk.

For “risk” in this case meant winds, and fire danger was understood as an intersection, as in a Venn diagram, between combustible undergrowth and winds that could carry dangerous embers from downed power lines–the apparent paradigm of recent outbreaks of what the media still calls–with PG&E–“wildfires,” as if they were due to a failure to thin the woods.

This may be the best optic through which to understand the awfully loopy maps that the company released, in an attempt to suggest that this time it really had the situation under control–or at least to dodge the danger of not seeming to have responded to verifiable risk. The responsibility PG&E exercised over the electrical infrastructure of the state had led to its ability to monitor weather privately to determine the urgency of shutting off residents’ power, allowing customers to consult a weather webpage to ensure some sense of transparency–and security–until it crashed under broad demand for answers. The project of educating customers to the possibility of unprecedented outages raised questions of responsibility and agency that many felt the private company had not been delegated the authority to adjudicate.

The prediction of gusts of wind from the Great Basin over areas that still bear traces of firestorms created a broad sense of alarm, however, and with good reason. High gusts of dry air had spread firestorms of huge destruction, crumbling the steel lattice structures of aging transmission towers and sending power lines crashing into dry underbrush in Pulga, where low precipitation, low atmospheric moisture, and sparks led to particularly extreme consequences. Debates about what “good forest management” means have led to concern about the strength and durability of elevated electric wires knocked down by the force of high winds, pushing brush fires that became firestorms over vaster areas than firefighters had been used to managing–or had even recently imagined. It led many to contemplate the stubborn fact that an expanded network of elevated electrical wires, bearing charge, now runs more prominently than ever before across the wildlands-urban interface, with terrifying consequences.

Despite the complexity of expertise required to explain the fires’ spread, and indeed the problems of forest management, the maps that PG&E issued to explain the outages were not clear. Only earlier this year, Judge William Alsup had called PG&E “the single most culpable entity in the mix” within the current “crisis that California faces on these wildfires.” The several attorneys who had sued the company for the huge damages of the Camp and North Bay wildfires listened eagerly, happy that some accountability did seem to be in the air. But as PG&E had let its budget for reducing trees near transmission lines wither, the company seemed to illustrate utter irresponsibility in neglecting “a large number of trees that should have been removed, and that appears to be the single biggest factor in the 2017 and 2018 fires”; Alsup demanded that the energy corporation’s fire mitigation plan follow state law by trimming trees that might contact power lines in high Fire Season winds, which were only expected to grow, and placed the corporation on probation. The limited sensitivity of CEO Bill Johnson to the question of trimming may have had less to do with responsible maintenance than with how his salary was pegged to safety performance.

Judge Alsup had ominously warned the corporation that it would fare badly in the docket come December, when the number of wildfires PG&E had started would be reckoned–he hoped it would be none: such expectations were balanced by the significant bonuses Johnson would gain if he created conditions allowing the stock of the utility company to bounce back to the heights it enjoyed in 2017. The power of PG&E to shut off power temporarily to 500,000 and then 800,000 Californians provoked indignation; the right to announce a “public safety power shut-off” with little warning had been won without its being imagined that it would be used in so aggressive and proactive a fashion as Johnson and his Board seem to have done, raising questions about the current corporate culture of PG&E, which had committed to “only consider proactively turning off power when the benefits of de-energization outweigh potential public safety risks.” Those benefits concerned not only safety, but the dangers of being found negligent in managing the energy infrastructure.

The problems of fire management that this mapping of fires revealed were both serious and difficult to respond to: PG&E asked–or demanded?–the ability to shut off power in high-velocity winds in response. The consequences were not only personal inconveniences. They may well have compromised valuable cancer research on temperature-controlled cultures within a week of conclusion, costing $500K and countless hours of lab work, as controlled experiments were rushed to San Francisco. Even if a warning on October 9 let folks know that the private energy agency “continues to warn of a power outage,” there was nothing like an expectation of such a shut-off, or even plans to deal with the web traffic it generated; the “inconsistent” and at times “incorrect” information on the PG&E website, which continually crashed under the high volume of folks urgently seeking information, led to a broad swath of apologies that nonetheless refused to admit any underlying fault. “I do apologize for the hardship this has caused but I think we made the right call on safety,” said CEO Johnson, who arrived at PG&E only recently from the Tennessee Valley Authority and installed a new Board. Johnson is a proud long-time Grateful Dead fan, but may be more conscious that his salary was tied to safety performance than of California topography or corporate ethics, and expected the state to rapidly accept a declaration of fire emergency.

Perhaps the arrival of winds didn’t inaugurate a “fire season,” however, but revealed a new aspect of the electrical infrastructure for which no clear working process had been devised or imagined. If such winds inaugurated the “fire season” in California, this was now not an issue to be addressed by thinning forests or environmental management alone–the large carbon loads of forests had gained an increased vulnerability to high-charge electric sparks, in ways that had exponentially expanded the vulnerability of the landscape to fire, and the risk of firestorms–

–of the very sort that had generated ever more terrifying images of statewide fire risk around the time of the Camp Fire; winds that created conditions for the fire fanned the flames further, at higher velocities, creating firestorms and the new management challenge of fire whirls, vortices of flame of extreme heat intensity.

Did it all start with dry winds, or with carbon loads, or with live electric lines suspended in what were revealed to be dangerously unstable ways? Heat maps of air quality became seared into northern California residents’ minds.

Mapbox Heat Map of Smoke Emissions of Camp Fire

As high gusts ran through trees that had received far less water and snowpack than in previous years, the deep worries of PG&E administrators–who balanced their oversight of the electrical infrastructure with their public responsibility–were mirrored in the haunted mood of Northern California residents who had seen increasing numbers of firemen stream into the areas near Chico, Sacramento, and the North Bay, or the Sierras and Yosemite, each Fire Season like clockwork in previous years–in ways oddly dissonant with bucolic green surroundings.

This time round, the temperature anomaly alone offered some guide for predicting the eventuality of future fires, but provided few guideposts for managing the situation at hand. The arrival of winds was terrifying, and not only to PG&E. The gustiness of local winds seems a new index that will mediate all future responses to what Daniel Swain called the ongoing firestorm of California, destined to filter our collective reactions to future fires’ spread.

While the maps of outages seem to convey a weird, spectral sort of precision, they suggest an architecture of emergency response that blankets all areas of the now-legendary woodlands-urban periphery with darkness; the attempt to rationalize the impending outages, paralleled with offers to set up cell phone charging stations, was hardly reassuring. Promising lavishly to set up a designated communications network with its clients and customers seemed a bit like not letting the news media into the game: the company preferred texts, emails, and robocalls as a way of staying in touch and broadcasting emergencies to folks it assumed would have fully charged phones–because who lets their phones run low?–though it wasn’t clear communications could continue if outages lasted the full “five days or longer” that had been announced. The company had its own private team of meteorologists, preventing any redundancy from designated communication with the NWS or NOAA. But trust us, PG&E seemed to say–as Sumeet Singh, PG&E vice president of the Community Wildfire Safety Program, put it, “some of our customers may experience a power shutoff even though the weather conditions in their specific location are not extreme.”

The memories of the Camp Fire created a sense of disorientation before the impending arrival of high winds, which triggered memories of massive destruction of property and loss of lives, even as much of the nation was distracted by other media stories, and indeed by greater problems of irresponsibility. With memories of the Camp Fire and the North Bay Fires fixed in our heads–from the toxic air they spread across the city to the streams of refugees who lost their homes–we had only just started to face the first evidence of a new fire season in the smoke from the first fires, which only last year had made Northern California the site of the worst air quality in the world, as we studied maps of firefighters’ progress in containing the deadly Camp Fire. As we tracked the fear of fires growing not only near Los Angeles, but near Chico,

we faced a new beast in the unprecedented power shut-offs that became a possibility for hundreds of thousands of customers of the utility company over the last week–a highly controversial act that enabled power to be preventively cut to regions in California, which we tracked through loopy maps of potential shutoffs, shown in alienating map layers that seemed to glow with radioactivity. The authority the utility company had assumed in public life in California, elevated after a series of mismanagements or mishaps, meant that combating ever-present fire threats now extended to the disruptive dangers posed by power shut-offs, perhaps strategic.

But the maps offered no clear logic to their arrival or creation, though they seemed to follow global curvature, akin to a Bowie net, even though the GPS loops didn’t always overlap and derived from power lines at a local level. The threatened interruptions of electricity seemed issues of global importance, altering access to electricity and shifting many to local generators–even closing the University of California for a few days until wind levels died down.

The large, privately held public utility elected–with winds forecast from Wednesday through Thursday to reach velocities as high as sixty to seventy mph at the taller elevations of the state–to adopt what was called “PG&E’s state-mandated wildfire mitigation plan, which aims to cut down on the ignition of wildfires during high-risk periods”–presumably because it was in bankruptcy, as much as or more than as a form of public protection, at a time when PG&E had its back up against the wall. The widespread “voluntary” withdrawal of power supply was due to a failure of risk management and to the dangers of firestorms that could not be contained. The threat of electrical transmission lines malfunctioning, collapsing, or igniting brush–as had begun the conflagration known as the Camp Fire of 2018, and the Tubbs Fire and the North Bay Fires of 2017–led Berkeley residents to be asked to evacuate the Hills, an area where evacuation would be judged difficult in an actual emergency. For the extensive elderly population dependent on life-sustaining electrical medical equipment–oxygenators; dialysis machines; ventilators–loss of power could pose disastrous health risks. Even the announcement of power loss could trigger panic, sending customers into a tailspin.

Since the potential outage map on the PG&E website first appeared, and before it crashed, significant panic grew, as the need to follow a careful protocol seemed absent from the energy corporation’s plans. The maps revealed, in complex if simple fashion, a blanketing of many areas of the Berkeley Hills–based on what must have been the newly launched website PG&E had devised to help customers prepare for wildfire readiness, and indeed presented to the Berkeley City Council this past July, about the potential for emergency “public safety power shutoffs” and presumably also the need to develop a strategic response, given the problems for residents who relied on electricity for medical devices, cell towers, and city reservoirs, all of which would be thrown into disarray in the eventuality of the “public safety shutoff” that the energy company had recently acquired–or been forced–to adopt. The sort of outage map that PG&E distributed was low on actual information, and rich with apparent looniness and difficulty of reading–

–the image of state-wide potential outages even more generic at so small a scale. The map is more suggestive of a targeting of the sites of past fires created by sparks from wind-driven fallen transmission lines, or by the sudden collapse of aging electrical infrastructure during high winds. All state residents were alerted to update their contact information to receive more local bulletins at pge.com/mywildfirealerts, a dedicated communications infrastructure over which PG&E held all the strings; any possibility of confused communications was thus contained–as was any examination of the protocol for delivering information.



Filed under climate change, fires, Global Warming, PG&E, Wildfire, wildfire risk

Freezing Time, Seaweed, and the Biologic Imaginary

We can lose sight of the central role that seaweed plays in the coastal habitat of Northern California. For while seaweed is often present before our eyes, the difficulty of mapping its submerged forests with any fixity is mirrored by the threatened disappearance of offshore kelp beds in an amazingly rapid timeframe: the challenge of creating an image capture able to register the extent of kelp forests is sadly echoed in the diminishing kelp beds off the California coast.

Mapping has predominantly relied on passive registration of location–onshore sites registered remotely by satellites, as in the harrowing images of the spread of fires. We are reminded of this by maps showing the rapid advance of the burn perimeters of the Yosemite wildfires of 2013, the North Bay Fires, or the disastrous Camp Fire of 2018. The rapid pace of the loss of those forested lands seems eerily echoed in the shrinking of coastal kelp beds along Northern California, and correlates with the advance of warming climes.

If we have developed tools to map the continuity, intensity, and growth of forest fires by satellite and drone, the problem of passively registering the loss of kelp forests, and its relation to the advance of urchin beds, leaves unmapped a part of coastal environments we are in need of mapping. Maps of the destruction of seaweed beds on the California coast are less rooted in real time, but the destruction has advanced in striking fashion over ten years, even if its ravages for now remain undersea. We are far less skilled at communicating kelp’s crucial place in offshore environments.

The nutrient-rich cold waters of coastal California, together with its rocky seafloor, afford a perfect environment for the lush kelp forests that extend up into British Columbia and Alaska. But as waters warm with astounding rapidity, we need to ensure kelp beds are mapped, although many are off the map and difficult to register, even as their size has come to be threatened by global warming and climate change, in ways that eerily parallel the loss of, and threats to, irreplaceable forested environments. While the decline of seaweed is not linked to warming waters directly, the shifting ecosystems that climate change has created have caused a drastic and rapid decline in seaweed’s offshore presence that we have yet to fully map. For the passive registration of kelp beds, traced by GPS or captured through aerial photography, is far less hands-on than examination of their extent would warrant, and low-flying remote-sensing satellites offer few possibilities for accurate mapping of their extent. The rapidity of the disappearance of kelp–beds whose boundaries are shifting in time with a rapidity approaching the advance of forest fires, on the scale of their destruction–poses problems of global dimensions, pointing to the immediacy of the loss of kelp: over 13,000 species whose biodiversity and photosynthetic creation of oxygen feed much of the world and drive our own ecosystem.

Paul Horn, Inside Climate News/Source Wernberg and Staub,
Explaining Ocean Warming (IUCN Report, 2016)

Even a map of globally threatened areas cannot properly emphasize the extent to which the Pacific coastline provides a site of cold waters, ocean upwelling that supplies rich mineral nutrients, and sunlight that makes it an especially abundant site of kelp forests–a true megaregion of coastal ecology whose catastrophic loss is impossible to imagine. Even a map of kelp’s local abundance fails to convey its ecosystemic centrality in adequate ways–ways that the diversity of kelp speciation also fails to capture, despite its clear scientific value for surveying the ocean populations most at risk.

The Nature Conservancy

Miller, Lafferty, Lamy, Kui, Rassweller and Reid (2018)/Royal Society Publishing

The particular vulnerability of the kelp biomass seems to have grown in unexpected ways, due not only to climate warming but to kelp’s vulnerability to ocean-floor sessile predators like purple sea urchins–which, it is now believed, are more likely to eat phytoplankton and microalgae in the kelp understory than kelp itself. Whatever the role of urchins in diminishing kelp forests, which themselves feed exclusively on sunlight, the combination of a lack of upwellings due to climate change, which diminished the urchins’ food supply, and the inhospitable nature of warming waters to kelp forests may increase the vulnerability of kelp in coastal oceans.



Filed under climate change, climate emergency, data visualization, Global Warming, seaweed

Saturated Shores in Southeastern Texas

There is almost no trace of the human, or of the extreme overurbanization of the Texas coast, in most of the maps that were created of the extreme flooding and intense rains that hit Galveston and Houston, TX with the landfall of Hurricane Harvey.  While maps serve to orient humans to the world–and orient us to human processes and events in a “human world,” as J.B. Harley and David Woodward put it–the confused relation between the human and natural worlds is increasingly in danger of being mismapped.  Data visualizations of extreme weather that erase the modification of coastal environments provide a particularly challenging means of orientation, as news maps are suspended between registering the shock of actual events–and trying to contain the natural emergencies that extreme weather creates–and the demand for graphics that register natural calamities; the ethics of showing such calamities as “natural”–or even the category of the natural itself–grows unclear in coastal regions so heavily modified as to alter actual weather events.

Part of the huge difficulty in the ethics of orienting viewers to the rainfall levels that fell in Houston after the landfall of Hurricane Harvey lies in adequately registering a changing natural world: whether we are mapping rainfall, for example, or the approach of hurricanes, or rather mapping the new relation of rain to built surfaces and land cover that lack permeability for water, facilitating flooding by storms whose potency is magnified by the greater atmospheric moisture of a warming Gulf of Mexico–moisture that the ground cover of Houston, Galveston, and the Texas shore is less and less able to absorb and return to the Gulf. The area is itself something of an epicenter of the increased number of hemispheric tropical cyclones–which demand warm water temperatures above 80°F (27°C), a cooling atmosphere, and low wind shear–so often led to the Gulf coast.
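The formation conditions named above can be made concrete in a few lines. This is a hypothetical sketch, not an operational forecast rule: it simply encodes the two thresholds the paragraph cites (sea-surface temperature above roughly 27°C/80°F, and low vertical wind shear, with the shear cutoff an assumed illustrative value).

```python
# Sketch of the cyclone-formation thresholds cited above.
# SST_THRESHOLD_C reflects the ~80°F / 27°C figure in the text;
# MAX_SHEAR_KT is an assumed, illustrative low-shear cutoff.

SST_THRESHOLD_C = 27.0
MAX_SHEAR_KT = 20.0

def cyclone_favorable(sst_c: float, wind_shear_kt: float) -> bool:
    """Return True when both thermodynamic conditions for
    tropical cyclone formation are met."""
    return sst_c > SST_THRESHOLD_C and wind_shear_kt < MAX_SHEAR_KT

# A warming Gulf pushes more observations past the SST threshold:
print(cyclone_favorable(28.5, 10.0))  # warm water, low shear -> True
print(cyclone_favorable(26.0, 10.0))  # water too cool -> False
```

The point of the sketch is only that warming waters move one of the two dials: as Gulf sea-surface temperatures rise, the first condition is satisfied more often, even while shear remains the limiting factor.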

NASA Earth Observatory/Tropical Cyclones through 2006

–those that come ashore at Galveston hit a seashore eminently unprepared to accommodate an influx of water, its paved surfaces having rendered the ground all but impermeable. If the problem of global cyclones that can become hurricanes is truly global–

NASA Earth Observatory/150 years of Tropical Cyclones

–the intersection between cyclones and paved ground cover is especially problematic for the Gulf states, above all Texas, Louisiana, and Florida, where water absorption has been anthropogenically reduced in recent decades. At the same time, few other areas of the inhabited world are so widely “tracked” as destinations of tropical cyclone formation.

NWS JetStream Online School

The problem is partly evident in the choice of new color ramps that transcend the rainbow spectrum for measuring the intensity of rainfall after the recent landfall of Hurricane Harvey–a choice that condenses the great difficulty of using old cartographical categories and conventions to capture or communicate increasingly extreme weather in an era of climate change.  But the cartographic problem goes farther:  it lies in the difficulty of registering the changed relation of how fallen rain meets the ground–of mapping the relations between the complex processes of atmospheric warming, which lead to greater humidity across the Gulf region, and the ground cover permeability that leaves regions increasingly exposed to flooding.
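What it means for a rainfall scale to “run out of colors” can be sketched concretely. The bin edges and color names below are hypothetical, not the actual NWS or AccuWeather palette: the point is only that a ramp is a lookup from totals to swatches, and totals beyond the old top edge require appending new bins–exactly the extension debated when Harvey’s totals exceeded existing ramps.

```python
from bisect import bisect_right

# Hypothetical rainfall color ramp: edges in inches of total rainfall,
# with two colors ("magenta", "lavender") appended past an old
# 20-inch top of scale. Values and names are illustrative only.
EDGES = [0.1, 1, 2, 4, 8, 12, 20, 30, 40]
COLORS = ["white", "lightblue", "blue", "green", "yellow",
          "orange", "red", "magenta", "purple", "lavender"]

def rainfall_color(total_in: float) -> str:
    """Map a rainfall total to its swatch; totals past the last
    edge land in the final, newly appended bin."""
    return COLORS[bisect_right(EDGES, total_in)]

print(rainfall_color(15))  # a heavy but on-scale total -> "red"
print(rainfall_color(50))  # a Harvey-scale total needs the new top bin
```

The design question the paragraph raises is visible in the lookup itself: whatever color closes the list becomes the visual ceiling of the imaginable, and every storm that exceeds it forces the ramp to be rebuilt.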

The relentless logic of data visualizations deriving primarily from remote sensing is striking for rendering less of a human world than the threat of allegedly “natural” processes to that world.  Perhaps because of the recent season of extreme weather we have experienced, weather maps may be among the most widely consulted visualizations in our over-mediated world, if they were not already viewed as essential forms of orientation.  But the pointillist logic of weather maps may fail to orient us well to extreme events: maps of a hurricane that dumped a huge amount of water on overbuilt areas that omit the human–or the human world–seem a tacit denial of the role of humans in the complex phenomena of global warming that, with the warming waters of the Gulf of Mexico and ever-increasing ozone over much of the overbuilt southeastern Texas shore, created a perfect storm for its arrival.

This failure haunts the limited content of the weather map; including the role of humans in maps of extreme weather events indeed remains among the most important challenges of weather maps and data visualization, if they are to connect with the human experience of the disasters we still call natural.  And although the subject is daunting, in the spirit of this blog, we will look both at the communicative dilemmas of eye-catching color ramps and their deceptiveness, and at the poetic difficulties of orienting oneself to shores.  For depressing as the disaster of Harvey is, it compels questions about our orientation to the shifting shore, around the national epicenter of Galveston, where the landfall of Hurricane Harvey focussed our attention on August 27, 2017–


–and about the meaning of place on a saturated shoreline, where the sea is somehow part of the land, and the land-sea divide blurs with a specificity that may well become increasingly common in an approaching era of climate change.  And as we depend on the ready generation of maps based on remote sensing, whose relentless logic is based on points, we risk losing sight of the role of place in determining the relations of rainfall to shoreline: maps of coastal flooding that remove remote observations from the built environment that flooding so drastically changes, challenges, and affects may elide the specificities of place.

At a time when we have, and are destined to have, increasing problems orienting ourselves to our shores through digital maps of rainfall, the unclear shorelines of Galveston sent me back to how a poet of an earlier age took her bearings on the mapped shorelines of the place where she had been born–struck by a bathymetric map as a way to gauge her personal relation to place, she saw place in how the changing shorelines of the northern Atlantic were mapped in the Maritimes, in a retrograde form of print mapping in a time of war.  The mapped shore became the means by which Elizabeth Bishop gained her bearings, a printed map of coastal bathymetry giving access to the spatiality of the shore–how “land lies in water,” the blurred relation of land and water that the bathymetric map charts–in an age when the materiality of the map was changing, as the rise of aerial composite maps from the early 1930s removed the hand of the mapmaker from the map in an early instance of remote sensing–


Cartography Associates/David Rumsey Historical Map Collection: Composite of 164 Aerial Views of San Francisco by Harrison Ryker/Oakland, 1938, 1:2000

–in a medium of aerial photography that focussed on land to the exclusion of water, and that all but erased the relation between water and shore just a few years after Bishop quickly wrote her poem, at Christmas 1935, about the coastal “edges” of land and sea.  Ryker, who developed techniques of aerial photography used in mapping the shores of Puerto Rico for the Fairchild Aerial Camera Company, as well as photographing the devastating Berkeley Fire of 1923, went into business in 1938–the year of his map–as a map publisher, holding a patent for the stereoscope used to interpret aerial imagery.  He must have performed the massively detailed mapping of San Francisco, in one hundred and sixty-four images taken from airplanes in 1937-38, as a sort of calling card, before manufacturing a range of pocket and desktop stereoscopes for military ends that were widely used in World War II by the US Army.

Before war broke out, but in ways that anticipated the coming war, the printed bathymetric map must have resonated as a reflection on the impersonality of the aerial view; Bishop was suddenly struck by the materiality of a print map–the drawn map of the Atlantic under glass–as the art of cartography was already changing, and responded to it as a way to recuperate the personal impact of place.  Her poem powerfully examined a logic of drawn maps utterly absent from the digitized space of rainfall maps of a flood plain, which derive from data at the cost of the human inhabitation of place, and which, in envisioning data to come to terms with the catastrophic event of flooding, distance or remove the craft of mapmaking from the observer in dangerously deceptive ways.  And so we turn first to the problems of cartographic representation by remote sensing, while recognizing the value of these readily produced maps of rainfall and of the disasters they chart.

1.  Weather maps are among the most misleading points from which to orient oneself to global warming and climate change, as they privilege the individual moment, removed from a broader context of long-term change or the human alteration of landscape.  They provide endless fascination by synthesizing an encapsulated view of weather conditions, but they are a confounding medium for orienting audiences to long-term change, or to the cascading relations of the complex phenomenon of climate change and our relation to the environment: they privilege a moment in isolation from any broader context, and a sense of nature removed from either landscape modification or human intervention, in an area where atmospheric warming has shifted sea-surface temperatures.  The effects on the coast are presented in data visualizations that trace the hurricane’s “impact” as if its arrival were isolated from external events, and from the effects of human habitation on the coast.  Extreme flooding is recorded as a layer atop a map, removing its catastrophic effects from the overpaved land of the megacities of southeastern Texas and from the rapid paving over of the local land cover of its shores.



Such visualizations preserve a clear line between land and sea, but treat the arrival of the rains on land as isolated from the warming Gulf.  Consuming such events of global warming in color-spectrum maps that translate rainfall data into somewhat goofy designs represents a deep alienation from the environment, distancing viewers in dangerous ways from the very complexity of global warming that the Gulf coast states encountered.

Such data visualizations seem dangerously removed from any notion of how we have changed our own environment: they describe a “nature” that is immediately legible, as if removed from any human trace or any impact of the modification of the land, and they image events in isolation from one another–often showing a background in terrain view as if it had no relation to the events the map describes.  Although weather maps and AccuWeather forecasts are sources of continual fascination, and indeed orientation, they are also among the most confounding–and perhaps most compromised–media for orienting viewers to the world’s rapidly changing climate.  For they imply a remove of the viewer from space, and from the man-made nature of the environment, or the effects of human activity on the weather systems whose changes we increasingly register.  By reifying weather data as a record of an actuality removed from human presence at one place in time, they present a status quo from which it is necessary to peel off layers, to excavate a deeper dynamic, and indeed to excavate the effects of human presence in the landscape or geography the map shows.  We are drawn to tracking and interpreting visualizations of data from satellite feeds in such weather maps–what is known as “remote sensing”–placed at an increased remove from the human habitation of a region, in a dangerously disembodied manner.

Visualizations resulting from remote observation demand to be taken as a starting point, to be related to the human remaking of a region’s landscape that has often left many sites increasingly vulnerable to climate change.  But the abstract rendering of their data in isolation from a global picture–or from on-the-ground knowledge of place–may render them critically incomplete.  The remove of such maps may even suggest a deep alienation from the environment, so removed is the content of the data visualization from human presence, and perhaps from any sense of the ability to change weather-related events, or to perceive the devastating nature of their effects on human inhabitants:   their stories are about weather, removed from human lives, creating realities that gain their own identity in images, separate from a man-made world, at a time when weather increasingly intersects with and is changed by human presence.  Yet they throw into relief the areas hit by flooding near the southeastern Texas shore at multiple scales, based on highly accurate geospatial data, much of which can be put to useful humanitarian ends–


Dartmouth Flood Observatory/University of Colorado at Boulder, August 29, 2017


Maps of the World

–but the reduction of floods to data points creates a distorted image of space that renders their occurrence distant from the perspective of people on the ground, and places their content at a considerable remove from the complex causality of a warming Gulf of Mexico, or from the problems of flood drainage that beset Galveston and Houston.  Indeed, the multiple images that report rainfall as a rainbow-spectrum overlay, at a remove from the reasons for Houston’s vulnerability to flooding and the limits the region faces in flood control, and the broadcast AccuWeather images of total rainfall in inches, advance a metric that conceals the cognitive remove from the dangers of flooding, or from a human relation to the landscape the hurricane so catastrophically affected.  Can we peel back the layers of the data visualization, and create better images that appreciate the human level on which the landscape stands to be devastated by hurricane rains, as much as track the growth of rainfall intensity over time?


AccuWeather, Rainfall levels by Thursday


AccuWeather, Friday to Monday

Such layers of green, meant to suggest the intensity of rainfall over land, reveal the concentration of water in areas closest to the Gulf of Mexico.  Yet even the most precise geographical records chart the dangers of flooding in the floodplain of southeastern Texas with little reference to the historical modification of the region by its inhabitants–


Dartmouth Flood Observatory at University of Colorado, Boulder/August 29, 2017

–and conceal the extent to which the landscape’s limited ground cover permeability has left the region far more susceptible to flooding, and elevated the risks of the emergency.  The problem of reading any signs of human presence into these “images” of precipitation provokes the problem of disentangling remote sensing data from knowledge of the region’s recent urban growth and the consequent shift in local land cover.

Our relation to these events is often fleeting and existential, as they flood us with data that we viewers have few tools and little perspective to process fully.  The onrush of recent remote sensing maps batters us with such an array of data as to lead many to throw up their hands at their coherence.  We are still trying to calculate the intensity of damages in Puerto Rico–where electricity is returning so slowly that, even after four months, almost a third of its 1.5 million electricity customers still lack power–and the cost of fires in southern California.  We look at maps, hoping to piece together evidence of the extensive collateral damage of global warming.  Yet we have still to come to terms with the intensity of the rainstorms that hit southeastern Texas–deluging the coast with rainfall that surpassed the standard meteorological chromatic scale, which so misleadingly seems to offer a transparent record of the catastrophe but omits and masks the experiences of people on the ground, substituting swaths of remotely sensed data for their perception and experience, and offering little critical perspective on the hurricane’s origin.

The rapidity with which rain overwhelmed ground cover permeability poses a challenge for mapping as both a symptom of global warming and of landscape modification:   mapping “natural” levels of rainfall blurs the pressing problem of how shifting land cover has created an impermeability to heightened rains, and of how new patterns of habitation challenge the ability of the coast of the Gulf of Mexico to absorb the prospect of increased rain in the face of decreasing ground cover permeability and an extreme modification of the coastline that increasingly impedes run-off to the Gulf.
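The claim that paving converts rain into flood can be put in rough quantitative terms with the standard rational method of hydrology, Q = C·i·A, where the runoff coefficient C rises toward 1.0 as ground is paved. This is a back-of-envelope sketch: the area, intensity, and coefficients below are illustrative textbook-range assumptions, not measurements of the Houston floodplain.

```python
# Rational-method sketch (Q = C * i * A): peak discharge in cubic feet
# per second, with i in inches/hour and A in acres (the unit-conversion
# constant is ~1.008 and customarily dropped).
def peak_runoff_cfs(c: float, intensity_in_hr: float, area_acres: float) -> float:
    """Peak runoff; the coefficient c is the fraction of rain
    that runs off rather than soaking in."""
    return c * intensity_in_hr * area_acres

AREA = 1000.0       # acres; hypothetical drainage basin
INTENSITY = 3.0     # in/hr; an extreme-rain figure assumed for illustration

meadow = peak_runoff_cfs(0.2, INTENSITY, AREA)  # unpaved ground: C ~ 0.2
paved = peak_runoff_cfs(0.9, INTENSITY, AREA)   # paved ground:   C ~ 0.9
print(meadow, paved)  # the same storm yields roughly 4.5x the peak flow
```

The arithmetic is the point: nothing about the storm changes between the two lines, only the ground beneath it, yet the peak flow that drainage must carry multiplies several-fold.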

2.  Across much of southeastern Texas, a region whose growth was fed by hopes of employment in extractive industries, real estate demand and over-paving have unfortunately intersected with extreme weather in ways that data visualizations have had trouble exposing, but which raise a curtain on the coming crises of a failure to accommodate increased levels of rainfall.  If the lack of precedent for the intense rainfall in Galveston Bay generated debate about introducing a new color beyond the rainbow scale employed in weather charts, what seemed a problem of the cartographic color-spectrum suggested a problem of governability, and indeed of government response to extreme weather.  How to register the dangers of rainfall that goes off the scale of standard measurement?

One must consider how to orient viewers to the intensity and consequences of the flooding, and how to better prepare for the arrival of deluging rains, without falling back on the over-freighted metaphor of rains of biblical scope.  How many more hurricanes of increasing intensity can pound the shores, whipping precipitation from increasingly warm waters and humid air?  The cumulative pounding of tropical cyclones in the Gulf stands to create a significantly larger proportion of lands lying underwater–temporarily submerged lands–with radically reduced possibilities of drainage, as hurricanes carry increased amounts of water evaporated from the humid air of the warming Gulf across its increasingly overbuilt shores.  The many tropical cyclones that have crossed the land-sea threshold since NOAA began tracking their transit in 1851 pose a new kind of threat to the southeastern coast of Texas, and will force us to map the shifting relation between land and water not only in terms of the arrival of hurricanes, or cyclonic storms–

–but in terms of the ability of an increasingly overbuilt landscape to lie underwater, as the quantity of Gulf coast rainfall stands to grow and overwhelm the coast.

Most maps that chart the arrival and impact of hurricanes seem a form of climate denial as much as an accounting of climate change, locating hurricanes as aggressive forces outside the climate, against a staid backdrop of blue seas, as if they were the disconnect.  Months after the hurricane season ended, the damage the hurricanes caused has hardly been assessed; in what has been one of the costliest years of storm damage in the United States since 1980–including the year of Hurricane Katrina–we have only begun to sense the damage extreme weather stands to bring to the national infrastructure.  The costs of storm damage in previous years were not even close.

But distracted by the immediacy of data visualizations, and impressed by the urgency of the immediate, we risk becoming unable to synthesize the broader patterns of increased sea-surface temperatures and hurricane generation, or the relations between extremely destructive weather events–overwhelmed by the destruction of each, and distracted from raising questions about the extremely poor preparation of most overbuilt regions for their arrival, and indeed about the extent to which regional over-building that took no account of the possibility of extreme weather–paving large areas without adequate drainage structures or any areas of arable land–left inhabitants more vulnerable to intense rains.  For in expanding the image of the city without bounds, elasticity, or margins for sea-level rise, the increasingly brittle cityscapes of Galveston and much of the southeastern Texas shoreline were left incredibly unprepared for the arrival of hurricanes or intense rains.  Despite the buzz about the increased density of hurricanes that have hit the region,


the questions of how to absorb the hurricanes of the future, and the increased probability of rainfall from hurricanes in the Gulf of Mexico and on its shores, raise questions of risk, danger, and preparation that we have no ability to map.  What, indeed, occurs when hurricanes themselves destroy the very means of transmitting on-the-ground information and sensing weather, and we come to rely exclusively on remote sensing?

Destroyed satellite dishes after Hurricane Maria hit Humacao, Puerto Rico  REUTERS/Alvin Baez

To characterize or bracket these phenomena as “natural” is, of course, to overlook the complex interaction between extreme weather patterns and our increasingly overbuilt environments, which have transformed the nature of the southeastern Texas coast, made the region an area of huge economic growth, paved over much of the floodplain, and elevated the potential risks associated with coastal flooding on the Gulf Coast.  To be sure, any discussion of the Gulf of Mexico must begin from the increasingly unclear nature of much of our infrastructure across land and sea, evident in the range of gas and oil pipelines that snake along a once more clearly defined shore; charted by ProPublica in 2012, they reveal the scope of a manmade environment that has both changed the relation of coastal communities to the Gulf of Mexico and been a huge spur to ground cover change.

The expansive armature of lines that snake from the region across the nation–


ProPublica, Pipeline Safety Tracker/Hazardous liquid pipelines are noted in red; gas in blue

–and whose tangle of oil pipelines, extending from the very site of Galveston to the Louisiana coast, is almost unable to be defined as “offshore” save as a fiction, so highly constructed is much of the national waters in the submerged lands of the Gulf of Mexico–


ProPublica, Pipeline Safety Tracker/Hazardous liquid pipelines are noted in red

They indeed seem something of an extension of the land, and a redefinition of the shore, and they reveal a huge investment by the offshore extractive industries that stands to change much of the risk that hurricanes pose to the region, as well as the complex relation of our energy industries to the warming seas.  Yet weather maps, ostensibly made for the public good, rarely reveal the overbuilt nature of these submerged lands or of the Gulf’s waters.

Despite the dangers posed by such an extensive network of hazardous liquid lines along the Gulf of Mexico, the confusion between mapping a defined line between land and water and visualizing the relation of extreme weather disturbances, such as hurricanes in the Gulf of Mexico, to local infrastructure haunts the extremely thin nature of the data visualizations generated about the dangers of hurricanes and their landfall in the region.  All too often, they presume a stable land/sea divide, removed from the experience of the region’s inhabitants and from how we have remade the shore.

3.  How can we better integrate both a human perspective on weather changes and the role of human-caused conditions in maps of extreme weather?  How can we go beneath the data visualizations of record-breaking rainfall to map the human impact of such storms?  How could we better chart the infrastructural stresses, and the extent to which we are ill-prepared for extreme weather systems whose impact multiplies because of the increased impermeability of the land, unable to absorb excessive rainfall, and because of beds of lakes and reservoirs that cannot accommodate the increased accumulation of rainfall that stands to become the new normal?  The current spate of news maps that provoke panic by visualizing the extremes of individual cases may only inspire a sort of data-vis-induced ADD, distracting from the infrastructural inadequacies to the effects of global warming–and leaving us at a loss to guarantee the best structures of governability and environmental readiness.

Indeed, the failure to accurately map the impact of, and relations between, land cover, storm intensity, rainfall, flooding, and drainage abilities increases the dangers of a lack of good governance.  There should be no need for a reminder of how quickly inadequate mapping of coastal disasters turns into an emblem of bad governance.  There is the danger that, overwhelmed by our existential relation to each storm, we fail to put them together with one another; compelled to follow patterns of extreme weather, we risk being distracted not only from the costs but from the human-generated nature of such shifts between extremes of hot and cold.  For as we focus on each event, we fail to integrate a more persuasive image of how rising temperatures stand to create an ever-shifting relation between water and land.

Provoked by the rhetoric of emergency, we may need to learn to distance ourselves from the aerial views that synthesize intense precipitation, tally hurricane impacts, or chart snowfall levels–to view them less as individual “strikes” or events, and to orient ourselves to a broader picture that puts us in a less existential relation to extreme weather.


The Weather Channel

We surely need distance to process syntheses of data in staggering aerial views of cloud swirl, intense precipitation, and snowfall, and to peel back their striking colors and bright rainbow spectra–to focus not only on their human costs, or their costs in human life, but on their relation to a warming planet and the role of extreme weather in a rapidly changing global climate, as much as we track the “direct strikes” of individually named hurricanes, as if they were marauders of our shores.  Their creation is tied to the changing nature of our shores and warming sea-surface temperatures, and in trying to create a striking visualization, we deprive ourselves of detecting the broader patterns that offer better purchase on weather changes.


The Weather Channel

If the patterns of weather maps epitomized by AccuWeather forecasts and projections suggest an exhilaratingly Apollonian view of global and regional weather patterns, they also shift attention from a broader human perspective in deeply pernicious ways.  Such maps provided the only format for grasping what happened as the hurricane made landfall, but little sense of the scale of the inundations that shifted, blurred, and threatened the coast of the Gulf of Mexico.  They provide a format for viewing floods disjoined from victims, and seem to naturalize the quite unnatural occurrence of extreme weather systems.  Given the huge interest in grasping the transformation of Hurricane Harvey from a tropical storm to a Category Four hurricane, and the huge impact a spate of Category Four hurricanes has had in the Gulf of Mexico, it is no surprise that the adequacy of the maps of Hurricane Harvey has been interrogated, as if they were hieroglyphs or runes of a huge weather change:  we sift through them for a human story that is often left opaque behind bright neon overlays, whose intensity offers only an inkling of a personal perspective on the space or scale of destruction on the ground.  While data maps provide a snapshot of the intensity of rain levels or wind strength at specific sites, it is difficult but important to remember that their concentration on sites provides a limited picture of causation or complexity.

All too often, such maps fail to offer an adequately coherent image of disasters and their consequences, or indeed to parse the human contributions to their occurrence.  This post falls into multiple subsections.  The first sections suggest the problems of mapping hurricanes in the Gulf of Mexico in relation to flooding, in data visualizations of the weather and of the overbuilt region; the middle of the post turns to an earlier poetic model for considering the relation between land and sea that visualizations all too easily obscure, and the meaning that the poet Elizabeth Bishop found in viewing relations between land and sea in a printed map of the Atlantic; after returning to how the overbuilt shore compounds the problems of visualizing the Texas coast, the final section, perhaps the most provocative, returns to Bishop’s reading of a map of the Atlantic coast.

What such new weather maps would look like is a huge concern.  As we depend on weather maps to place ourselves within the inter-relations of climate change, sea level, surface temperatures, and rain, weather maps may cease to orient us to place; but, when best constructed, they can describe the changing texture of weather patterns in ways that familiarize us not only with weather conditions but with needed responses to climate change.  Three months after the hurricanes of the Gulf of Mexico caused such destruction and panic on the ground, it is striking not only that few funds have arrived to cover the costs of rebuilding or insurance claims, but that judgement of the chances for future flooding has almost left our radar–pushed aside, perhaps rightly, by the firestorms of northern and southern California, but in ways that troublingly fail to assess the extent of floods and groundwater impermeability along the Texas and Louisiana coast.  Preparation for future coastal hurricanes off the Gulf of Mexico raises problems of hurricane control and disaster response that seem linked to problems of mapping their arrival–and of framing the response to the increasing rains dumped along the entire Gulf Coast.

Indeed, the chromatic foregrounding of place in such rainbow color ramps based on GPS obscures other maps.   Satellite data of rainfall are removed from local conditions, and serve to erase complex relations between land and water, and the experience of flooding on the ground, by suggesting a clear border between land and sea–indeed mapping the Gulf of Mexico as a surface unrelated to the increased flooding around Houston in maps prepared from satellite imagery, despite the uneasy echoes of anthropogenic causes in the arrival of ten hurricanes in ten weeks, which suggest how warming waters contributed to the extreme inundation of the Gulf Coast.  Despite NOAA predictions of a 45% likelihood of “above-normal” activity for the 2017 Atlantic hurricane season, with a 70% likelihood of storms that could transform into hurricanes, the images of inundated lands seem both apocalyptic and carefully removed from the anthropogenic changes, to either ocean or land, that so dramatically intensified their occurrence on the ground.

Dartmouth Flood Observatory, map of Harvey flooding (August 29, 2017)

Is it possible to recuperate the loss of individual experience in such data maps, or at least to acknowledge their limitations as records of the complexity of a changing climate and of the consequences of more frequent storm surges and inundations of rainfall?  As we sought to follow disaster relief efforts through real-time maps of the effects of Hurricane Harvey as it moved inland from the Gulf of Mexico, shifting from tropical storm to Category 4 hurricane, and tried to grasp the levels of rainfall spun out of 115-mile-an-hour winds across southeastern Texas that damaged crops, flooded fields, ruined houses, and submerged cars, we scanned stories in hope of clues to assess our position in relation to the increasingly dangerous weather systems they may well forebode.  At a time of increased attention to extreme weather, the gross negligence of climate change denial is increasingly evident:  it recalls the earlier denial of any relation between hurricanes and climate change, when increased hurricanes were cast as “the cycle of nature” rather than as consequences whose effects have in fact been broadly intensified by human activity.

Current attempts to map the toll of record-smashing hurricanes, focused almost exclusively on point-based data, view rainstorms largely as land-based records; even as they monitor the effects of Harvey’s landfall by microwave sensors, they risk isolating real-time rainfall levels from the mechanics of warmer air and sea-surface temperatures that result from human-caused global warming, never relating increased storm surges or inundations to changes in coastal environments or to climate change.  To render such changes as natural–or only land-based–is irresponsible in an age of reckless climate denial.  Faced with the proliferation of data visualizations, part of the journalistic quandary is how to integrate humanistic or individual perspectives on the arrival of storms.  The increasingly curtailed ecosystems of newsrooms favor simplified visualizations of satellite data that fail to note the human contributions to travails often reserved for photographs–images that afford opportunities for disaster tourism in the news and emphasize the spectator’s position before disasters, underscoring the difficulties of processing or interpreting the proliferation of data from MODIS satellite feeds:  we can measure the arrival of torrential rains, but we offer few legends, save the date and scale, and few keys to interpret the scale of the disaster.

The looming portent of human-made climate change, however, underlies the poor predictions NOAA offered of perhaps two to four major hurricanes this spring; and the lack of a new director for NOAA–on which local and state agencies depend to monitor the nation’s shores and fisheries–from June to September left states on their own to make decisions and to plan disaster mitigation programs and better flood maps.  (The danger of the newly nominated director, Barry Myers, a strong supporter of the privatization of weather maps and an executive at the private AccuWeather mapping service, suggests the difficulty of locating the public-private divide in an era of neoliberalism, and a free market in weather maps once seen as central to national security and standards of safety.)   These opaque maps of global warming, globalization, and local inundation are read on hidden scales, and are frustrating on each.   For all the precision and data richness of such point-maps of largely land-based rainfall, local temperature, or flooding, the biases of such instantaneous measurements seem to fit our current governing atmosphere of climate change denial, and are dangerous in erasing how such storms are informed by the long-term consequences of man-made climate change.  (As the mapping tools of coastal weather seem destined to change, what sort of change in direction for NOAA coastal maps do we want?  The appointment suggests the terrifying possibility of a return to the Bush-era proposal Myers supported, which would have prohibited the agency from producing any maps already available in the private sector, threatened to let federal weather lines go dark–lest they literally compete with the ad-supported websites of private providers–and shifted federal information offline.)

The future readability of weather maps may well be at stake in critically important ways.  The 2005 proposal that Myers backed would have gutted the National Weather Service–even while exempting those forecasts needed to preserve “life and property”–in essence returning the weather services to a pre-internet era, even as the most active hurricane season on record, with fifteen hurricanes and twenty-eight storms, began in the Gulf Coast, including the infamous Hurricane Katrina.  The proposed bill would have prevented NOAA from posting open data, readily available to researchers and policymakers in ad-free formats, free of popup screens, and usable for making their own maps on the fly.  Ending the good practice of posting climate data would work quite dangerously to prevent the development of tools of data visualization outside commercial models that render storms and hurricanes as if environmentally isolated.

2005 Atlantic hurricane season: storm tracks and direct strikes

A deeper problem with such limited weather maps of tropical storms may be the subtexts they convey about the relation of human causes to weather, and the absence of a greater narrative of the transformation of a global ecology, or of the ecology of the Gulf Coast.  The curtailed images of “nature” they present–symbolizing rains, winds, floods, or submerged regions in appealing hues, as if natural–raise questions about the odd simplicity of their absent storylines:  cheery colors erase or bracket complex questions of climate change, the human contribution to extreme weather events, and the human experience of suffering on the ground.  Rita, Cindy, Katrina, Dennis, and Wilma seem not part of the environment but epiphenomenal interlopers moving across a static deep blue sea, in an apparent dumbing down of the mechanics of hurricane or storm formation, rendered in a rainbow spectrum removed from a human-made environment.

Continue reading

Leave a comment

Filed under anthropogenic change, climate change, coastlines, ecological disasters, gulf coast

Droughtshaming!

Will the hashtag #droughtshaming change public water consumption levels in California?  Or is it only a manifestation of an all too long-submerged consciousness of evident property differences across most of Southern California–a space where ever-conspicuous consumption has long been made manifest in keeping yard lawns perpetually green?   And what of the Wet Prince of Bel Air, who has used an incredible 11.8 million gallons yearly during the drought to keep the yards of his southern California estate green?

Almost as powerful a portmanteau as “mansplaining,” the compound currently trending on Twitter presents a righteous form of indignation, improvising a map via social media that suggests our changing sense of our environment may open new arenas of public speech. A set of zoomable interactive maps from the New York Times of projected water-cuts and current water-usage across the state’s water districts has recently mapped an uneven balance between districts statewide, calling clear attention to sharp discrepancies of water-usage across the state–not only in how urban and agricultural regions might be affected by mandated reductions in public water usage–

 

central valley water cuts

 

but what might be called the selective yard-drenching in specific regions of the south-lands, according to the same interactive data visualization–

 

 
drenching yards in 2014-15 in LA

 

and the notable persistent over-use of water in wealthier areas of LA’s per diem consumption of water this past winter–

 

LA Consumption habits per diem Winter 2015

 

The map above offers an approximate reflection of a topography of disposable income, described by UCLA’s California Center for Sustainable Communities.  The Center quite recently found not only that the “wealthy used more than three times the rate of non-wealthy people,” but that wealth was the most conspicuous correlate and predictor of water use–and watering lawns, as we have long known, is an increasing sign of conspicuous consumption even in an age of drought.

Is this a decision to spend more on water, or is it, as seems more likely, the conspicuous expenditure of water on yards, perhaps fueled by the cost of letting all that greenspace go dry, or by the actual fire hazards that letting lawns go dry might create?  The oft-cited datum that Beverly Hills residents daily “used” some 286 gallons of water during September 2014–while northern and coastal San Diego County consumed some 584 gallons in the Santa Fe Irrigation District–contrasts sharply with Compton residents served by the LA Department of Water and Power, who restricted themselves to 93 gallons a day, and Angelinos in East LA, at some 48 gallons.
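The cited figures make the scale of the disparity easy to check; a quick back-of-envelope comparison, using only the per-capita daily gallons quoted above, shows the Santa Fe Irrigation District running at roughly twelve times East LA's consumption:

```python
# Per-capita daily water use in gallons, from the figures cited above.
daily_gallons = {
    "Santa Fe Irrigation District": 584,
    "Beverly Hills": 286,
    "Compton (LADWP)": 93,
    "East LA": 48,
}

baseline = daily_gallons["East LA"]
# Express each district as a multiple of East LA's consumption.
multiples = {d: round(g / baseline, 1) for d, g in daily_gallons.items()}
# Santa Fe ~12.2x, Beverly Hills ~6.0x, Compton ~1.9x East LA's use
```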

But the point bears repeating at a time when Governor Brown wants to mandate across-the-board 20% reductions in water use as a means of increasing efficiency, if only to ask what the best manners of mandating reductions are.  Dividing water-usage by census tract reveals clear patterns in LA County, making it something of an epicenter–to mix geographic metaphors–of the recent rash of tweets about excessively selfish individual water use at Beverly Hills mansions that include, in some cases, spas and vineyards as well as expansive still-green lawns:

 

Water:Income LA

 

But rather than only call attention to the sociological correlation between water-waste and wealth, this post wants to ask questions about the ethics of the spontaneous mapping of water-waste that has proliferated in Angelino social media, as if to sharpen critiques of the lack of social responsibility of the wealthy in a city of sharp social divides–even as remote sensing promises far more detailed results for select Los Angeles neighborhoods, drilling more deeply into the extent of watering of lawns, flowers, and trees that underlies such datasets.  Human-scale photographs posted on social media via Twitter, however, were the initial means of assembling immediately available instances of water over-use.

The spontaneous mapping of such inequalities on social media is a sort of crowd-sourced shaming meant to redress unspoken social inequities, with offending addresses laid out on twitterfeeds for the public to see, lest anyone be confused about who has the public interest at heart and who is most concerned with keeping the brown grass at bay.  Something approximating collective rage against the overwatered large yard has become an exercise of collective shaming, which has gained a real edge since mid-2014, as the state is poised to levy hefty fines on identified water wasters.  It has triggered a geographical awareness of the steep inequities of water use that comes close to socially sanctioned class-consciousness–

 

droughtshaming

 

–and its effects on the lived landscape of Beverly Hills lawns:

 

 

Streisand

Such selective outing of outrageously, cartoonishly disproportionate use of water utilities may run the ethical risk of crowd-sourced surveillance, where aerial photography approaches NSA-style snooping via overhead drones–the regional sustainability manager for Sacramento’s Utilities Department was said to be “pleasantly surprised” at such snitching last summer, when #droughtshaming took off on the Twittersphere.  But the current spate of tweeted outrage has also become a venue for suppressed sentiments of class struggle, very slightly veiling disgust at the profligate over-watering of lawns indulged by those running automatic sprinklers as if they were draining regional aquifers single-handedly, with little heed for state-wide water shortages–brought to the fore in signs posted in public parks that remind users that “Brown is the New Green.”

 

“Brown is the New Green” (Aaron Mendelson/KQED)

 

Tweets are most famous for unleashing wrath against the privileged who are out of touch with the reality of water-needs–

 

green lawns

OhMo

Kim

–at the fact that rhythms of daily consumption are so drastically different, by large multiples, across a single city.  And is it even a surprise that the mansions of three and a half acres we’ve become used to viewing, and vicariously living in, on reality TV have been most notoriously cautioned by local Municipal Water Districts to cut their water use drastically?  (Both Barbra Streisand and Kim Kardashian have publicly agreed to curtail their water use–“Kim takes this drought seriously,” said a representative; “she has no problem letting her grass go brown.”)

The targeted social criticism is by no means limited to the super-wealthy:

Sprinklers Running since <7AM

The steep social discrepancies in water-use have thrown into relief the divided economic structures of the city that we’ve long known about from the American Community Survey–Orange County and Palos Verde residents used respectively three and two times the state-wide per capita daily consumption rate in February 2015–but now suggest that water wastage among the wealthy is undermining the public good in a clearly mappable manner.  We have long seen larger yards in specific neighborhoods, but watering practices seem to have grown out of hand, expropriating a public resource with obliviousness, even while we blame “nature” for a drought that is, it is increasingly evident, largely man-made–and perhaps as due to human nature as to climate change.

LA in detail

 

During the summer, such deep discrepancies of daily water consumption are of course placed into even further relief in data visualizations of local levels of consumption, reflecting an apparent rationalization of increased water usage as well as a readiness to cover rising water costs, as lower-income families responded more rationally to higher water prices.

 

LA summer of 2014

 

To be sure, Northern California has done fairly well in reducing consumption since spring 2013–

 

usage change nocal

 

But it is also true that the aerial photographs of the ambient effects of income inequality–Google Earth images that went viral after being posted on persquaremile–reveal the grey v. green dichotomy to be by no means limited to the southland–

 

oak:piedmont

 

Such a democratic appropriation of Google Earth may have paved the way for the tweeting of the extravagant consumption of water that has become all too evident in some of the larger Beverly Hills yards, which can be linked to specific addresses.

The effect of calls for greater restraint in water usage since March 2013 is far from clear in much of the greater Los Angeles area, as posters on social media have not only realized, but realized they were able to point out publicly.

 

SoCal 2013-15

Both a more equitable distribution of water access and a rethinking of such deeply-lying assumptions of a personal prerogative to waste water deserve attention as Californians try to curb continued water use in a responsible manner.  We will have to cross swords with some of the deeper espousers of a free market of deregulated water consumption, but at this point, for better or worse, deregulation has its back snugly against the wall.

Water utilities remain reluctant to identify wasters of boggling amounts of public water–the Los Angeles homeowner known only as the Wet Prince of Bel Air won that name for pumping an incredible 11.8 million gallons a year to his estate during the recent drought.  The recent news that 100 residents of such wealthy Los Angeles neighborhoods as the Westside have been pumping millions of gallons apiece has called for more effective means of recourse than Twitter revenge, as such outing bears little fruit.  In the light of recently passed laws against over-use of water, remote sensing technologies have been used by journalists at Reveal eager to even the score:  taking advantage of new fines assessed against excessive water use, mapping through DigitalGlobe and others provides a deeper survey of water use than would be released by Los Angeles’ compliant Department of Water & Power.  Indeed, the Center for Investigative Reporting has begun to “out” high water-users by remote sensing–and to publish the maps!

Given the limits of Twitter photographs in documenting public instances of water overuse, journalists–perhaps inspired by droughtshaming–have turned to remote sensing of the expansive indulgence of overwatering in such somewhat reclusive sites as Bel Air, assembling an accurate record of water-use that maps high use onto individual estates, tracking the greening of their gardens by Google Earth and DigitalGlobe and assessing exactly how healthy those yards are.

BelAirOverview20160909.jpg

Using a form of remote sensing developed to detect plant health, common in agricultural assessment, the Normalized Difference Vegetation Index (NDVI) helps pinpoint individual culprits of water over-use whose identities would otherwise be kept hidden by the county:  by measuring the living vegetation that continues to absorb the visible wavelengths of light used in photosynthesis, it creates a unique dataset of those with the largest living yards in the municipality.

For the primary culprits can be identified by remote sensing of the living green vegetation that remains on such sites as the heavily wooded estate maintained by movie producer Peter Guber, part-owner of the Golden State Warriors, who indulges his wooded estate with over 2.8 million gallons of water each year, even while pushing the Warriors to take up a home in San Francisco to boost their revenues; the 42-room French-style chateau from TV’s “The Beverly Hillbillies,” whose owner, former Univision CEO Jerrold Perenchio, uses up to 6.1 million gallons each year to water his plants and gardens; or the 28,000-square-foot “Bellagio House,” whose floral gardens suck up over 4.6 million gallons per year.  The NDVI, which combines red and near-infrared light, has become sufficiently refined from satellite or drone observation to parse and describe water use and its impact on plants with great precision–as is evident in the MODIS satellite maps of groundwater in the United States–and to present a highly sensitive reading of vegetation health at precise moments in time, and within given parameters of health, by mapping the presence of water in plants as one would map the presence of water in the ground.
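The index itself is simple arithmetic on two reflectance bands.  As a sketch only–the reflectance values and the “still-green” threshold below are hypothetical, not drawn from the Reveal analysis–a per-pixel NDVI looks like this:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for a single pixel.

    Healthy, well-watered vegetation reflects strongly in the
    near-infrared while absorbing red light for photosynthesis,
    pushing NDVI toward +1; dried-out lawns sit near zero.
    """
    total = nir + red
    return 0.0 if total == 0 else (nir - red) / total

# Hypothetical reflectance values for two yard pixels.
lush = ndvi(nir=0.50, red=0.08)     # a well-watered lawn: NDVI ~ 0.72
browned = ndvi(nir=0.30, red=0.25)  # a lawn gone brown:   NDVI ~ 0.09

GREEN_THRESHOLD = 0.6  # hypothetical cutoff for flagging a lot
still_green = lush > GREEN_THRESHOLD
```

Computing this for every pixel of a satellite scene and averaging within parcel boundaries is what lets a red-shaded (high-NDVI) estate stand out against a yellowed-out neighborhood.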

NDVI.jpg

By means of a similar remote sensing with NDVI, one can effectively map lots’ local water saturation at a scale fine enough to detect individually owned gardens, such as those Guber indulges on Lausanne Road in Bel Air–outlined below, with relative vegetative health shown in red, the highest range of the NDVI–as an accurate way to assess the extent of living vegetation.  Using red and near-infrared light to measure the local health of vegetation with amazing sensitivity, much as in familiar global maps–but only recently possible at such a fine scale, thanks to DigitalGlobe–the index can not only identify individual culprits of water over-use but presumably take them to task.

Guber's estat.png

–or the Casa Encantada owned by Gary Winnick–

Casa Encantada.png

For unlike the yellowed-out areas of most of even Bel Air, the bright red expanses suggest an odd over-nourishment of gardens even in a time of drought that indeed seems quite newsworthy–and is perhaps visible through DigitalGlobe alone.

Casa Encantada trees.png

–and can also be mapped, if with less clear-cut results, by soil moisture:

Soil Moisture.png

While such remote sensing from satellites had been confined to national regions at specific times of year,

600px-ndvi_062003

600px-ndvi_102003

–or used to map global differences in plant health–

600px-Globalndvi_tmo_200711_lrg.jpg

–the local assessment of those who over-indulge in caring for their lawns and flowers is both something close to surveillance and perhaps a form of surveillance that recent laws about water use have sanctioned in California during our current drought.

The odd triangles and spots of green that remain in the drying-out landscape in which most of the rest of us live (spot the non-arboreal light green tract in the tan landscape shown below?) reveal the levels of water waste that demand to be curtailed, and are emblematic of the golf courses and overwatered farms of which we’ve just begun to take stock.

FullSizeRender-11

1 Comment

Filed under Bel Air, California drought, climate change, mapping drought, Remote Sensing

Arctic Circles

On our annual northward migration to Ottawa this December, we gathered around the unused fireplace in an unheated living room during the warmest Canadian Christmas in personal experience–as well as in the public record for Atlantic Canada, where local rainfall has surpassed all earlier recorded years.  Perhaps because of this, discussion turned to ownership of the North Pole for the first time in some time:  what was formerly a featureless area of arctic ice has become, as the receding polar ice-sheet exposes possible sites of petroleum extraction, an area of renewed land grabs and claims of territoriality, its value for nations understood primarily within a global market of energy prospecting.  The story of the new mapping of territorial claims around the arctic ice cap goes back decades, to the exploration of offshore polar drilling, but the exposure of land raises new questions for mapping because the boundaries of polar sovereignty are contested, even as oil companies speculate by modeling sites of future exploration for petroleum deposits.

One assumption circulated that the place was Canadian by birthright–birthright to the Arctic?–since it is so central to national mythistory.  But there’s as much validity to its claims as to the more strident claim the explorer Artur Chilingarov made to justify planting a Russian tricolor in the murky ocean bed 2.5 miles below the North Pole, during the 2007 polar expedition of the Mir submersible, with the blunt declaration that “The Arctic has always been Russian.”  Canadian PM Stephen Harper did not hesitate a bit before decrying such claims to territoriality, warning his nation of the danger of Russian incursions into the arctic in his tour of Canada’s North, thumping his chest and professing ongoing vigilance against Russia’s “imperial” arctic designs as a national affront while addressing troops participating in military maneuvers off Baffin Island as recently as 2014.

Harper’s speech might have recalled the first proposal to carve pie-shaped regions in a sectorization of the North Pole, made by the early twentieth-century Canadian senator Pascal Poirier, who full-throatedly proposed to stake Canada’s sovereign claims to land “right up to the pole,” transforming what had been a terra nullius into objective territory.  Poirier claimed jurisdictional contiguity in declaring “possession of all lands and islands situated in the north of the Dominion.”  His project of sectorizing the frozen arctic sea and its islands, first launched shortly after Peary’s polar expedition, has regained its relevance in an age of global warming, arctic melting, and climate change.  But the reaction to the expanding Arctic Ocean in a language of access to a market of commodities has inflected–and infected–discussion of the rights of territoriality, in ways that obscure the deeper collective problems and dilemmas that global warming and arctic melting broadly pose.

Arctic Territorial Claims

Encyclopedia Britannica

The question of exactly where the arctic lies, how it can be bounded within a territory, and how such an economically beneficial “good” as parts of the North Pole might get away from Canada has its roots in global warming–rather than in conquest.  The dramatically rapid shrinkage of ice in the Arctic Sea has raised newly pressing issues of sovereignty:  widespread melting has made questions of the exploitation of natural resources and potential routes of trade–and so of the ownership of the Arctic Ocean, the mapping of territorial rights to the seas–increasingly pressing, as some 14 million square kilometers of Arctic Ocean have emerged not only as open for exploration, but as covering what has been estimated at 13% or more of the total reserves of oil remaining to be discovered worldwide.

The Economist

While it seemed unrelated to the ice melting from nearby roofs, or to the large puddles on the streets of Ottawa, conflicting and contested territorial claims have recolored most maps of the Arctic so that its sectors recall the geopolitical boardgame RISK, that wonderful material artifact of the late Cold War.  Rather than map the icy topography of the region as a suitably frosty blue, as Rand McNally would long have it, we now see contested sectors of the polar regions whose borderlands lie along the Lomonosov Ridge (which runs across the true pole itself).  The division of the pole so that it looks like post-war Berlin is an inevitable outcome of the fact that the arctic is warming at twice the rate of the rest of the planet, opening an area that was for so long rarely mapped, and almost always colored white with shades of picturesque light blue to suggest its iciness.

The lands newly revealed in the northern climes have, however, led territorial claims of sovereignty to be staked in a four-color scheme of mapping.  The uncovering of arctic lands–in addition to new technologies for underwater oil extraction and sensing–has complicated existing maps of ocean waters premised upon expanding territorial waters an additional 278 kilometers beyond what can be proven to be an extension of a landmass’s continental shelf–expanding the rights to Arctic waters of the United States, Denmark, and Canada under the United Nations Convention on the Law of the Sea (UNCLOS), which sought to stabilize competing claims to arctic sovereignty on scientific grounds.
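If the 278-kilometer figure corresponds to the 150 nautical miles of extended continental shelf claimable beyond the 200-nautical-mile exclusive economic zone–an assumption, since the post does not spell out the derivation–the arithmetic works out as follows:

```python
NM_TO_KM = 1.852  # one international nautical mile in kilometers

EEZ_NM = 200              # exclusive economic zone under UNCLOS
SHELF_EXTENSION_NM = 150  # further claim where the continental shelf
                          # can be shown to continue (assumed reading)

extension_km = SHELF_EXTENSION_NM * NM_TO_KM  # 277.8, i.e. ~278 km
outer_limit_nm = EEZ_NM + SHELF_EXTENSION_NM  # 350 nm from baselines
```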

Arctic Boundary Disputes

The issues have grown more complex as the melting of Arctic ice has so dramatically expanded in recent years, exposing new lands to territorial claims staked on a map that unfortunately seems more and more to resemble the surface of a board game.  Even more than revealing areas that went largely unmapped for centuries, the melting of the polar cap’s ice in the early twenty-first century has opened access to untapped oil and gas reserves–one-eighth of global supplies–and the attendant promise of economic gains.  Given the extreme rapidity with which polar temperatures have risen, the promise of extraction has given new urgency to mapping the poles and the ownership of the holes that will be drilled there for oil exploration:  instead of being open to definition by the allegedly benevolent forces of the free market, the carving up of arctic territories and disputes over who “owns” the North Pole are the natural follow-through of a calculus of national interests.  Hence the recent opening of possibilities of cross-arctic trade that don’t involve harnessed Alaskan Huskies drawing dog sleds:  the decline in arctic ice-cover, as measured several years ago, had already by 2011 opened trade routes like the Northwest Passage, long figures of explorers’ spatial imaginaries but all of a sudden being redrawn on maps that raise prospects of new commercial routes.   New regions assume names long considered but the figments of the overly active imaginations of early modern European arctic explorers and navigators in search of sea routes to the Far East.

“The Melting North,” The Economist

On the one hand, these maps are the end-product of the merchant-marine wish-fulfillment of the eighteenth century, as in the maps of the French Admiral Bartholomew de Fonte, which promised that he had personally discovered several possible courses for overcoming a trade deficit caused by British domination of Atlantic waters, allowing easy access to the South Seas.  The imagination of such routes proliferated in a set of hopeful geographies of trade that weren’t there, of which de Fonte’s General Map of the Discoveries is an elegant mixture of fact and fiction:  imagined polar nautical expeditions of a fairly creative sort, presenting illusory open pathways as new discoveries to an audience easily persuaded by the mapping of ocean pathways, even if impassable, and eager to expand opportunities for trade by staking early areas of nautical sovereignty–promising potential navigational itineraries from Hudson Bay or across the Tartarian nation of the polar pygmies:

arctic1772-full-1

Open-ended geographies of land-masses were given greater credibility by the dotted lines of nautical itineraries from a West Sea above California to Kamchatka, a peninsula now best-known to practiced players of the board-game RISK:

0078em

As well as imagining the increased potential of shipping routes that can speed existing pathways of globalization, the meteorological phenomenon of global warming has also brought a global swarming to annex parts of the pole, in confrontational strategies reminiscent of the Cold War that tear a page out of the maps of the board game RISK!, which give a similar prominence to Kamchatka.  Will their growth lead to the naming of regions that we might be tempted to codify in a similarly creatively improvised manner–even though the polar cap was never itself included in the imaginative maps made for successive iterations of the popular game of global domination, maps that provided a basis for a subconscious naturalization of the Cold War for generations of American boys, even while rooting it in the age of discoveries and large, long-antiquated sailing ships?

pic324841

RISK (1968)  

Later versions took a less clearly vectorized approach, imagining a new constellation of states but also, for the first time, including animals–and updating those schooners to one sleeker ship!


Risk!, undated  

The more updated current gameboard is curiously more attentive to the globe’s shorelines, as if foregrounding a new sense of threatened in-between areas, increasingly prone to flooding and less inviolable, while also suggesting an increasingly sectorized world of geopolitics, less rooted in individual nation-states.

risk-1

Risk–current board

Will future editions expand to include the poles as well, before they melt entirely, as the ways they have become contested among countries percolate in the popular imagination?

We must wait to see what future shorelines will be codified in a special ‘Global Warming Edition’ of RISK–in addition to the many editions already in existence in the gaming marketplace.

If the game boards suggest Christmas activities of times past, the ongoing present-day game of polar domination seems to be leading to an interesting combination of piece-moving and remapping, with less coordinated actions on the parts of its players.  We saw it first when Russia sent the Mir submersible up to the North, which precipitated Norway’s claim of territoriality over a sizable chunk of Arctic waters around the island of Svalbard; then Denmark, on December 15, restated its own claims–no doubt with a bit of jealousy for Norwegian and Swedish oil drilling–to controlling some 900,000 square kilometers of Arctic ocean north of Greenland, arguing that they in fact belong to its sovereign territories, and that geology reveals the so-called Lomonosov Ridge itself to be an appendage of Greenland–itself a semi-autonomous region of Denmark–upping the ante on its claims to the pole.

While the Russians were happy to know that their flag was strategically, if not so prominently, placed deep underwater in the seabed below the pole, the problem of defining the territorial waters of the fast-melting Arctic upped the ante for cartographical creativity.   The recognized limit of 200 nautical miles defines the waters where exclusive economic claims can be made, but much of the melting Arctic Ocean lies outside the claims of Canada–although it, too, hopes to stake sovereignty over a considerable part of the polar continental shelf by extending its claims northward from current jurisdictional limits to divide the mineral wealth.  Were the Lomonosov Ridge–which isn’t moving, and lies above Greenland–to become a new frontier of the Russian state, Russian territory would come to include the pole itself.

LOMOSONOV RIDGE.png

Bill Rankin/National Geographic

The actual lines of territorial division aside, the diversity of names for a single region indicates the competing claims of sovereignty that exist, as if a historical palimpsest, within an actual map of the polar region:  the Amundsen Basin lies beside the Makarov Basin, the Yermak Plateau beside the Lena Trough and the Barents Plain, suggesting the multiple claims of naming and possession as one approaches the North Pole–without even mentioning Franz Josef Land.

LOMOSONOV RIDGE
amundsen basin dotted lines of contestation?.png
Contestation of the Pole

While the free market isn’t able to create an exactly equanimous or impartial division of land-claims, new levels of irrational exuberance over mineral wealth have led Denmark to advance claims to owning the North Pole, and oil-rich Norway to assert its rights to at least a sixth of the polar cap, given its continued hold on the definition of the northern lands.  The increasing claims of proprietary rights of polar ownership among nations have led international bodies such as the United Nations Convention on the Law of the Sea (UNCLOS) to hope to codify the area peaceably by shared legal accords–presumably before the ice-cover all melts.

The maps of speculation on the “Arctic Land Grab” are economically driven, and suggest an extension of offshore speculation for oil and gas that has long roots–but which never imagined that these claims could be so readily concretized in a territorial map as the melting of the ice cap now suggests.  As technical maps of prospecting are converted into maps with explicit territorial claims, planned or laid pipelines are erased, and the regions newly incorporated as sites of territoriality in ways that earlier cartographers would never have ventured.

rankin polar maps
Bill Rankin/Radical Cartography

Canadians have long planned the laid or projected pipelines by which to pump and stream oil across much of Upper Canada from the Chukchi Sea, North Slope, and MacKenzie Delta.  Similarly, the Russian government, echoing earlier claims of Russian tsars to straddle the European and Asian continents, has claimed the underwater Lomonosov Ridge as part of the country’s continental shelf, even though it lies outside the offshore Exclusive Economic Zone–as is permitted by UNCLOS, so long as the edge of the shelf is defined.

Canada took the liberty of remapping its own territory this April, in ways that seem to up the ante in claims to Arctic sovereignty.  In updating the existing map of 2006 to make it appear that more ice exists in the Arctic than in the past, the Atlas of Canada Reference Map seems to augment the country’s sovereign claims to the region in ways clothed in objectivity:  even as Arctic ice-cover undeniably and rapidly melts in a decades-long trend, the ice-cover in the region is greatly expanded in this map in comparison to that of 2006, and the northern parts of Canada are given a polemic prominence in subtle ways–by the use of a Lambert conformal conic projection, and by a greatly expanded use of aboriginal toponymy to identify even lands that belong to a different sovereignty–as Greenland, here Kalaallit Nunaat–in terms that link them to indigenous Canadians, and by extension to the nation.  Both tools of mapping appear to naturalize Canadian claims to the Arctic in a not so subtle fashion.  Moreover, the map stakes out exclusive economic zones around Arctic regions:  even as the Arctic rapidly melts, disputed islands near Greenland, like Hans Island, are shown clearly as lying in Canadian waters.

Canada with Polar Claims, Parks

Perhaps what exists on paper trumps reality, creating an authoritative image of an expanded Arctic–a white plume that expands the amount of Arctic ice beyond the rendering of the Arctic Sea in its earlier if now outdated predecessor.

It is instructive to look backwards, to grasp the earlier strategic sense invested in the Kamchatka Sea before it migrated into RISK! The pre-fifty-states rendering of this Russian area as an independent sea, fed by the Kamchatka River, imagined it as an area apart from the Pacific, bounded by the archipelagos of a future Alaska, as if to create an oceanic theater of entirely Russian dominance, above the “eastern ocean” of the Pacific, and almost entirely ringed by what must have seemed essentially Russian lands.

The above map has, of course, nary a reference to a pole, but an expanded sea remaining fully open to navigation with charts.

What exists on paper, once officially sanctioned, seems to stand as if it will continue to trump the rapidly shrinking extent of Arctic ice.  The map trumps reality by blinding viewers, or keeping their heads buried, ostrich-like, in the proverbial sand.  The decision to show the thirty-year median of September sea-ice extent in the years between 1981 and 2010 brings the map into line with the way Environment Canada computes sea-ice extent.  And the augmented Inuit toponymy for regions near the Arctic recognizes the indigenous role in shaping Canada’s toponymy.  But it would be hard to say that either would have been advanced if it did not have the effect of expanding Canadian sovereignty into the Arctic.  The reality the map charts mirrors the shifting interests of the state at a time of shrinking Arctic ice more closely than it shows the effects of global warming on the ice-cover of the northern regions, let alone the Arctic itself.  With more maps that diminish the effects of global warming, the orienting functions of the map seem themselves called into question.

Merry Christmas indeed!

2 Comments

Filed under arctic, climate change, climate monitoring, Global Warming

Smelling the Coffee

Coffee beans lying on a burlap sack map the entire world, but belie the fact that the area suitable for growing coffee stands to be reduced by as much as half by 2050 if climate change continues, and if the Arabica beans grown in tropical highlands stop receiving the year-round rains that not only enable but nourish their growth.  If that isn’t a wake-up call, what is?  Some 25 million farmers–most of them small farmers–rely on their production of the beans, but predicted rising temperatures could radically reduce and effectively circumscribe what those in the know call the “coffee map” that tracks the traffic in beans, their shipping across seas, and their arrival for roasting in the newly popular epicenter of roasting, the United States of America–a country now so seemingly united by caffeine that Starbucks CEO Howard Schultz could pass as a credible candidate for U.S. President, able to boast of an ability to change the electoral map favorably enough to Republicans to inspire angst in the Democratic Party.

The threat of disruption by an independent candidacy of the coffee bean mogul was a nightmare that seemed destined to follow the fear of a new post-Trump oligarchy that deflated the Democratic Party, and it inspired a round of critical commentary from the oligarchical crowd of Bloomberg, Buffett, et al., who presented the real threat of a coffee-driven candidacy as a thwarting of the American people–perhaps by the promise of lowering the price of a venti.  Yet the conversion of beans to espresso-based drinks like cappuccino poses questions rooted in old-style economic geography–of the circulation of beans to roasteries, and the currency of capital from coffee–that demand mapping:  Schultz perhaps represents a new wave of capital from the self-fashioning of economic wealth in the different domains embodied by Bloomberg, Trump, and Buffett, far more rooted in investment in intangibles, but tied in more direct ways to processes of global warming that we rarely see in a cup of coffee or in that scoop of opaquely glistening dark beans.

The sorts of disruption in growing beans that localized drought in those tropical highlands seems almost destined to bring–the worst dry spell in decades hit Brazil’s coffee belt, destroying a third of the crop, and coffee bean yields in Tanzania have declined by half since the 1960s as temperatures warm and the ground becomes wetter–stand not only to raise the price of the two and a quarter billion cups of coffee we humans consume daily, in what seems like a private experience, but to mean that the beans will no longer be so abundant in those burlap bags, or so plentifully and bounteously displayed in buckets that tempt the spending of a few bills on an easy pick-me-up that feels all too eerily like an intravenous feed.

In Colombia alone, the coffee leaf rust that is the consequence of warmer and wetter weather stands to damage some 60% of the country’s agricultural land by 2050, making for a decreased abundance of beans and an ever more removed cup of joe.  Indeed, coffee beans are particular enough about growing conditions that, for optimal production, the temperature should remain between 18 and 21°C; once it rises above 23°C, bean quality declines and the plant grows far too fast, noticeably changing its taste, flavor bouquet, and aroma . . . and a detectable shift occurs with a rise of only half a degree at the wrong time in the growing season, in ways that could change coffee harvests far sooner than one might expect.
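The narrowness of that window is easy to underestimate; the temperature bands above can be sketched as a toy classifier (the thresholds are the ones cited in the text; the function name and band labels are illustrative only, not an agronomic model):

```python
# Toy classification of Arabica growing temperatures, using the thresholds
# cited above: 18-21°C is optimal, and quality declines past 23°C.
# The band names and function are illustrative assumptions, not agronomy.

def arabica_quality_band(temp_c: float) -> str:
    """Classify a mean growing-season temperature in degrees Celsius."""
    if 18.0 <= temp_c <= 21.0:
        return "optimal"
    elif temp_c <= 23.0:
        return "marginal"   # outside the optimum, not yet degraded
    else:
        return "degraded"   # taste, bouquet, and aroma noticeably change

# Even a half-degree shift can cross a band boundary:
for t in (19.5, 21.3, 23.5):
    print(f"{t}°C -> {arabica_quality_band(t)}")
```

The point of the sketch is simply that the margin between a good harvest and a changed one is a fraction of a degree, far finer than the coarse color bands of most climate maps.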

Coffee beans map

More than almost any other sector of life, the tremendous growth of coffee consumption over the past decades invites a daily morning exercise in geographic literacy.  In hubs of metropolitan coffee drinking, the local origins of beans are invested with a provenance supplemented by detailed descriptions of their practices of cultivation.  The geographic here primarily signifies, or maps to, the gastronomic; the provenance of coffees has a deeper resonance than the locations of their roasting, and erases the huge distances traversed in their transportation, or the local climates in which they were produced.  The importation of coffees and the costs of coffee farming are naturalized within local coffee bars as a geographic palate, as we grow accustomed to having Sulawesi, Rwanda, Burundi, and Tanzania among the rotating coffees of the week, and in our cups; their names, instead of the once-dominant Colombian or Brazilian beans, are part of routines often stripped of geographic signification as place-names–even though they might be better understood as a geography lesson with climate warming as its subject.

The most globalist of all coffee chains, Starbucks, openly invites us, in the age of globalism, to regard the map as a decorative object, its place names overlapping and indeed colliding with descriptors of the goods, on a map that seems more decorative than geographical in content–recurring as a backdrop for savoring the aromas of its blends, the global map colored, predictably, a rich coffee-shaded hue, all boundaries elided or obscured in a world that seems saturated by java, as if for the benefit of the drinker’s taste, providing a decorative map to enhance the drinking experience in a roadside Starbucks,

the descriptive or denotative content of the map quite intentionally reduced to a backdrop for taking a coffee break, pausing in one’s itinerary as one drives along a California freeway and stops at a rest stop.  It is striking that this mock-antique global map makes barely any attempt to render any place legible, only vaguely treating the map as a sort of division of coffee flavors–treating the globe as saturated by the flavor of beans rather than by states or nations; descriptors are suspended, as if disembodied, across a mock gold-leaf field, indicating the earthy, herbal, and chocolate tastes of coffees from “Asia” or the “Pacific”–rather than specifying distinctions among beans originating from Indonesia, Yemen, Java, Bali, New Guinea, or Ethiopia, where the leaves of coffee plants stretch beneath the Tropic of Cancer.  The lands are a uniform brown, saturated with coffee flavors of disturbingly uniform tone; the descriptors lie on a gold field as if to offer themselves as what we sense as coffee consumers.

Indeed, the region can be displaced by the descriptor of the beans, whose deep color provides the more compelling description of the coffee–one that seems to have displaced the very legibility of place, if not the notion of it in a map–as the regions become the site of sensing the flavors of “nutty” coffee beans; the islands are left distinctly defined but without any sense of sovereign identity, and the coastal islands of possible cultivation seem almost infinite in number.

As we drink coffee daily, and isolate the coffee-drinking experience as an ecstatic one that almost exists without place–though it is based on the construction of a place to drink coffee apart from the work world or the street, in a space of selective privilege and almost private intimacy–can maps even help us to process the origins of coffee plants, or the mystification of converting a place-name to a descriptor, in ways that seem to extract the beans’ flavor from their site of origin?  They might not disrupt the individual experience of a good cup of coffee, an intensely pleasurable and even intensely neurologically stimulating activity, but they could map the names tied to the pleasure of caffeination onto the broader lay of the land, moving from the domestic sphere of coffee drinking and the half-private space of cafés where the precious brown liquid now circulates before being imbibed at considerable cost, all too easily concealed in the privacy of our pleasures–

325105_3f7f2740198ea41e_b

Louis-Marin Bonnet, “Woman Taking Coffee” (1774), Cooper Hewitt (inked engraving)

–to the global world in which this space must be situated, and the global markets on which our all too easily internalized habits of caffeination increasingly depend.

The lopsided distribution of this conversion of toponymy into gastronomy is evident in a map of where this huge rise in coffee consumption and importation has occurred.  If coffee beans were introduced to the future United States in the eighteenth century, the country has since become among the leading importers of beans worldwide.  Americans now consume some 23 gallons of coffee per person each year (or 22.1, according to Wikipedia)–far below the 48 gallons Americans were said to consume annually in 1946, or the 62 gallons downed annually in far chillier Finland.  Yet it has become part of the bloodstream–literally–and an unprecedented (although we lack earlier metrics) 161,000 folks listed coffee making or serving as a “skill” in 2013.

Screen shot 2013-02-11 at 9.26.23 AM

We might call this the browning of North America, ignoring that the distribution of local blends–pioneered by Peet’s and popularized by Starbucks, before being refined by Blue Bottle or Four Barrel–masks variations in a topography of coffee drinking in the United States far more variegated than the map’s homogeneous brown.  (The fact that Canada is yellow may seem comforting, but it conceals the very urban nature of this social ritual:  notice how those brown dots congregate around Toronto, Montreal, Vancouver, and Ottawa . . . )   The absurdity of nationally ranking habits of daily caffeination aside (although there’s an academic press title in here somewhere), the most striking aspect of this map is the huge area of the world left in white–shown here only from the equator north, since no countries below the equator import coffee beans in such quantity.

Coffee IMporting Countries- Top Ten
legend

Indeed, the ten largest coffee importers, mapped in yellow, not brown, suggest an imbalance:  the products of equatorial countries tend to wake up folks in northern climes.

mapM

National Geographic

The more detailed mapping of the production of coffee by bags of beans, mapped by Oxfam for 2001, showed a nicely skewed data distribution, with those non-growing regions left suitably blank, as if they thirsted for the brown stimulant that came pouring (or steaming) in from equatorial climes:

Oxfam 2001

Oxfam

There is a clear “coffee belt” whose discovery and demarcation the Coffee Grower Association of Hamburg claims responsibility for:

coffee_belt

German Coffee Association

The Starbucks map rendered the geographic precision of “our coffee belt” in decorative terms:  the flowers and pods of the beans overlap the geographical content, and the letters on the map become almost decorative forms, colored the hue of coffee.

IMG_4898.JPG

And a lot of imported coffee is needed–creating what has long been a pretty big business interest in the US, even before the metropolitan boutiques of the 1990s.  Once this huge amount of unroasted beans is divided per capita–rendering regions like Canada and Scandinavia distinctly darker–and measured by consumption in cups per day alone, the geographic distribution looks a bit different, as in this clickable 2011 map of coffee consumption per capita circa 2008:

2011 coffee map
l_581_521114deb92a786b0f86b920c30990d5-1

Which returns us to the interests that all maps conceal.  By the alchemy of toponymy, the beans themselves derive, of course, from the very equatorial regions that are the sites of forestry.

Coffee and Forests map--50 def:37 cps

This is evident in this far more anodyne map, prepared by someone trained in the school of neo-corporate graphic design:

I appreciate the hand-drawn oval projection, crafted with care in the midwest city of big shoulders, which suggests that if all roads led to Rome, all beans flow to Chicago:

Global sources of coffee on ms map

A more informed map might link cultivation not only to forested regions, but to the very “hot spots” whose local biodiversity is most threatened by global warming, and where the inefficient use of water widespread in coffee cultivation is least practical and most pernicious.  For the beans come not only from equatorial areas, but from some of the driest areas on earth:

hotspots_coffee_map

Conservation International

The overlap of the ‘hot spots’ map onto coffee-growing regions alarms:  coffee cultivation is widespread in 16 of the 34 most threatened ecosystems.  And this is the tip of the iceberg of the paradox–to use a somewhat mixed metaphor, unless one considers iced coffee:  privileging the locality of cultivation in maps of caffeine consumption reinforces the fragility of local ecosystems.  This is a very different map, cartographically speaking, from the manner in which an earlier cartographical image set the details of consumption as an inevitable but conscious choice, concentrated in one icon, as opposed to the naturalization of growing in isolated pockets of uniform, unattractive gray:

Coffee Map of the World

Indeed, the conscious coffee drinkers at the Water Footprint Network in the Netherlands have mapped the severely disproportionate gross virtual water imports that result worldwide:

Coffee's Water Footpring

waterfootprint.org
The “flows” of coffee beans might be mapped, if somewhat less legibly, in comparison to those of chocolate, whose parallel commerce from the New World–as Mary Norton reminds us–made it a popular stimulant quite similar in function to coffee:

Coffee and Chocolate

In part, this is a local story, with much of the jumping value of java registered in the Port of Oakland, as shown in this bar graph, which groups all beans as a whole, independent of locality, but reflects Oakland’s significance as a global hub of the importation of coffee beans.

SF-AB759_ROASTE_NS_20120926202103
Coffee IMporting Countries

Let’s recall the lopsided nature of a current map of coffee indulgence that this demand reflects–

–and note the odd emptiness of the very areas where the extractive powers of coffee were so long based, if we return to the Starbucks map from which we began:  Europe, now colored a dun caffeinated color, its role obscured as the center from which the rage of coffee consumption began.

Eurocoffee.JPG

We might consider the possibility of re-mapping our daily habits of, or affection for, caffeination through this map of the network of coffee distribution and consumption, removed from most familiar geographic categories.  The schematic map is courtesy of San Jose-trained Roxanne Pasibe:

248260ada0281088a11dea8323cabc75

In the initial graphic, the beans naturally drop to the ground, and into our bags. Let’s try to map how they come to get there, and into our cups.

Coffee Heart.png

Climate Institute

5 Comments

Filed under climate change, Coffee, economic geography, Global Warming, water footprints