Tag Archives: risk maps

Surfside Ecotones

Shifting from a vision of blissed-out honeybees flying from hives to a flowering birch tree in the gardens of a city whose buildings encode “labor-intensive materiality” in each of their individual bricks, American poet Campbell McGrath conjured a distinctly modern melancholy: wonder at bees bearing pollen from cherry blossoms to hexagonal cells turns to the transience of New York, reduced to an “archipelago of memory,” as buildings of man-made bricks once “barged down the Hudson” from a hinterland of clay pits, quarries, and factories disintegrate into the Atlantic, leaving us to imagine their eventual return to silt. “It’s all going under, the entire Eastern Seaboard,” the Miami-based poet prophesied in early 2020, almost as an afterthought, with resignation or awe, imagining the new map of the nation that would emerge as the Atlantic rose. The nation’s capital secures more solid ground in Kansas City, leaving the shores of Washington, DC behind as “flooded tenements” and houseboats “moored to bank pillars along Wall Street.” As the Atlantic rises, few “will mourn for Washington,” and the once densely inhabited seaboard is all but abandoned. In this dystopic flight of fancy, what was the coast becomes a fluid ecotone bridging land and sea, and the capital shifts to western Missouri in a search for more secure ground.

The sudden collapse of half of the south tower of Champlain Towers in Surfside, FL, may be less apocalyptic in scope than the drowning of the eastern seaboard. But it is now impossible to speak of it offhandedly: we stand agape, mourning residents of the collapsed tower, trapped under the concrete rubble after the sudden pancaking of its floors. Without presuming to judge or diagnose the actual causes of the tragic collapse of a twelve-story condominium along the shore, the shock of the pancaking of the floors of an inhabited condominium raises the question of how the many structural warnings that surrounded Champlain Towers were overlooked. While retrospective attention is devoted to whether the certification process is adequate for forty-year-old weight-bearing concrete structures exposed to far more salt air and saltwater than they were ever planned to encounter, the collapse also raises questions about the increasingly anthropogenic construction of the coast. For the coast is not only an increasingly overbuilt environment, but less of a clearly mapped divide between land and sea, and not only because of sea-level rise in an age of global warming.

We have long mapped Florida by its beaches, and constructed buildings for a market that privileged the elusive and desired promise of a beach view. Yet despite the allure the state offers as a sort of mecca of beach settlement–even generating a market for vicarious live beach webcams, and refusing, in the 2020 pandemic, to close beaches that promised an engine of economic activity–we are reluctant to imagine danger signs on the beaches, or to acknowledge the instability of shoreline communities we already know to be unstable, and not only from rising sea level and surging seas. The fantasy and attraction of the beach, perhaps because of its apparent instability and otherworldly quality as an “edge” where we imagine ourselves exhilarated and released from day-to-day constraints, offers a destination, and not only for retirement homes, promising a new prospect on life.

The long distinction of the state by its beaches–its uncertain edges with the ocean–demands to be mapped and acknowledged not as a clear line between land and sea but as a permeable boundary: a complex geography vulnerable to above-ground flooding and underground saltwater incursion, to sustained exposure to salty air, to winds of increased velocity, and to the increasing instability of shores that have long attracted increased settlement. Can one view the ocean surrounding the shores not as a quiescent blue, but as engaged in redrawing the line of the shore itself, a divide long seen as a stable edge between land and sea?

From the increased intensity of hurricanes spun off warming oceans, to underground saltwater incursion, to constant beach erosion and remediation, the beaches we map as lines are coastal environments whose challenges the engineers who valued the economy and strength of concrete towers did not imagine. The combination of the influx of salty air, the erosion and replacement of beach “sand,” and the increased construction of condominiums has created an anthropogenic shore that demands to be examined less as a divide between land and sea than as a complex ecotone where salt air, eroding sand, karst, and subsoil weaknesses all intersect, in ways that mitigation strategies privileging seawalls and pumping stations ignore. As the importation of sand for Miami’s “beach” continues, have we lost sight of the increasingly ecotonal organization of Florida’s shores?

Sands from Central Florida Arrive with U.S. Army Engineers in January 2020
Matias Ocner/Miami Herald

The point of this post is to ask how we can best map shifts in the increasingly anthropogenic nature of Miami’s shores in order to come to terms with the tragedy of Champlain Towers, and to urge us to remain less quiescent in the face of the apparent rejiggering of coastal conditions by climate change, beyond the usual metrics of sea-level rise. For the collapse of Champlain Towers provides an occasion for considering how we map these shores, even as the forensic search for the immediate structural weaknesses that allowed the disaster to occur continues.

Miami Beach has the distinction of being among the lowest-lying sites in a state with the second-lowest mean elevation in the nation, and ground zero of climate change–but the drama of the recent catastrophic implosion of part of Champlain Towers became national news as it suggested the possible fragility of regions of building no longer clearly defined as on land or sea, regions that exist in complex ecotones where the codes for concrete and other building materials may well no longer apply–or, forty years ago, simply did not plan for such conditions. While we have focussed on the collapse of the towers with panic, watching suddenly ruptured apartments akin to exposed television sets of everyday Americans’ daily lives, the interruption of the sudden collapse is hard to process, but it must be situated in the opening of a new landscape of climate change that blurs the boundary between land and sea and demands the updating of building codes for all coastal communities. The old codes by which coastal condominiums were built by developers in the 1970s and 1980s hardly anticipated their being buffeted by salty coastal air, or having their foundations exposed to underground seepage and high-velocity rains: the buildings haven’t budged much, despite some sinking, but they demand to be mapped in a coastal ecotone, where their structures bear the stress of potential erosion, concrete cracking, and increased instability underground, all bringing increased dangers and vulnerabilities to the anthropogenic coast in an era of extreme climate change.

Rescue Workers in Surfside Disaster Attempt to Find Survivors in Champlain Tower South

A small beachside community bordering the Atlantic Ocean just north of Miami Beach, on a sandy peninsula between Biscayne Bay and the Atlantic, Surfside is crowded with low-rise residential condominiums. While global warming and sea-level rise are supposed to be gradual, the collapse of twelve floors of apartments–a very modest skyscraper–was immediate and crushing, happening as if without warning in the middle of the night. As we count the corpses of the tower’s residents crushed by its concrete floors, looking at the cutaway views of eerily recognizable collapsed apartments, we can’t help but imagine the contrast between the industry and care with which bees craft their hives of sturdy wax hexagons and the tragedy of the cracked concrete slab that gave way as the tower collapsed, sending multiple floors underground in a “progressive collapse,” vertically stacked concrete slabs falling on one another, the pancaking multiplying their collective impact with a force beyond the weight of the three million tons of concrete removed from the site.

This post seeks to question whether we have a sense of the agency of building on the shifting shores of Surfside and other regions: even if the building codes for working with concrete have changed–and demand change, in view of the battering even reinforced concrete takes from hurricanes, marine air, flooding, and coastal erosion and seawater incursion near beachfront properties–we need a better mapping of the relation between man-made structures and climate change, and of the new coasts we are inhabiting in an era of coastal change that goes far beyond sea-level rise.

Champlain Towers
Chandan Khanna/AFP

As we hear calls for the evacuation of other forty-year-old buildings along the Florida coast, it makes sense to ask what sort of liability and consumer protection exists for homeowners and condominium residents, who seem trapped not only in structures often improperly constructed for an era increasingly vulnerable to climate emergency, but in inadequate assurances or guarantees of protection. We count the corpses without pausing to investigate the heightened vulnerability of towers built in coastal ecotones to unforeseen dangers. Indeed, with the increased dangers of flooding–from rains, high tides, storm surges, and rising sea level–the difficulty of relying on gravity for adequate drainage has led to a large investment in pumping systems in the mid-beach and North Miami area. Civil engineers have described the sudden fall of the building as a “progressive collapse,” of the kind that occurred in lower Manhattan during the destruction of New York’s World Trade Center–the worst fear of an engineer–apparently initiated by the cracking of the structural slab of concrete under the towers’ pool, if not by other structural damage. The twelve-story building, located steps from the Atlantic Ocean, was part of the spate of condo construction that promised a new way of life when the forty-year-old building was constructed; although we don’t know what contributed to the collapse, triggered by a structural vulnerability deeper than the spalling and deterioration visible on its outside, the distributed liability of the condominium system is clearly unable to cope with whatever deep structural issues led to the south tower’s fall.

Americans hold some $3 trillion worth of property on barrier islands and coastal floodplains, according to Gilbert Gaul’s Geography of Risk, and investment in beach communities keeps expanding even as they are exposed to increased risk of flooding–risks that may no longer be so easily distributed and managed among condominium residents alone. The collapse of the forty-year-old condominium tower in Surfside led to calls for the evacuation and closure of other nearby residences, older oceanfront buildings vaunted for their close proximity to “year-round ocean breezes” and sandy beach where residents can kayak, swim, or enjoy clear waterfront. The promise extended by the entire condominium industry along the Florida coast expanded in the 1970s as a scheme of development based on the health and convenience of living just steps from the Atlantic Ocean, multiplying coastal construction over time. The tragic collapse suggests not only the limits of the condominium as a promise of collective shouldering of liabilities; it also reminds us in terrifying ways of the increased liabilities of coastal living in an age of overlapping ecotones, where the relations between shore, ocean spray, and saltwater incursion are increasingly blurred and difficult to manage in an era of climate change–even as residences such as the still-unchanged splashpage of Champlain Towers South promise easy access to inviting waters that gleam and beckon the viewer, suggesting exclusive access to a placid point of arrival that developers still hold out to attract eager customers.

Although the shore was one of the oldest forms of “commons,” in the densely built-out coastal communities around north Miami the illusion of the Atlantic meeting the Caribbean offers a hybrid of private beach views and public access points, encouraging building footprints whose foundations extend to the shores–piles driven into wetlands and sandy areas increasingly subject to saltwater incursion–promising private views of the beach the buildings directly face. The range of condominiums on offer that evoke the sea–“surfside,” “azure,” “on the ocean,” “spiaggia”–suggests the sea is itself a commodity on offer, as if beckoning residents to seize the private settlement of the coasts, in a burgeoning real estate market that has continued to develop since the late 1960s, promising a sort of bucolic resettlement that has multiplied coastal housing developments of considerable size and elevated prices. Is the promise to gain a piece of the commons of the ocean, which real estate developers have long promoted, no longer sustainable in the face of the erosion both of the sandy beaches and of the concrete towers that are increasingly vulnerable not only to winds, salt air, and underground inland flow, but to the resettlement of sands from increased projects of coastal construction?

If the collapse of a structure that boasted proximity to the beach may change the condo market and the logic of advertising “year-round ocean breezes,” the erosion of the coast and the logic of saltwater incursion in a complex ecotone–where salty air, saltwater flooding, and poor drainage increasingly challenge the stability of foundations–may shrink the expanding market for coastal condos, and lead us to question the growing liability of coastal living rather than keep investing in seawalls and beach emendation; in the face of such a sense of impending coastal collapse, investments in concrete towers on coastal properties stand revealed as castles in the sand.

Even if the architectural plans for the forty-year-old building, in an important note in the upper left, required adequate waterproofing of all exposed concrete structures, the collapse left serious questions about knowledge of the structural vulnerability of the towers, whose abundant cracking had led residents to plan for reinforcement. The danger is that ocean air sprayed by Surfside breezes increased the absorption of chloride in the concrete over forty years until it cracked, allowing corrosion of the rebar and greatly weakening the supporting columns that had borne the loads of the tower’s weight–significantly weakening the reinforced concrete. The towers had been made to the building codes of an earlier era, and engineers have suggested column failure at the bottom of the towers as one potential cause of collapse: a lack of reinforcement at the base, of the kind associated with the collapse of other mid-range concrete structures, would have altered their load-bearing capacity. The corrosion of concrete, perhaps compounded by poor waterproofing, in cast-in-place condominium towers of the 1970s–often built with insufficient concrete cover over their steel–suggests weaknesses that the weathering of concrete condominiums could recreate between columns and floors: the potential shearing of columns from the thin flat-plate slabs whose weight they bore, creating a sudden vertical collapse of the interior, with almost no lateral sway.

Courtesy Town of Surfside, FL
Champlain Towers
Chandan Khanna/AFP
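The timescale of that chloride attack can be made tangible with a back-of-envelope sketch. What follows assumes the textbook first approximation (one-dimensional diffusion under Fick’s second law with a constant surface concentration) and purely illustrative values for cover depth, diffusivity, and corrosion threshold; these are not measurements from Champlain Towers.

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def chloride_ratio(depth_m: float, t_years: float, diff_m2_s: float) -> float:
    """C(x,t)/C_surface = 1 - erf(x / (2*sqrt(D*t))): Fick's second law with a
    constant surface concentration and initially chloride-free concrete."""
    t_s = t_years * SECONDS_PER_YEAR
    return 1.0 - math.erf(depth_m / (2.0 * math.sqrt(diff_m2_s * t_s)))

COVER_M = 0.04      # assumed 40 mm of concrete cover over the rebar
D_M2_S = 1e-12      # assumed apparent chloride diffusivity for 1970s concrete
THRESHOLD = 0.25    # assumed corrosion-initiation level, as a fraction of the
                    # surface chloride concentration

for year in range(1, 101):
    if chloride_ratio(COVER_M, year, D_M2_S) >= THRESHOLD:
        print(f"Chloride at rebar depth reaches the assumed threshold "
              f"after ~{year} years of marine exposure.")
        break
```

With these assumptions the threshold is crossed in roughly two decades, well within the forty-year life of the building; halving the diffusivity or doubling the cover pushes initiation out by decades, which is why cover depth and concrete quality matter so much on a salty coast.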

Even as we struggle to commemorate those who died in the terrifying collapse of a residential building, almost a hundred and sixty of whose residents seemed trapped under the collapsed concrete ruins of twelve floors, we do so with intimations of our collective mortality, rooted more than ever in impending climate disasters that cannot be measured by any single criterion or unique cause. The modest condo seems the sort of residence in which we all might have known someone who lived, and its sudden explosive collapse, without any apparent intervention, raises pressing questions about what sort of compensation or protection might possibly exist for the residents of buildings perched on the ocean’s edge. Six floors of apartments seem to have sunk underground into the sands in which they will remain trapped, in sharp contrast to the bucolic views the condominium once boasted.

Miami Beach Coast/Alamy

While the apparent seepage into the basement and parking garage of Champlain South that ricocheted over social media does not seem to be saltwater that seeped through the sandy ground or limestone, but rather rainwater or pool water that failed to drain adequately, the concrete towers that crowd the Miami coastline, many have rightly noted, have increasingly taken a sustained atmospheric beating from overlapping ecotones of increased storms, saltwater spray, and underground incursion of saltwater. If the causes of the sudden collapse of twelve stories of concrete were no doubt multiple, vulnerability to atmospheric change aged the forty-year-old structure and accelerated problems of corrosion that demand to be mapped as part of a coastal watershed.

The bright red coasts in the map below seem to evoke a danger sign, warning viewers of heightened consumer risk, from the Gulf Coast to Portland to Florida to the northeast, as sustained exposure to corrosive salt increases risk on over-inhabited coasts, particularly for those renting or owning homes in concrete structures built for solid land but lying in subsiding areas along a sandy beach. Indeed, building codes have since 2010 been pegged to the gusts of wind structures must endure, and not only along the coast, as in this visualization of minimum standards across the state–mandating the risks coastal housing needs to withstand–with a green cross-hatched band marking regions newly required to endure 700-year gusts of wind, inland from Miami.

Wind Gusts That Residences Must Endure under Minimum Building Codes since 2010
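Why the mapped gust speeds matter structurally comes down to the fact that wind load grows with the square of velocity. Here is a minimal sketch of the bare dynamic pressure q = ½ρV² (design codes such as ASCE 7 layer exposure, directionality, and height factors on top of this), with illustrative gust speeds rather than the actual mapped values:

```python
RHO_AIR = 1.225        # kg/m^3, sea-level air density
MPH_TO_MS = 0.44704    # miles per hour -> meters per second
PA_TO_PSF = 0.020885   # pascals -> pounds per square foot

def gust_pressure_psf(gust_mph: float) -> float:
    """Bare dynamic pressure q = 1/2 * rho * V^2 for a given gust speed."""
    v = gust_mph * MPH_TO_MS
    return 0.5 * RHO_AIR * v * v * PA_TO_PSF

for mph in (120, 150, 175):   # illustrative design gusts, not Miami's code values
    print(f"{mph} mph gust -> ~{gust_pressure_psf(mph):.0f} psf of dynamic pressure")
```

Going from a 120 to a 175 mph design gust roughly doubles the pressure a wall must resist (about 37 to 78 psf here), which is why the cross-hatched bands on such maps translate directly into heavier structural demands.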

Florida received a low grade for its infrastructure as the incoming Biden administration promoted the American Jobs Plan in April–a rather grudging “C” on the nation’s report card, focussing mostly on the poor condition of highways, bridges, transit lines, internet access, and clean water. The shallow karst of the Biscayne aquifer is a huge threat to the drinking supplies of the 2.5 million residents of Miami-Dade County, but the danger to residences has been minimized, it seems, by an increasingly profitable industry of coastal building and development. While the incursion of saltwater inland remains a threat to potable water, the structural challenges of the coastal building stock have received less attention. As coastal Floridians have been obsessed with working on pumps to empty flooded roads to offshore drains, clearing sewer mains, and moving to higher ground, the anthropogenic coastal architecture of towering condominiums offering oceanfront views has been forgotten as a delicate link whose foundations and piles bear the brunt of an ecotonal crossfire of high winds, saltwater, and salty air–the crossfire that contributed to the “abundant cracking” of concrete never meant to withstand saltwater breezes, underground incursion, or the danger of coastal sinkholes in the sandy wetlands where it was poured.

It is hard to look without wincing at a visualization designed to chart the cost-effectiveness with which enhanced concrete would mitigate the damage of hurricanes and extreme weather to coastal communities.

The national map below colors much of the entire eastern and southeastern seaboard red, as a wake-up call for the national infrastructure. In no other coastal community are so many concrete structures so densely clustered as in Florida. Designed and engineered for solid land, they are buffeted by salty air on both of the state’s shores, from the Gulf of Mexico and the Atlantic; wind speeds and currents make the coast north of Miami among the saltiest in the world. High winds can deliver atmospheric salts at a rate of up to 1500 mg/meter, penetrating as far as one hundred miles inland and combining with anthropogenic urban pollutants, from emissions to construction–creating problems of corrosion of building materials as much as erosion of the beaches and coastal ecosystems threatened in Miami, which seems ground zero of sea-level rise and, as a result, of saltwater incursion, and indeed of the atmospheric incursion of salty air, carrying concentrations of chloride that are particularly corrosive to concrete.

And is the exposure of concrete structures across southern Florida to salty air destined to increase with rising sea levels–already approaching five inches, and projected to deviate even more from the historical rate along the coast–exposing anthropogenic structures from skyscrapers to residences to an increased flow of salty air?

Dr. Zhaohua Wu, FSU

We are all mourning the collapse of the Surfside, FL condominium whose concrete pillars were so cracked and crumbling as to expose rusted rebar to salt air. Built on a sandbar’s wetlands reclaimed as prime property, the town seems suddenly susceptible to structural risk akin to earthquakes, posing intimations of mortality fit for an era of climate change. The collapse of the southern tower in the early morning of June 24 poses questions of liability after the detection of cracked columns and “spalling” in foundational slabs of cement that allowed the structural rebar within to deteriorate with a rust that never sleeps. Its collapse poses unavoidable questions of liability for lost lives and for the unprecedented risk of failing to respond to concrete cracking. But it also exposes the ecotonal nature of the Florida shore, whose stability has been sustained only by the continuing illusion of a clear division between land and sea–an illusion that papers over the risk of a crumbling shore, where massive reconstruction projects on porous limestone expose much of the state to the risk of sinkholes and the sudden implosion or subsidence of the sandy shore, in a county that was predominantly marshlands, and where the inland incursion of salty air makes it one of the densest sites of inland chloride deposition–up to 8.6 kg/ha, or 860 mg per square meter–among the most corrosive conditions for the coastal construction of large reinforced concrete buildings facing seaward.

Miami building collapse: What could have caused it? - BBC News

While coastal subsidence may have played a large role in the sudden instability of the foundations that led the flat concrete slab beneath the pool to crack, and leak water into the building’s garage in the minutes before it collapsed, the question of liability for the sudden death of Surfside residents must be amply distributed. While liability can be pinned to an untimely review process, uncertainty over the distribution of repair costs among condominium residents, failures of proper waterproofing of concrete, and a slow pace of upkeep and repairs, the distributed liability raises broad questions about the governance of a coastal community. The proposed $9.1 million price of repairs to the facade, inadequate waterproofing, and pool deck was staggering, but the costs of failing to prevent a housing collapse are far higher–and stand to be a fraction of the repairs needed for buildings across Miami-Dade County over time.

The abundance of concrete towers along the coast of Miami-Dade County alone poses broad demands of hazard mitigation for which the Surfside tragedy is only the wake-up call. Experts at MIT’s Concrete Sustainability Hub (CSHub) have called for a reprioritization of storm preparation from the earliest stages of building design, and for changes in building codes that respond to the increased buffeting of coastal concrete buildings: buildings on the East and Gulf coasts, they argue, should be designed with expectations of increased damage, with mitigation beginning from the redesign of cement itself–a better understanding of the stresses of an era of climate change by which the restructuring of residential buildings could greatly improve the Florida coast, especially the hurricane-prone and salt-incursion-prone areas of Miami-Dade County, using new cement technologies and urban texture to allow buildings to sustain increased winds, flooding, and salt damage. Calculated after the flooding of Galveston, TX, the “Break-Even Mitigation Percent” argued that structural investment in enhanced concrete could provide “disaster-proof” homes–improving roof stability and insulation, and preventing water entry and saltwater corrosion in existing structures–engineering concrete that is more disaster-resistant for residential buildings in ways that would mitigate meteorological damage enough to pay for itself over time; the break-even percentage for residential buildings alone was, unsurprisingly, particularly high along the southern Florida coast.

MIT Concrete Sustainability Hub (CSHub)
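The arithmetic behind a “break-even” figure is worth spelling out: mitigation pays for itself when the discounted stream of avoided storm losses equals its up-front cost. Here is a minimal sketch of that logic, with an illustrative discount rate, horizon, and loss figures rather than CSHub’s published inputs:

```python
def break_even_mitigation_percent(construction_cost: float,
                                  avoided_annual_loss: float,
                                  discount_rate: float = 0.03,
                                  years: int = 50) -> float:
    """Share of construction cost one could spend on hazard mitigation and
    still break even, given the present value of avoided annual losses."""
    pv = avoided_annual_loss * (1 - (1 + discount_rate) ** -years) / discount_rate
    return 100.0 * pv / construction_cost

# e.g. a $300,000 coastal home whose hardening averts $3,000/year in storm damage
print(f"break-even: {break_even_mitigation_percent(300_000, 3_000):.1f}% of cost")
```

On these assumptions one could justify spending about a quarter of the construction cost on mitigation, and the percentage climbs wherever expected annual losses are high, which is why the figure peaks along the southern Florida coast.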

1. Although discussion of the causes of its untimely collapse has turned on findings of “spalling” and “abundant” cracking, and spiderwebs of continued cracks in columns and walls that exposed rebar suggest that poor waterproofing exposed the building to structural damage, engineers remind us that the calamity was multi-causal. Yet it is hard to discount the stresses of shifting ecotones of tides, salty air, and underground seepage, creating structural corrosion that was exacerbated by anthropogenic pollution. The shifting ecotones create clear surprises for a building that seems to have been planned for solid ground, but was open to structural weaknesses not only from corrosion of its structure but from sinking into the sandy limestone on which it was built–opening questions of risk that the coastal communities of the nation must be waking up to with alarm, even as residents of the second tower have not yet been evacuated, raising broad questions of homeowner and consumer risk in a real estate market that was until recently flourishing.

The apparent precarity of the pool’s foundations leads us to try to map the collapsed towers within the structural stresses the forty-year-old building faced in a terrain no longer clearly defined as a separation between land and sea, whether due to a failure of waterproofing or to hidden instabilities in its foundations. And despite continued uncertainty in identifying the causes of the collapse, in an attempt to gain purchase on questions of liability, the tower’s fall seems to reflect a zeitgeist of deep debates about certainty: might the anxiety with which we consume current debates about the origins of the collapse–errors of inspection or engineering–conceal the shaky foundations of a burst of building on an inherently unstable ecotone? While we had been contemplating mortality for the past several years, the sudden collapse of a coastal tower north of Miami seemed a wake-up call to consider multiple threats to the nation’s infrastructure. Important questions of liability and missed possibilities of prevention will be followed up, but when multiple floors of the south tower of the 1981 condominium that faced the ocean crumbled “as if a bomb went off,” under an almost full moon, we were stunned both by the sudden senseless loss of life, even after a year of contemplating mortality, and by the lack of checks or–pardon the expression–safety nets for the nation’s infrastructure.

The risks residents of the coastal condominium faced seemed to lie not only in failures of inspection and engineering, but in the ecotonal situation of the overbuilt Florida coast. Residents seemed victims of the difficulties of repairing structural compromises in concrete housing, and of a market that encouraged expanding construction out of concrete unsuited to salty air. As much as sea-level rise has been used to visualize the rising risk of coastal communities that are among the fastest-growing areas of settlement and home ownership, the liability of the Surfside condominium might be best understood by how risk inheres in an ecotone of overlapping environments–where the coast is poorly understood as a dividing line between land and water, and where risk depends on the subterranean incursion of saltwater and increasing inland exposure to salt air, both absent from maps that peg danger simply to sea-level rise. The remaining floors of the partly collapsed tower were slated for demolition, but the disaster remains terrifyingly emblematic of the risks the built world faces under the manifold pressures of climate change. While we continue to privilege sea-level rise as a basis for mapping climate change, does the sudden collapse of a building that shook like an earthquake suggest the need to better map the risks of driving piles into the sandy limestone or swampy areas of coastal regions exposed to underground seepage and open to corrosion by the dispersion of salt air?

Florida building collapse video: Surfside, FL condo disaster | Miami Herald

The search for the bodies of residents buried under the rubble of the collapsed housing continued for almost a full week, as we peered into the open apartments stopped in the course of daily life, as if we were looking at an exploded diagram rather than a collapsed building, wondering what led its foundations to suddenly give way.

Michael Reeves/Getty Images

As I’ve been increasingly concerned with sand, concrete, and the shifting borders of coastal shores, it seemed almost amazing that Florida was not a clearer focus of public attention. The striking concentration of salts the ocean deposits along the California coast already seemed a battle of attrition with the consolidation and confinement of the shores. Long before Central and Southern Florida were dredged in an attempt to build new housing and real estate, saltwater was already entering the aquifer. The Florida coast was radically reconfigured by massive projects of coastal canalization to drain lands for settlement, which rendered the region vulnerable to saltwater–risking not only the contamination of potable aquifers, but corrosive conditions for the concrete buildings clustered along the shores of Miami-Dade County, across Fort Lauderdale and Pompano Beach, as along much of Florida’s coast–both through the incursion of saltwater and through the flow of salty air, linking the determination of risk to an apparent multiplication of coastal ecotones that is often conceptualized only as sea-level rise. Even as Miami experienced a rise in sea level some six times the global rate in 2011-2015, the inundation of streets by a foot or two of saltwater from Miami to Ft. Lauderdale was probably a temporary reflection of atmospheric abnormality, or of the incursion of saltwater across the limestone and sand aquifer that lies less than two meters underground. Did the underground incursion of saltwater combine with the inland flow of salty air to pose dangers beyond tidal flooding in a “hotspot” of sea-level rise? One might begin to understand Surfside, FL as prone to a confluence of ecotones–an overlapping of saline incursion with limestone and concrete superstructure, and the deposit of wet chloride along buildings’ surfaces and foundations–an ecotonal multiplication of risk for the consumers of buildings that an expanding real estate market offered along its pristine shores.

Approximate Inland Extent of Saltwater Penetration at Base of Biscayne Aquifer, Miami-Dade County, USGS 2018

While the inland expansion of saltwater incursion has become a new facet of daily life in Miami-Dade County–where saltwater rises from sewers, reversing drainage outflow to the ocean, and permeates the land, flooding streets and leaving a saltwater smell in the air–the underground penetration of saltwater into these former marshlands has been combatted for some time as if on a military front line, trying to beat the water into retreat while repressing the extent of the areas already “lost” to the sea. If the major consequences of such saltwater intrusion are the decay and corrosion of underground infrastructure such as water and sewage pipelines, rather than the deeply set foundations of condominiums designed to sustain their loads, the incursion suggests a different temporality for the half-life of concrete structures–one that demands to be examined less in terms of the damage of saltwater incursion on building integrity than in terms of the immersion of reinforced concrete in a saline environment, exposing concrete foundations over time to a wetter and saltier environment than they were built to withstand.

As much as we have returned to issues of subsidence, saltwater incursion, and other isolated data points of potential structural weakness in the towers, the pressing question of the temporality of building survival has yet to be integrated–in part because we don’t know the vulnerabilities or stresses to which the concrete foundations of buildings perched on the seaside are exposed. The very expanse of the inland incursion of saltwater measured in 2011 suggests that the exposure of foundations to at least a decade of saltwater has not been determined, and the risks to coastal buildings and inhabitants from the increased displacement of soils–whether by saltwater incursion or by coastal construction–demand to be assessed, as does the question of how soils will continue to support the coastal structures developers have multiplied to meet demand for panoramic views of coastal beaches. The impact of possible instability on mid-sized coastal condominiums demands to be studied: the dangers of ground instability created by increased emendation of beach sand, saltwater incursion, and possible subsidence due to sinkholes all increase the vulnerabilities of the ecotonal coastline. Only by foregrounding the increased penetration of saltwater, salt air, and soil instability along the increasingly anthropogenic coast can we grasp how the ecotonal intersection of land and sea augments the risks buildings face.


Loopy Maps to Rationalize Random Shut-Offs?

The announcement in California of the arrival of random power shut-offs this fire season sent everyone scrambling to maps. Intended to stop the spread of fires, the impending public safety power shut-offs crashed websites as folks scrambled for updates in real time, frustrated by the relative opacity of maps in a hub of high-tech mapping and public data, as the possibility of power shut-offs wreaked neurological chaos on people’s bearings. From mapping fires, we transitioned to the uncertainty of mapping regions where consumers would lose power, in an attempt to prevent fires from spreading when broken limbs or branches strike live wires, or when transmission structures whose imminent collapse was feared might trigger apocalyptic fires of the scale witnessed last fire season, when the largest fire in California history raged for days, destroying property, flattening towns, and burning victims who followed GPS directions that lacked real-time information about the fires’ spread.

In an eerie mirroring of looking to maps to monitor the real-time spread of fires–which we sensed in much of California in the smoke’s acrid air–the expectation of consulting real-time updates was transferred to the availability of electricity, a mirror image unsurprising since the outages were intended to stop the fires’ spread. The decision to continue public safety power shut-offs as part of the new landscape of controlling fires in future years may need to be accepted for up to a decade, though that estimate has since been walked back to just five years.

Expanded power shut-offs justified by public safety suggest how much climate change has expanded the nature of fire risk. But despite an abundance of powerful images and video footage in the 243 televised segments on the destructive wildfires raging across northern and southern California that aired on network television–ABC, CBS, and NBC combined–a type of public disinformation seems to have been practiced by most news outlets, one that served only to prevent viewers from gaining any purchase on the fires, colored by the shifting validity of climate change denial as a position among the viewing public: only eight of those segments, or 3.3%, mentioned climate change as a factor in the fires’ spread from October 21 to November 1, even as the fires in northern California that precipitated the public power shut-offs grew. If news cycles shy away from citing climate change as a factor in the spread of fires, most of the mentions came from weather reporters, from NBC’s Al Roker to CBS’ Jeff Berardelli, who noted climate change is extending the range of fire seasons and the area of burn; the silo-ization of such explanations was rarely digested in mainline reporting. And while FOX ran 179 segments on the fires, more than the other cable networks, climate change was mentioned in only 1.7% of them, with most segments mocking its contribution.

If we are poorly served by news media that downplay climate change in reporting the fires–or indeed criticize California for poorly maintaining its forests’ safety, as President Trump did–the eerie landscapes provided by PG&E raise questions about the messages they communicate.

But the electric green maps of a startlingly unnatural aquamarine, yellow, and orange suggested a strange distantiation of the landscape in the age of climate change. The electrified hues of the maps, which monitored the possibility of customers losing electricity in many districts, reveal a level of poor management and a lack of any coherent strategy for climate change, as much as they reveal the huge area served by PG&E and the man-made infrastructure of electricity and transmission towers for which courts have rightly decided the privately owned power agency serving state residents is responsible.

Even among the most die-hard news addicts, the prospect of “public safety” power shut-offs seemed unannounced and irresponsible, and a premonition of a new landscape of risk. For the shut-offs that PG&E announced as impending reflect a deep insecurity about fires and climate change, and perhaps a collective unpreparedness to deal with implications of the climate crisis that we have not been able to acknowledge or even fully recognize–a crisis that seemed to be spinning out of control, even in the nature of the maps made of it, and to betray a lack of imagination, creativity, and foresight, abandoning the long-term view.

The sense of emergency charges a landscape whose woodland-urban interface is electrified by aging power structures and transmission lines carrying increased current to extra-urban areas. And there is a fear that the long-term view is lacking, as we continue to turn to maps, even months after the first shut-offs were announced, to forestall fears of a raging fire season. We map the expanding sense of risk in response both to demand for real-time fire maps and to the calamitous images of apocalyptic fires that dominate the news cycle, making us fear the near future and live with a deferred sense of emergency at our doorsteps. And so when we received a text message of impending loss of electricity, we turned en masse to maps to learn which outages were at risk, alerted by our local government to the need to ready ourselves as best we could.

Extreme fire prevention funding, precarious in the Trump era, stands to be abolished as the Dept. of Interior retreats from federal fire programs: the Wildland Fire Office, funded at $13 million in 2012 and slated to be abolished under an agenda of climate denial, lacks funding, undermining close scientific examination of a new topography of fires even as climate change has increased the costliness of fires and the ferocity of their spread. As the costs of the Camp Fire of 2018 grow beyond $10 billion–over six times those of the Oakland Firestorm of 1991–the cost of insurance liabilities only stands to grow. We confront poor planning of climate readiness without real strategies for extinguishing fires’ spread, imagining that temporary shut-offs can intervene as a deus ex machina: they are all we have to forestall fears of spreading flames, intense firestorms, and fire whirls.

In the Bay Area, where I live, the danger of the new firescape is so pressing, and so impossible to process, that we can only digest it as an ever-present danger, akin to living in an active seismic area–but one we cannot process in a static or dynamic map.

But this is an area of risk we are living cheek-by-jowl beside in ways that are truly unfathomable. As the power shut-off zones have been expanded in clearer detail by PG&E in response to the growing gustiness of winds that threaten to compromise the safety of residents as well as the aging electric infrastructure of the state, we are oddly haunted by past promises to maintain or upgrade our national infrastructure. The promise to rebuild national infrastructure was itself an energizing call of the Trump campaign, only to be demoted soon after being mentioned as a non-partisan issue in the State of the Union in January 2018, assigned–along with improved veteran care, the opioid epidemic, workforce retraining, and Middle East peace–to Jared Kushner, in ways tantamount to moving it to the way back burner.

And yet the spread of fires with increased rapidity across landscapes that remain highly flammable has created terrifying images of a highly combustible landscape, where recent fires–in this case the Kincade Fire, which began only after the shut-off policies had started–spread across terrain many times larger than cities, moving rapidly, driven by unprecedentedly strong offshore winds: each satellite pass overhead charts a fire’s expansion, making us dread the next pass as the real-time record of the fires’ duration only grows.

Burrito Justice/VIIRS/MODIS fire spread map/October 27, 2019

Sure, the current landscape had long seemed to be burning up at a rate we had not begun to adequately acknowledge–as Peter Aldhous promptly reminded any of us who needed reminding in Buzzfeed, providing a GIF of CalFire’s data on the areas of California that had burned since the 1950s, decade by decade, in an animation of red bursts of flame atop a black map that seemed to eerily illuminate the state by the 2000s, and to hit much of the north by the 2010s as the decade closed–illuminating fires as a state-localized crisis–

Peter Aldhous for Buzzfeed from Cal Fire and frap.fire.ca.gov

but the scope of the human-caused fires that have consumed land, property, and habitat is a truly endemic crisis in California, he showed, reflecting a parched landscape and an uptick of human-generated fires that are a direct consequence of climate change, especially in a region of increased residential construction. This sense of illumination places a huge onus on PG&E’s corporate responsibility, and on the very notion of distributing electricity and power as we once did–and illuminates the imperative to think about a new form of energy grid.

Human-Generated Fires Peter Aldhous/Buzzfeed

Indeed, the parsing of “human-caused fires” as a bucket suggests the real need to expand the classification of wildfires. Whereas most earlier fires in the western states were caused by lightning strikes, the expansion of housing and electricity into areas suffering from massive drought–as if in an eerie reflection of the spread of “slash fires” across the midwest during the expansion of railroads, which caused a rage of firestorms coinciding with World War I–presses against the category of fires as wild. The deeper question these maps provoke–as do the data of Cal Fire–is whether the term “wildfire” is appropriate for discussing the hugely increased risk of fires that damage or destroy property and land; a toy re-bucketing of fire records by cause is sketched below.
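As an illustration of what such re-classification involves, here is a sketch that re-buckets fire records into “natural” and “human-caused” ignitions by decade; the records and the cause-to-bucket mapping are hypothetical stand-ins, far coarser than CalFire’s actual cause codes:

```python
import pandas as pd

# Hypothetical records standing in for CalFire perimeter data
fires = pd.DataFrame({
    "year":  [1992, 1999, 2003, 2008, 2015, 2017, 2018],
    "cause": ["lightning", "powerline", "lightning", "campfire",
              "equipment", "powerline", "powerline"],
    "acres": [12_000, 8_500, 90_000, 30_000, 76_000, 36_000, 153_000],
})

HUMAN = {"powerline", "equipment", "campfire", "arson", "debris burning"}

def bucket(cause: str) -> str:
    """Collapse granular ignition causes into the two buckets at issue."""
    if cause == "lightning":
        return "natural"
    return "human-caused" if cause in HUMAN else "unknown"

fires["decade"] = (fires["year"] // 10) * 10
fires["bucket"] = fires["cause"].map(bucket)
print(fires.pivot_table(index="decade", columns="bucket",
                        values="acres", aggfunc="sum", fill_value=0))
```

However the buckets are drawn, the acreage that migrates out of “natural” with each refinement of the cause codes measures how poorly “wildfire” fits the phenomenon.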


Mapping Radiation Levels: Toward a Vigilante Cartography or a Model of Data-Sharing?

Few maps rely entirely on self-reported measurements today: the data-rich basis of maps makes poor controls on data an early modern throwback. But the ability to transmit datasets to the internet from local devices has changed all that. The recent proliferation of radioactivity maps is based on the open sourcing of self-reported measurements to form a new picture, placing information taken with Geiger counters into a framework analogous to a template borrowed from Google Maps. Although the Geiger counter remains the most common instrument for registering radiation’s presence, and no standards have been developed for representing rises in radiation counts in different regions–or indeed the limits of danger to personal health–the provision of such a map is crucial to disseminating any information about a potential future disaster. While the three reactors that released radiation in the Fukushima meltdown created one of the largest releases of radiation into the atmosphere, the mapping of the 300 tons of radioactive waste reported to be spewing from the reactors as they cooled into the Pacific Ocean may have slipped off the radar of daily news; on the internet, however, it may have become the greatest environmental disaster we’ve encountered, increasing the demand and need to map it, and raising questions about its relation to massive die-offs of Pacific starfish and herring.

By using the internet to upload and broadcast shifting radiation levels, maps of radiation gain a new flexibility and readability: through the platform of Google Maps they can instantaneously register ambient radiation in air, earth, water, or rainfall, as well as the radioactivity of food, in striking visualizations of geographic space. This came to a head in the maps made in response to the threats of the Fukushima Daiichi nuclear disaster of March 2011, and to the spread of radiation across the Pacific, reminding us how wind and the medium of ocean waters distributed plumes of radioactive waste over time–even as radioactive materials from the meltdown of three reactors were falsely imagined to have spread rapidly across almost all of the Pacific–

Hoax Projection of a radioactive plume from the Fukushima Daiichi Plant

–and even extending into parts of the Atlantic Ocean, in ways that generated considerable panic as a density of radioactive waste seemed to move toward the bucolic seas of Hawai’i–as if to create a sense of the terror of the contamination of a natural setting by radioactive plumes.

There was a sense that the natural disaster could be recorded in real time, reflecting the extent to which Google Maps had changed our expectations of the mappability of global phenomena, and the visibility of fears of global contamination that could be registered and recorded as a real-time apocalypse, everyone becoming their own prophet of the end times on the platform it allowed.

Earth First! (March 2012)

The news ecology itself seems to have shifted: what was undeniably an environmental disaster of potentially global import was played down in the mainstream media and by science reporting agencies, in ways that led alternative media to promulgate all the more alarmist narratives of radioactive fish off the United States, die-offs of seafood, radiation on beaches in California and Oregon, and the image of the pristine seawaters of Hawai’i as a sort of epicenter where all radioactive waste had accumulated and come to rest, as if to confirm the extent of technological disaster.

Dispersion of Fukushima radiation, 2012

The maps suggested a sense of atmospheric proximity revealed in radioactive plumes, to be sure, and generated multiple fake maps designed as fear-mongering to accentuate the proximity of radiation in the environment–one using an undated map bearing the NOAA seal to suggest the spread of something from Japan, which folks assumed was radioactive, given the climate online, although it was in fact a map measuring the effects of the March 11, 2011 tsunami provoked by the Tōhoku earthquake: wave height and the communication of wave energy across the Pacific–perhaps more of interest to surfers than to those fearing fallout.

For the explosion created huge challenges for mapping a sense of global radiological risk, far transcending the site of the explosion itself: the plume carried radiation far from the site of the disaster, even as contamination on the ground remained far more intense in relation to geographical proximity. A far broader time-lapse was required for the radioactive plume to travel by ocean currents across the Pacific–shown in circulating maps after two and a half years–based on water samples taken in 2013: concentrations of about 1 Becquerel per cubic meter, far lower than drinking-water limits, were projected to peak at 5 Bq per cubic meter in 2015-16–less radioactivity than you might eat in a banana, or experience in a dental x-ray.
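The banana comparison can be checked with rough arithmetic, using published ICRP ingestion dose coefficients for Cesium-137 and Potassium-40; the intake scenario (drinking that seawater like tap water) is of course hypothetical, chosen only to bound the risk:

```python
CS137_SV_PER_BQ = 1.3e-8   # ICRP ingestion dose coefficient for Cs-137, adults
K40_SV_PER_BQ = 6.2e-9     # ICRP ingestion dose coefficient for K-40
BANANA_K40_BQ = 15         # typical potassium-40 activity of one banana, Bq

peak_bq_per_m3 = 5         # projected 2015-16 peak concentration offshore
liters_per_day = 2         # hypothetical: drinking that water like tap water
annual_intake_bq = peak_bq_per_m3 / 1000 * liters_per_day * 365

annual_dose_usv = annual_intake_bq * CS137_SV_PER_BQ * 1e6
banana_dose_usv = BANANA_K40_BQ * K40_SV_PER_BQ * 1e6
print(f"~{annual_dose_usv:.3f} µSv/year from the water "
      f"vs ~{banana_dose_usv:.2f} µSv from one banana")
```

A year of drinking such water works out to roughly half the committed dose of a single banana, which is the entire force of the comparison.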

Discontinuities trumped continuities, however, in the levels of Cesium-134–the isotope that was the fingerprint of the Dai-ichi explosion–which confirmed the extent of the diffusion of radioactive isotopes linked to the Fukushima reactor: by 2015 it had contaminated not only Canadian salmon, as tracked at the University of Victoria, but spread across much of the Pacific Ocean, leaving an incredible intensity of the fingerprint isotope in offshore waters, perhaps recirculated in the Alaskan gyre; the radioactive plume was projected to reach American shores some 1,742 days after it was released.

If still detectable in shellfish and salmon sampling in 2015, the dispersed radiation made a delayed landfall on the Pacific coast in late 2016, much as the arrival of isotopes across the Pacific had been recorded–raising questions about the travel of Cesium by water across the Pacific.

The air dosages of radiation immediately around Fukushima Daiichi suggest a dangerous level of radiation on the mainland, however, apparently confirmed by the growth of thyroid cancer, especially in children, by birth defects, and by the retention of Cesium-134 in power station workers, who show an incidence of leukemia–

Air Dose Rates around Fukushima Daiichi

–and a rise in thyroid cancer in California that follows no distinct geographical pattern, but may be due to pesticides, agricultural contamination, or other waste.

An assembly of multiple static and dynamic maps might compose an otherwise “hidden map” of local levels of radiation, however, revealing or exposing otherwise hidden local dangers populations face from radiation leaks. The notion of a shared database for eventual emergencies, regularly updated online, suggested a way of monitoring and reacting to panic levels of dispersed radiation from a nuclear explosion, and indeed of measuring the unclear relation between proximity to a blast and the intensity of remaining radiation and radioactive dangers.

Although the measurements of danger are debated by some, mapping radiation levels provides a crucial means of confronting meltdowns, the breaching of chambers’ walls, or leaks, and of defining the limits of danger in different regions. Interestingly, the map stands in inverse relation to the usual mapping of human habitation: rather than map sites of habitation or note, it tracks or measures an invisible danger as it travels under varied environmental influences, in ways hard to predict or track. The notion of mapping such a disaster has been largely hypothetical–and remains so, to an extent, in datasets like the National Radiation Map, which use the Google Earth platform or available GIS templates to diffuse information not easily accessible. This is a huge improvement over the poor state of information at the time of the threatened rupture of the containment structure at Three Mile Island in Harrisburg, PA in 1979, when no sources had a clear idea of what radius around the plant to evacuate, or how to best address health risks: if a radius of 10 miles was chosen in 1979, the Chernobyl disaster required a radius of more than double that. The cleanup of the plant went on from 1980 to 1993; within a 10-mile radius, high radiation levels continue around Harrisburg today.

The larger zones closed around the more serious and tragic Chernobyl Nuclear Power Plant, which in fact exploded in April 1986, led to a clear Zone of Alienation, evacuated three days after the explosion, and to considerable fear of the diffusion of airborne radioactive clouds to Europe and North America. The irregular boundary of immediate contamination, including pockets of radiation hotspots not only in Belarus but in Russia and Ukraine, suggests limited knowledge of the vectors of contamination, and imprecise measurements.

Chernobyl radiation contamination map

This raised a pressing question: how does one render what resists registration or simple representation–and even consensus–on a map? And is this in any way commensurate with the sorts of risks that maps might actually try to measure?

The tragic occurrence of the 2011 Fukushima meltdown raised similar questions, but converged with a new basis for defining an internet-based map of the region. If the incident provided a case-in-point instance of ready demand for maps, widespread online access in the region led to considerable improvisation with the value of a crowd-sourced map defined not by the local government or nuclear authorities, but by the inhabitants of the region who demanded such a map. The accident that resulted from the tsunami no doubt contributed to a resurgence and perfecting of the crowd-sourced map both in the United States and, in a more flexible way, in Japan, as websites tried to refine the information carried in radiation maps: open-access maps that can quickly register the consequences of nuclear disaster–or indeed detect a leak or structural compromise–in the age of the internet, and offer a reassuring (or cautionary) image adequate to the invisible and intangible diffusion of radiation in the local or regional environment.

Demand for such online databases reveals and feeds upon deeper fears of an official failure to share such data. Indeed, the drive to create a map of some authority has dramatically grown in light of recent radiation disasters that were not mapped earlier, in part because of liability issues and fears that government protection of the nuclear industry has compromised governments’ own responsibility. If the growth of online sites is a sensible and effective use of data-sourcing on an open platform created by internet providers, it is also one no doubt fed by a paranoid streak in the American character, stoked most heavily these days by folks on the Right. I’ve decided to look at two examples of these maps below, both to reflect on the nature of a crowd-sourced map and to suggest the plusses and minuses of their use of a GIS framework to visualize data.

The emphasis on the map as a shared database and resource for monitoring and publicizing sensitive information about radiation levels has unsurprisingly grown with the recent threat of contaminated waters that breached the containing walls during the meltdown of the Fukushima Daiichi reactor in March 2011, and with the difficulties that providing a reliable map of radiation creates: although reactors are licensed by governments and monitored by government agencies, debates about the public dangers that reactors pose concern both the danger levels of radiation and the ability to collect exact data about their spatial distribution and communication through waters, air, and other environmental vectors.  The ability to upload such measurements directly to data-sharing platforms provides new access to the relatively low-cost creation of maps that can be shared online among a large group of people in regularly updated formats.  Given the low cost of accumulating a large data-set, Safecast concentrated on devising a variety of models to visualize distributions along roads or by interpolating variations in existing maps.
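
To make that last step concrete, here is a minimal sketch of the kind of interpolation such maps rely on–inverse-distance weighting over a handful of geotagged readings.  The station coordinates, values, and function are hypothetical illustrations of the technique, not Safecast’s actual pipeline:

```python
import math

# Hypothetical geotagged readings: (latitude, longitude, microsieverts/hour).
# The values are illustrative, not actual Safecast data.
readings = [
    (37.42, 141.03, 1.85),
    (37.45, 140.98, 0.92),
    (37.39, 140.90, 0.31),
]

def idw_estimate(lat, lon, readings, power=2):
    """Estimate a radiation level at (lat, lon) by inverse-distance
    weighting of nearby readings -- the simplest way to turn scattered
    point measurements into a continuous surface for a map."""
    num, den = 0.0, 0.0
    for r_lat, r_lon, dose in readings:
        # Treat degrees as planar distance; fine for a local sketch.
        d = math.hypot(lat - r_lat, lon - r_lon)
        if d < 1e-9:          # query point coincides with a station
            return dose
        w = 1.0 / d ** power  # nearer stations dominate the estimate
        num += w * dose
        den += w
    return num / den

print(round(idw_estimate(37.41, 140.97, readings), 2))
```

Real pipelines weight by proper great-circle distances and cap the neighborhood, but the principle is the same.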

The group-sourced websites showing regional and local fluctuations are not visually or cartographically inventive, but they pose questions about using data feeds to reveal a hidden topography, as it were, of radiation across the country or landscape–as if to remedy the absence of an open-access, trustworthy source of this information that local governments would sponsor or collate.  Against a field that notes the sites of reactors by standard hazard signs designating active reactors, viewers can consult fluctuating readings in circled Arabic numerals to compare the relative intensity measured at each reporting monitor station.  While rudimentary, and without adjustments or standardized measurements, this is an idea with legs: the Safecast Project proposes to take mapping radiation in the environment along a crowd-sourced model–an example of either a pluralization of radical cartography or a radical cartography that has morphed into a crowd-sourced or “vigilante” form of mapping radiation levels.

Safecast wants to create a “global sensor network” with the end of “collecting and sharing radiation measurements to empower people with data about their environments.”  Its implicit if unspoken message of “Cartography to the People!” echoes a strain of American skepticism, if not paranoia, about information access, and a fear of potential radioactive leaks.  As a counter-mapping of USGS topographic surveys, the movement to generate such composite maps on the internet is both an exciting dimension of crowd-sourced cartographical information and a potentially destabilizing moment for the authority of the map–a subversion of its authority as an image produced by a single state.

The interesting balance between authority and cartography is in a sense built into the crowd-sourced model implied by the “global sensor network” that Safecast wants to construct: while such records are not readily available on government-sponsored sites, those interested in obtaining a cartographical record of daily shifts in relative radioactive danger can take things into their own hands with a handy App.

The “National Radiation Map” at RadiationNetwork.com aims at “depicting environmental radiation levels across the USA, updated in real-time every minute.”  They boast: “This is the first web site where the average citizen (or anyone in the world) can see what radiation levels are anywhere in the USA at any time.”  As impressive are the numbers of reactors that dot the countryside, many concentrated on the US-Canadian border by the Great Lakes, in Tennessee, or by Lake Michigan.  Although the credible alert level is 100, it’s nice to think that each circle represents some guy with a Geiger counter, looking out for the greater good of his country.  The attraction of this DIY cartography–inserting measurements absent from your everyday Google Map or from the Weather Channel–is clear: self-reporting gives a picture of the true lay of the radioactive land, one could say.  This is a Jeffersonian individual responsibility of the citizen in the age of uploading one’s own GPS-determined measurements; rather than depending on surveying instruments, however, readings from one’s own counters are uploaded to the ether from coordinates that are geotagged for public consumption.
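
In spirit, each participating station contributes little more than a geotagged, timestamped count every minute.  A minimal sketch of what such a self-reported record might look like–the field names and counter model are invented for illustration, not RadiationNetwork’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class RadiationReading:
    """One self-reported data point from a home Geiger counter.
    Field names are illustrative, not the site's actual format."""
    station_id: str
    lat: float
    lon: float
    cpm: int                  # raw counts per minute from the counter
    counter_model: str        # needed later to interpret raw counts
    measured_at: datetime

reading = RadiationReading(
    station_id="harrisburg-01",      # hypothetical station name
    lat=40.27, lon=-76.88,
    cpm=22,
    counter_model="SGM-1",           # hypothetical model name
    measured_at=datetime.now(timezone.utc),
)
# A station would upload one such record every minute to the shared map.
```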

Of course, there’s little standardization of measurement here, as readings are all self-reported from different models and designs–the site lists the fifteen acceptable models–that broadcast their own data-measurements or “raw radiation counts,” which leaves the map of limited scientific reliability, with few controls.  So while the literally home-made nature of the map has elements of a paranoid conspiracy–as most any map of nuclear reactors across the country would seem to–the juxtaposition of trefoil radiation hazard signs against the bucolic green backdrop oddly renders it charmingly neutral at the same time: the reactors are less the point of the map than the radiation levels around them.
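
The standardization problem is concrete: the same ambient dose registers different counts per minute on different Geiger tubes, so any honest aggregation would need a conversion factor per counter model.  A rough sketch of that correction, using commonly cited but approximate factors (treat them as placeholders, not calibrated values):

```python
# Approximate CPM-to-microsievert/hour conversion factors differ by tube;
# these numbers are commonly cited approximations, not calibrated values.
CPM_PER_USVH = {
    "SBM-20": 175.0,    # a common Soviet-era tube in hobbyist counters
    "LND-7317": 334.0,  # a pancake tube used in some kits
}

def cpm_to_usvh(cpm: float, model: str) -> float:
    """Convert a raw count rate to an approximate ambient dose rate.
    Without knowing the tube, raw CPM from different stations simply
    aren't comparable -- the central weakness of the self-reported map."""
    try:
        return cpm / CPM_PER_USVH[model]
    except KeyError:
        raise ValueError(f"no calibration factor for counter model {model!r}")

print(round(cpm_to_usvh(35, "SBM-20"), 3))   # roughly 0.2 uSv/h
```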

USA map radioactivity

But the subject that is mapped is anything but reassuring.  When we focus on one region–the density of self-reported sites gains a finer grain in the Northeast–we can see the concentration of hazard signs noting reactors clustering, oddly, around larger inhabited areas: the ring around New Jersey, just removed from New York; the nuclear reactors in the triangle of Tennessee and Virginia; those outside Chicago and in Iowa; and a somewhat high reading near Harrisburg PA.  But it’s reassuring that a substantial number of folks were using their Geiger counters at that moment and inputting data into this potentially useful but probably also potentially paranoid site.  I hope they do interview them beforehand, given the very divergent readings at some awfully proximate sites.

NortheastUS

If we go to a similarly dense network on the West Coast, the folks at Mineralab reveal a comparably broad spread of informants, and the odd location of so many reactors alongside rivers–no doubt using their waters for cooling, but posing potential risks of downriver contamination at the same time.

PacificNW

The view of Southern California is perhaps scarier still, and reminds us that these maps have not taken the time to denote centers of population:

AmericanSW

And there’s a charming globalism to this project.  Things aren’t particularly worse in the USA in terms of reliance on reactors.  If we go to Europe, reporters stand similarly vigilant with Geiger counters at the ready, given the density of those familiar trefoil hazard signs in the local landscape:

Europe.jpg

The truly scary aspect of that map is no doubt the sheer distribution of reactors, whose hazard signs dot the countryside like so many danger-bearing windmills.  And, to put in perspective the recent tsunami that breached the walls of the Fukushima reactor and leaked radioactive waste and waters–sending material debris and leaching radioactive waters toward California’s shores–consider Japan.  An impressive range of reactors dots the countryside, and but one vigilant reporter in Sapporo notes the very low levels of radiation that reach his counter:

Japan.jpg

Within a week of the March 11, 2011 earthquake–the greatest ever to hit Japan–Safecast was born as a volunteer group dedicated to open-platform radiation monitoring in the country and worldwide.  Beyond the more than 15,880 dead in the tsunami and quake, the tsunami caused level-7 meltdowns at three reactors in the Fukushima Daiichi Nuclear Power Plant complex, necessitating the evacuation of hundreds of thousands of residents, as at least three nuclear reactors exploded due to hydrogen gas that breached outer containment buildings after the cooling system failed.  While residents dwelling within a 20 km radius of the Fukushima Daiichi plant were asked to evacuate, the United States government urged American citizens living within a radius of up to 80 km (50 mi) of the plant to do so.  This raised questions about the dispersal of radiation from the plant, and deeper questions about the safety of returning within a set zone, or the need to demarcate a no-entry zone around the closed plant.

The rapid measurement of radiation distributions not only met wide demand but produced an archive: as of July 2012, Safecast included some 3,500,000 data points recording radiation levels, and provided a new mode of sharing information about dangerous levels of radiation.  In ways that capitalize on how the internet allows a massive amount of data to be uploaded from numerous points around the world, Safecast exploits a model of data-sharing on its open platform, offering different models for visualizing readings in relation to each other: Safecast allows users to view readings against a road map, a topographic map, or a map of local population distributions, so that they can better understand their relation to the readings collated online.
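
Before any such layering can happen, millions of scattered points have to be reduced to something drawable.  A minimal sketch of that aggregation step–binning readings into small latitude/longitude grid cells and averaging each cell–is my own illustration of the general technique rather than Safecast’s code:

```python
from collections import defaultdict

def grid_average(readings, cell_deg=0.01):
    """Average readings (lat, lon, uSv/h) into square grid cells of
    roughly 0.01 degrees -- the usual first step before coloring
    cells over a base map.  Illustrative only."""
    cells = defaultdict(list)
    for lat, lon, dose in readings:
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells[key].append(dose)
    return {
        (key[0] * cell_deg, key[1] * cell_deg): sum(v) / len(v)
        for key, v in cells.items()
    }

# Hypothetical readings: two near each other share a cell, one apart.
sample = [(37.421, 141.031, 1.8), (37.423, 141.034, 2.0), (37.50, 141.10, 0.4)]
for (lat, lon), avg in sorted(grid_average(sample).items()):
    print(f"cell ({lat:.2f}, {lon:.2f}): {avg:.2f} uSv/h")
```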

The process of massing data is what makes Safecast such a pioneer: it creates a large range of readings that promise a more comprehensive picture of radiation distribution than the uneven coverage isolated readers might allow.  The Safecast team hopes and promises to improve upon its readings by designing and promoting a new Geiger counter, and has made available the handy workhorse bGeigie, although the cost of $1,000 apiece and the time-consuming nature of assembly are major obstacles they’re trying to confront.  The smaller and handier bGeigie Nano kit creates a dandy device you can easily carry or affix to your car, and whose measurements are easily uploaded to the Safecast website:

IMG_7008

The DIY glee of presenting a tool to measure radiation levels with one’s own mini-Geiger is part of the excitement with which Safecast promises to provide a new map of Japan’s safely habitable land.  The excitement also derives from a belief in the possibility of “empowering” people to measure and compile data about their environments, rather than trusting a map assembled by “experts” or official sources who have not been that forthcoming with data-measurements themselves.  The above smile also reflects the vertiginous success of Safecast in distributing its bGeigie, and the boast of having amassed an open database for public access.

This seems the new key to revealing knowledge in the multiple visualizations that Safecast offers its viewers: with the enthusiasm of great marketing, the website announces with some satisfaction, “attach it to your car and drive around collecting geo-tagged radiation data easily uploaded to Safecast via our API upload page.”  This suggests a whole other idea of a road trip, or even of a vacation, in the multiple ‘road-maps’ that volunteers have uploaded for approval on the Safecast site, with over 10,000 data points deriving from bGeigie imports, that Safecast can readily convert to a map (a sketch of what such an upload involves follows the map below):

Tokyo traffic Safequest
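
For the curious, an upload of one such geo-tagged reading to a Safecast-style open API might look roughly like the sketch below.  The endpoint and field names here are assumptions for illustration; the actual API documentation should be consulted:

```python
import requests  # third-party HTTP library

# Illustrative sketch of uploading one geotagged reading to a
# Safecast-style open API.  Endpoint and field names are assumptions.
API_URL = "https://api.safecast.org/measurements.json"

def upload_reading(api_key, lat, lon, value, captured_at):
    payload = {
        "measurement[latitude]": lat,
        "measurement[longitude]": lon,
        "measurement[value]": value,              # e.g. counts per minute
        "measurement[unit]": "cpm",
        "measurement[captured_at]": captured_at,  # ISO 8601 timestamp
    }
    resp = requests.post(API_URL, params={"api_key": api_key}, data=payload)
    resp.raise_for_status()
    return resp.json()

# upload_reading("YOUR-API-KEY", 35.68, 139.77, 32, "2012-07-01T09:00:00Z")
```

In spirit, a bGeigie drive log simply batches thousands of such records from a single trip.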

This is also quite serious stuff, taking crowd-sourced cartography to a new degree: with some 4,000,000 radiation points detected by the Safecast team, the website is able to assemble a comprehensive map of relatively uniform readings, complementing the sites of radioactivity assembled and culled by the Japanese government with independent data from an impressive range of aggregate feeds of environmental data from several NGOs and individual observers across Japan’s coast:

Aggregate data feeds

These aggregate data feeds allowed Yahoo! Japan to build its own map displaying Safecast’s static sensor data:

Yahoo Japan feeds

Kailin Kozhuharov has created a detailed map visualizing the distribution of radiation levels across Japan through the Safecast database:

Kozhuharov Visualization of Radiation Levels

The coverage is truly impressive, the multiplication of data points technically unlimited and potentially comprehensive.  While divergent readings may be entered every so often as a Geiger counter wears down or malfunctions, controls are built into the system.  Here is an example of the coverage in Japan–again the focus of mapping radioactivity in the wake of the recent Fukushima disaster, and where Safecast is based–using locally obtained data once again; a minimal sketch of one such control follows the map:

Safecast Japan
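
What might such a built-in control look like?  One simple and common approach–offered purely as an illustration of the idea, not as Safecast’s actual validation logic–flags any station whose reading strays too far from the robust consensus of its neighbors:

```python
import statistics

def flag_outliers(station_doses, k=3.0):
    """Flag stations whose reading deviates from the median of all
    stations by more than k median-absolute-deviations -- a cheap,
    robust control against worn-out or miscalibrated counters.
    Illustrative logic only."""
    doses = list(station_doses.values())
    med = statistics.median(doses)
    mad = statistics.median(abs(d - med) for d in doses) or 1e-9
    return {
        station: dose
        for station, dose in station_doses.items()
        if abs(dose - med) / mad > k
    }

# Hypothetical stations and readings in uSv/h; one counter has drifted.
stations = {"tokyo-3": 0.08, "tokyo-7": 0.11, "tokyo-9": 0.09, "tokyo-12": 4.2}
print(flag_outliers(stations))   # {'tokyo-12': 4.2}
```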

The widespread appeal of this device, even more than the Radiation Network, reveals the breadth of a belief or suspicion–no doubt with some grounds or justification–that a true map of the dangers or levels of radiation is never already provided or available to citizens, and that the failure of governments to communicate an accurate mapping of radiation demands a privatized response.  With its partnership with Keio University, developed after Fukushima, Safecast has created the “Scanning the Earth” (STE) project, which maps the historical data of radiation readings across the globe.  For the Fukushima prefecture, Safecast has also issued a comprehensive global mapping of the dispersal of high levels of radiation from Fukushima, drawn from its own massive database, to chart the impact of the environmental disaster over time:

Fukushima Prefecture World Map

Although this map reflects ties to the MIT Media Lab, it is informed by a dramatically new local awareness of the importance of creating a map flexible enough to incorporate locally uploaded data measurements for open access.  It is also a great example of how an event can create, provoke, or help to generate a new sense of how maps can process the relation of local phenomena to the global in a variety of readily viewable formats.  Since the demand for this world map clearly proceeded from the local event of the 2011 tsunami, Safecast was in a position to observe the importance of maintaining an open-sourced database (now including some 2,500,000 readings) that offers an unprecedented basis for a platform of data-sharing readily available online.  Working with the same databases, they also offer some cool visualizations of the data they collect, illustrating differential radiation levels in readable ways linked to potential dangers to individual health:

Fukushima?  Safecast

The new facility the internet has created for uploading, sharing, and compiling information from diverse and multiple sites has so lowered the cost of collaboration that it can occur without any reference or dependence on a central governmental authority.  This has allowed an immense amount of simultaneous data to be regularly uploaded and stored at almost no extra cost by a group of volunteers, and to be available in transparent ways on an open-access platform.  (Late in updating this post, I came across an earlier PBS NewsHour episode on Safecast’s interest in data-collection in the wake of the disaster, and the demand of local residents in Japan for further data, given the silence of official government sources on the disaster and its dangers.)

(The episode offers great data on using Geiger counters to detect radiation levels at multiple sites near the exclusion zone that rings the reactor, including a restaurant parking lot.)

The means of offering local contributions to a world map of radiation levels reveals an expanded ability to share information and to map the relation of place to environmental disasters.  Indeed, the map itself foregrounds new graphical forms of information-sharing.  But there are clear problems with the Safecast model, to which Japan is in fact likely to be an exception: Japan was providing online access to large numbers of its population already in 2003, offering free wi-fi in trains, airports, and cafés or tea houses.  In comparison, the far more limited share of the population with access to wi-fi or online resources in rural American towns, or even in urban areas, would make such access less possible in the United States, where a similar movement has failed to expand, and not only because of the lack of a disaster of similar proportions.  There is the danger that the “freedom of information” they champion is in the end not as openly accessible as one would wish: if roughly one quarter of hotspots worldwide are in the United States, it shared with China and Italy the lowest number of hotspots per capita as of 2007, while Japan had nearly 30 million 3G connections.  This creates a significant obstacle to the expansion of the internet as a universal-access service outside urban areas with municipal wireless networks, despite significant plans to expand internet access along interstates.  Despite plans to expand free service zones in Asia, Canada, and parts of the Americas, the broadcasting of regional variations in a natural disaster would be limited.

There may be something oddly scary in the fact that Safecast has its own corporate profile and a successful Kickstarter campaign, marking the devolution to the private sphere of the sort of public traditions of cartography formerly undertaken by states for their own populations.  For whereas we have in the past treated cartographical records as an accepted public good, there is limited acceptance of accessible data collection and synthesis.  As a result, one seems more dependent on active participation in the construction of a more accurate map of radiation levels, or upon a network of vigilant vigilante cartographers who can upload data from wi-fi zones.  Is there a risk of disenfranchising a larger population, or is data-sharing the best available option?

An alternative model for mapping radiation might be found in the compelling map of the oceanic travel of radiation (probably in contaminated waters, but also in physical debris) suggested by vividly compelling cartographical simulations of the long-term dispersal of Cesium-137 (137Cs) from waters surrounding the Fukushima reactor.  Although the map is indeed terrifyingly compelling, in relying only on oceanic currents to trace the slow-decaying tracer across the Pacific, the video map seems to arrogate capacities for measuring the dispersal of radioactive material in ocean waters over the next ten years with a degree of empiricism that it does not in fact have.  How ethical is that?

http://bcove.me/cp9fol2w

For all the beauty of the color-spectrum map of a plume of radiation expanding across ocean waters–and the value of its rhetorical impact in strikingly linking us directly to the reactor’s meltdown–its projected charting of the plume of contaminated waters due to reach the waters of the United States during 2014, if normal currents continue, is far less accurate or communicative than it would seem.  To be sure, as the Principal Investigator and oceanographer Vincent Rossi, a post-doctoral researcher at the Institute for Cross-Disciplinary Physics and Complex Systems in Spain, put it: “In 20 years’ time, we could go out, grab measurements everywhere in the Pacific and compare them to our model.”  But for now, this expanding miasma offers an eerie reminder of the threat of the widespread circulation of radioactive materials worldwide.

Simulated plume of radiation dispersal across the Pacific

Indeed, the charts that project the spread of radiation over a period of five years, benefitting from the power of computer simulations to map, by tracer images, the diffusion of radioactive discharge along clear contour lines across the Pacific, provide a compelling model of how we might want to look at and measure levels of radioactivity in our national waters.
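
Underneath such tracer charts lies a simple mechanism: a concentration field is pushed along by currents, smeared by diffusion, and slowly decays.  A toy one-dimensional sketch of that update rule–with illustrative numbers, nothing like the resolution or physics of the actual ocean model–conveys the idea:

```python
# Toy 1-D advection-diffusion of a radioactive tracer carried east by a
# mean current -- a cartoon of the mechanism behind plume simulations,
# not the actual ocean model.  All numbers are illustrative.
def step(conc, u=0.2, d=0.05, decay=1e-4):
    """Advance the tracer field one time step: advection by current u,
    diffusion d, and slow radioactive decay (137Cs half-life ~30 yr)."""
    n = len(conc)
    new = [0.0] * n
    for i in range(n):
        left = conc[i - 1] if i > 0 else 0.0
        right = conc[i + 1] if i < n - 1 else 0.0
        adv = -u * (conc[i] - left)              # upwind advection
        dif = d * (left - 2 * conc[i] + right)   # diffusion
        new[i] = max(0.0, conc[i] + adv + dif - decay * conc[i])
    return new

field = [0.0] * 40
field[0] = 100.0          # release at the "coast"
for _ in range(60):       # sixty time steps of transport
    field = step(field)
print([round(c, 1) for c in field[:10]])
```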

tracer image of radiation from Fukushima


Filed under Chernobyl, disaster maps, Fukushima, radiation maps, Vigilante Cartography