Chesapeake acidification could compound issues already facing the bay, researchers find.
For ten days across recent summers, researchers aboard the University of Delaware research vessel Hugh R. Sharp collected water samples from the mouth of the Susquehanna River to Solomons Island in a first-of-its-kind investigation. They wanted to know when and where the waters of the Chesapeake Bay were turning most acidic.
One finding: As oceans around the world absorb carbon dioxide and acidify, the changes are likely to come faster to the nation’s largest estuary.
Scientists have long studied the slow and steady acidification of the open oceans — and its negative effects. Acidifying waters can kill coral, disrupt oyster reproduction, dissolve snail shells like nails in a can of bubbly Coke.
But researchers are just beginning to investigate the consequences for the Chesapeake. And they’re finding that acidification could compound the ecological challenges already wracking the bay.
Not all of the effects are negative for all species. Experiments are showing that blue crabs, marsh grasses and algae could theoretically thrive in the conditions expected to develop over the next century. But acidification is a threat to other keystone bay species, such as oysters — a key source of food for crabs. Scientists say acidification could dramatically and unpredictably alter the delicate balances that stabilize the bay ecosystem.
With so many variables expected to affect bay creatures — including rising acidity, warming waters and continued nutrient pollution — research is complex.
“When you have three things changing at once, that’s where our challenges really increase,” said Jeff Cornwell, a research professor at the University of Maryland Center for Environmental Science in Cambridge. “All these things are intertwined.”
Water, as they teach in middle school chemistry, has a neutral pH of 7. But over the past 300 million years or so, ocean water has registered as basic, with an average pH of 8.2.
As carbon dioxide has multiplied in the atmosphere over the past century, it has also dissolved into the oceans, producing carbonic acid. That has dropped ocean pH to 8.1. The shift might seem slight, but it actually represents a 30 percent increase in acidity, because the pH scale is logarithmic.
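For readers who want the arithmetic behind that figure, here is a back-of-the-envelope illustration (a standard chemistry calculation, not a number from any one study). pH is the negative base-10 logarithm of the hydrogen ion concentration, so a drop from 8.2 to 8.1 multiplies acidity by:

$$10^{\,8.2-8.1} = 10^{0.1} \approx 1.26$$

That is an increase of roughly a quarter, which reporting commonly rounds to 30 percent; a drop of a full pH point would multiply acidity tenfold.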
The consequences, coupled with the impacts of rising ocean temperatures, could eventually be severe. Research suggests that the acidity expected by 2100 could make it harder for oysters, clams, sea urchins and corals to build their protective shells, and could even dissolve the shell of the pea-sized creature at the base of the food web known as a sea butterfly.
But that’s just in the open ocean. Most research focuses on that massive habitat, because its chemistry is largely consistent from one spot to another. In environments close to land, where ecology is more complex and active, biological processes like photosynthesis and respiration drive more volatile swings in acidity and other chemistry.
Whitman Miller is a research scientist at the Smithsonian Environmental Research Center in Edgewater.
If an acidifying ocean is like a bottle of carbonated seltzer water, he said, estuaries like the Chesapeake are similar to a beer.
“Because it’s so uniform, in some ways, we know much more about the open ocean at the global scale than we do these local scales we’re tangling with in estuaries,” he said.
Researchers around Maryland and across the country are working to bridge that knowledge gap.
Research published last month in the journal Nature Communications showed that acidification is already apparent in the bay. The team that measured acidity across the bay, led by the University of Delaware marine science professor Wei-Jun Cai, found a zone of increasing acidity at depths of about 30 to 50 feet across the Chesapeake. While surface waters hover around the pH norm of 8.2, the deeper waters registered almost one point lower — nearly ten times more acidic.
The researchers, who included Cornwell and colleagues from UMCES, believe the cause is not only the global effects of carbon dioxide emissions, but also the dead zones of low or no oxygen that have plagued the bay for decades. The zones are created when nitrogen and phosphorus runoff from farms, lawns and sewage fertilizes large algae blooms. Microbes strip oxygen from the water to decompose the blooms when they die, and release more carbon dioxide in the process.
The problem is worsened when organic matter decomposes in water that is already stripped of oxygen — the bacteria turn to other compounds in the water, producing an acidic chemical, hydrogen sulfide. Hydrogen sulfide is what makes the muck around the bay smell like rotten eggs.
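In rough textbook terms (a simplification of the bay's actual chemistry, with CH2O standing in for generic organic matter), aerobic decay consumes oxygen and releases carbon dioxide; once the oxygen is exhausted, sulfate-reducing bacteria take over:

$$\mathrm{CH_2O + O_2 \rightarrow CO_2 + H_2O}$$
$$\mathrm{2\,CH_2O + SO_4^{2-} \rightarrow 2\,HCO_3^- + H_2S}$$

The first pathway loads the deep water with carbon dioxide, which forms carbonic acid; the second produces hydrogen sulfide, itself a weak acid and the source of the rotten-egg smell. That is why the dead zones and the acidification reinforce each other.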
Cai said the processes suggest that the bay, and other waterways struggling to reduce nutrient loads, are especially vulnerable as the pH of waters around the globe declines.
“You have something we call a synergistic effect, where one plus one gives you something more than two,” he said. “There’s a very strong acidification effect.”
Miller and colleagues at the Smithsonian are exploring the consequences in a marsh meadow that looks like so many others around the Chesapeake — except this one is dotted with metal heat lamps and plexiglass chambers that help simulate the environment of the future. They call it the Global Change Research Wetland.
Scientists are conducting experiments to study the effects of increasing carbon dioxide, nutrients and temperature on the growth of sedge grasses and invasive plants, and the ability of the Rhode River marsh to grow upward to match sea level rise.
One study has been running for 30 years. Pat Megonigal, a biogeochemist and lead investigator of the research wetland, says it should be listed in the Guinness Book of World Records for the longest-running climate change experiment.
Along the creek that flows into the marsh, researchers from the Smithsonian Institution have built a gateway through which they are measuring the carbon content of water as it flows in and out twice a day with the tides. Some of it is carbon dioxide, but a portion is in the compounds carbonate and bicarbonate, which may actually help counteract acidification in the estuary.
They hope the data will help explain not only what changes acidification could bring, but also what role natural ecological processes could play in limiting them.
“It tells us something about the influence of the marsh on the chemistry of the water,” Megonigal said. “We think the net effect of water leaving the marsh is to buffer the acidity of the estuary.”
Other researchers are eagerly testing what those changes could mean for the bay’s crabs and oysters.
For her recently completed dissertation, UMCES doctoral candidate Hillary Glandon exposed blue crabs to both warmer and more acidic waters and watched their response. She found that acidification alone didn't affect them, but when it was coupled with warmer waters, crabs grew faster, molted their old shells more frequently and ate more food.
Previous research has already shown that oysters, mussels and similar shellfish could struggle in acidifying waters. They build their shells out of a compound in the water known as calcium carbonate, and scientists have found there will be fewer of those building blocks available as ocean carbon dioxide levels rise.
So Glandon’s colleagues Cornwell and Jeremy Testa are investigating what that could mean for restoration of the Chesapeake Bay’s oysters. They’re getting input from researchers in Oregon, where acidification has already challenged aquaculture efforts by killing oyster larvae. Though they don’t expect the exact same conditions in the bay, they are watching pH levels closely in places such as Harris Creek, one of three Choptank River tributaries where millions of dollars have been spent on building and seeding new reefs.
Any changes could throw off a complex food web. While crabs could be thriving in warmer and more acidic bay waters in the future, the oysters and mussels they eat could be struggling.
“Crabs don’t exist in a vacuum,” Glandon said. “If food they’re going to be eating is less abundant, there may be negative effects.”
Tom Miller, director of UMCES’ Chesapeake Biological Laboratory in Solomons, said the stakes demand that more resources be put into measuring and understanding acidification. In the same way state and federal officials have tried to limit pollution to protect crabs, oysters, marshes and underwater grasses, he said, acidification should be getting more attention in bay policy discussions.
“It has the potential to fundamentally change the pattern, the seasonality and the location of fishing in a way that the grandfathers of today’s watermen wouldn’t recognize,” he said. “We should be having those discussions now — not in 20 years or so, when it becomes, I wouldn’t say too late, but when it becomes much more contentious.”
Syrian seeds shake up Europe's plant patent regime.
Salvatore Ceccarelli knew he was engaging in a subversive act when, in 2010, he took two 20-kilo sacks of bread and durum wheat seeds from a seed bank outside Aleppo, Syria, and brought them to Italy during a visit back to his home country. Now, seven years later, those seeds from the Fertile Crescent, the birthplace of domesticated agriculture, with thousands of years of evolution behind them, are poised to challenge the system of plant patenting in Europe and, soon enough perhaps, the United States.
Ceccarelli, one of the world's foremost seed scholars and practitioners and an honorary research fellow at Bioversity International, has consulted with governments on policies to encourage biodiversity. He has also been a leading advocate of participatory plant breeding—which, as he describes it, means engaging farmers in the process of breeding new crop varieties, rather than leaving that to the rapidly consolidating group of global seed companies.
Ceccarelli arrived in Syria in 1984 and stayed for the next quarter-century as a senior breeder and researcher at the International Center for Agricultural Research in the Dry Areas (ICARDA), part of a global network of agricultural research centers founded to protect regionally evolved seeds. His specialty is wheat, barley and other cereals, bred for dry and hot climates — precisely the conditions that many of the earth's food-growing lands now face as climate change raises temperatures and disrupts precipitation patterns.
ICARDA was based in Tel Hadya, a town about 20 miles outside of Aleppo, until it was finally abandoned last year when the city became a focus of the Assad government's brutal counter-offensive against Syrian rebels, including the Islamic State. Ceccarelli was gone by the time the last Syrian scientists were forced to evacuate, but he had ensured that at least a part of the seed bank's legacy lives on in Italy. (ICARDA's work continues in Morocco and Lebanon, and a collection of its seeds is stored at the Svalbard Global Seed Vault in Norway.)
In each of those sacks Ceccarelli took from Syria were dozens of different wheat varieties. Working with a Tuscany-based NGO, the Rural Seed Network (Rete Semi Rurali, or RSR), Ceccarelli arranged to have the seeds planted with a farmer in Sicily and another in Tuscany.
The RSR and a coalition of environmental NGOs from the UK, Germany, Austria, Denmark and France went to work lobbying in Brussels to convince the European Council of Ministers – the EU body made up of member states' government ministers – to amend a key provision requiring that all seeds sold in Europe be registered as single varieties with distinct, uniform and stable characteristics. In other words, each registered variety must remain uniform and distinct from other varieties year after year, a requirement that is also a key precursor to what is often the next step – patenting. But this uniformity, the coalition argued, makes such seeds uniquely unsuited to the extreme volatility in growing conditions being wrought by climate change – a hot-button question in Italy and throughout much of Europe, which has been facing record-breaking temperatures and, in some regions including Italy, a multi-year drought.
In 2014, the coalition succeeded. The EU agreed to waive those registration requirements on four crops for what it described as “a temporary experiment … for the marketing of populations of the plant species wheat, barley, maize and oats.” For the first time, populations of seeds, evolving and changing and sharing genes in the ways that plants do naturally, could be registered for sale.
The Syrian seeds have already provided a lesson in ‘Evolution 101’ on the Italian farms. Within four growing seasons, the two populations growing in different parts of Italy showed significantly different characteristics, a live illustration of the adaptation process, Ceccarelli said.
In Sicily, which receives a fraction of the rainfall of Tuscany, the wheat is maturing several weeks earlier, and is as many as two to four inches shorter than the Tuscan varieties, which, in the more moderate and moist climate, mature later and deliver more protein per plant.
“Compare the two in the same environment,” says Ceccarelli, “and it’s day and night.” Ceccarelli argues that it’s their diversity that gives the fields the ability to adapt to new conditions. “Explain to me how a crop that is uniform and stable responds to climate change?” he says. “Today if you are a dynamic seed company you are working on varieties for 2025. For which sort of climate? How many more degrees hotter will it be? Do they know what pests and diseases will come with the new conditions? These population mixtures are extremely dynamic, the cheapest and most dynamic way to cope with climate change.”
The experience of field inspecting a diverse population was a first for the seed inspectors in from Rome, recalled Riccardo Franciolini of the RSR. “It was interesting to see their response,” he said from the group’s headquarters near Florence. “We asked them to do the opposite of what they’re used to doing. They’re used to seeing a single variety, all the same in a field. But the idea of a ‘population’ changes the vision in a profound way.”
In June, the Italian Agriculture Ministry authorized the farmer in Sicily, Giuseppe Li Rosi, to sell up to two tons per year of the seeds cultivated there; the Tuscan farmer, Rosario Floriddia, could sell up to three tons per year of the seeds he had grown. The difference reflects the different yields of each of the two distinct populations, which of course had been just one single population back in Tel Hadya, Syria. At least 100 farmers are now growing the wheat from those seeds in Italy, according to Ceccarelli. The yields may not match bushel for bushel the yields of neighboring farms — many of which require intensive synthetic chemical inputs. But, he says, they’re showing “high rates of yield stability, year in and year out, which is what farmers care about.” And the bread and pastas made with their wheat are finding a budding market.
The movement is now bigger than the fields Ceccarelli seeded. The EU directive gives each member state the right to authorize seed populations in the four designated crops. At least 20 such ‘composite cross populations’ — the technical term for them — have also received authorization from national seed authorities in the UK, Germany, Denmark and France, representing a total of about 300 to 400 tons of seed, according to Klaus Rapf, a board member and adviser to Arche Noah (Noah’s Ark), a seed-saving and research institution in Austria that was part of the coalition fighting for the change. More exact totals won’t be known until next year, said Rapf, when the EU compiles all the registrations held by national authorities, in their respective languages, and releases the Europe-wide figures to the public. The registrations come after years of research throughout Europe comparing the performance and resilience of diverse versus single-variety seed populations, including by Ceccarelli and other scientists.
The fields are now a long-running fuse that could present the first major challenge to the plant-patenting system in either Europe or the U.S., proponents believe. When the first phase of the experiment is completed, at the end of 2018, there will be an assessment of its success. The program could be expanded to other crops, sustained, or stopped.
If it continues beyond 2018, the global dimensions of the seed business suggest it would not take long for the principles to make their way into the United States, where similar research is underway. The experiment could force a reassessment of existing rules, which prioritize individual varieties. You can’t ‘patent’ a population, at least not as patents are currently defined. Populations are dynamic, changing in response to changing conditions – unlike hybrids or genetically engineered seeds, whose patenting has been the foundation for the companies that now dominate the global seed trade, and which rely on standardized regulations to export their seeds.
“What’s at stake is the very concept of ‘variety’,” said Klaus Rapf of Arche Noah. “Defining something as a ‘variety’ is an abstract concept created to defend turning a seed into a protected intellectual property, based on the notion of very high uniformity.”
Or, as Ceccarelli puts it, “We are registering and certifying something that is evolving—next year will be different. You start with one thing and you end up with another thing totally different … Yes, it is a little radical,” he said.
Mark Schapiro is an investigative journalist specializing in the environment. His latest book, Seeds of Resistance, a journey in search of the seeds we need to respond to climate chaos in our food-growing lands, will be published by Skyhorse Publishing in early 2018. He is also a Lecturer at the UC Berkeley Graduate School of Journalism.
Harvey shines a spotlight on a high-risk area of chemical plants in Texas.
Long before the storm dropped barrels of rain over one of the world’s largest industrial corridors, the area was rife with potentially dangerous chemicals.
It was 2am Texas time on Thursday when the Arkema chemical plant in Crosby caught fire and exploded. Flooded by ex-Hurricane Harvey’s torrential rains, the plant lost power and refrigeration, and soon thereafter lost control of highly flammable organic peroxides it produces for use in paints and polystyrenes. The explosion cast a 30ft plume of toxic smoke over an evacuated Crosby.
But the explosion of Arkema, as dramatic as it was, is hardly the only source of chemicals potentially dangerous to Texans.
“Houston would be the largest hub of petrochemical and refining production capacity in all of North and South America,” said Trey Hamblet, the vice-president of global research for chemical processing at Industrial Info Resources Inc, a company that tracks chemical manufacturing worldwide.
The Texas-Louisiana border is home to a melange of 840 petrochemical, refining and power plants operated by some of the world’s largest companies, according to Industrial Info. Valero, ExxonMobil and Shell Oil all operate plants here. The area is arguably one of the largest industrial corridors on earth.
Most shut down safely before and during the storm, but the contamination they cast over the area existed long before Harvey dropped barrels of rain over the low-lying area.
“In many ways Harvey is unprecedented – with the level of rain – but the difference is we had all of the information,” said Charise Johnson, a researcher for the Union of Concerned Scientists who has worked with neighborhoods in Houston that are surrounded by the same chemical plants. “We had it all. We knew how to prepare the communities. We knew how to prepare infrastructure. But that didn’t happen.”
Neighborhoods like Manchester, on Houston’s east side, reportedly stank of gas for days after the storm without explanation. Many assumed it was because a tank spilled an “unspecified” amount of its 6.3m gallon capacity.
Tightly framed by bayous, freeways and a huge refinery, Manchester has notoriously bad air quality, even by Houston’s standards. Several residents there said they did not flood and added that the air quality on Thursday was not noticeably worse than normal.
But the Texas Commission on Environmental Quality received dozens of reports of compromised infrastructure in Harvey’s wake.
The second largest oil refinery in the country, a Baytown facility belonging to ExxonMobil, had a roof collapse and released pollutants into the air. Shell reported similar incidents due to heavy rains. Two tanks holding crude oil burst into flames near a wildlife preserve outside of Port Arthur after lightning struck the Karbuhn Oil Co.
About 20 miles east of Manchester along the Texas Independence Highway, where storage tanks are decorated with murals of the Battle of San Jacinto, there was a faint acrid smell in Baytown, a city of over 75,000 people.
Waiting for his order at the Taqueria Margarita taco truck, Marco Paz, a 21-year-old student, said that for the first time, some members of his family “were having trouble breathing and [getting] headaches” that lasted about two days at the height of the flooding.
Nearby, in the almost-deserted Town Square park, Canaan McKiernan strummed his guitar while his friend, Toby Smith, played with a wooden catch-ball toy.
“The chemical plants are… a mile or two away and the whole time during the hurricane you could see the flames [flaring] from way far, even in the rain; but it’s not the only time it does that, it does that kind of regularly,” McKiernan said. “I’ve lived here my whole life so you get used to it.”
Smith, 17, recently moved back to the area and said the air quality is “horrible”, especially given the heat and humidity of south Texas. “I feel like sometimes it just gets like harder to breathe, especially if you’re running or exercising sometimes. It’s just like, I’m more out of breath than I was up in Indiana,” he said.
“There are so many people in this town that work for them,” McKiernan, a 19-year-old student, said. “Unfortunately in the state that we live in and the area we live in it’s kind of a necessary evil. I mean, if Exxon wasn’t there then this whole town would be shit.”
Environmental worries did not stop when Harvey’s record-breaking rains abated.
“Our biggest concern now that the flood water has receded is the flood water carried… a number of chemicals in the homes,” said Yvette Arellano, a researcher for Texas Environmental Justice Advocacy Services, better known as “Tejas”. “Our major priority we’re focusing on [is] cleanup.”
“We don’t even know what the complex mixtures are at this point,” said Garrett Sansom, an environmental scientist at Texas A&M University, who works in Manchester. “They’ve been dealing with areas of poor environmental conditions with the air and standing water, with heavy metals, polycyclic hydrocarbons,” he said, mentioning chemicals that come off of burnt materials.
For example, within one mile of Manchester, there are 11 generators of hazardous waste, nine major air polluters, eight stormwater discharge facilities and four factories that treat, store or dispose of hazardous waste, according to one of Sansom’s studies.
Tests of surface water in the area found the heavy metal barium in every sample. Arsenic, barium, chromium, lead and mercury were found in water two blocks from a public park.
In another study, the Union of Concerned Scientists found significantly higher cancer risks and respiratory hazards in Manchester than in other areas of Houston. Yet another study found benzene, a known carcinogen, belching out of pipelines below the ship channel near Manchester.
However, where water might have carried chemicals and metals since the neighborhood was inundated is unclear. Sansom is bringing his team to Manchester to collect samples on Friday afternoon.
“It’s usually a pretty complex mixture of chemicals that were already present in the environment and sewage and wildlife, like snakes,” said Sansom about flood waters.
Others, like Johnson, hoped the loss of life and property might bring something else – hope.
“Doing environmental work in a place like Texas – where officials deny climate change is even happening – is extremely difficult. It’s a tough road, and they’ve been fighting this fight for a long time,” she said. “All I hope now is after this tragedy maybe people will start to listen, maybe they’ll take action”.
Sewage, debris, mosquitoes: Flood waters increase health risk for Harvey victims.
Flood water – a nasty cocktail of chemicals, heavy metals, sewage, debris and wildlife – is still pouring into people’s homes.
Tropical storm Harvey continues to threaten lives in Houston, where officials are focused on evacuating hospitals and securing life-saving emergency transportation, knowing they face long-term health threats.
“Our number one priority now,” said Chris Van Deusen, a clearly frayed spokesperson for the Texas Department of State Health Services, is “to make sure hospital patients and those with medical needs are taken care of.”
Flood water – a nasty cocktail of chemicals, heavy metals, sewage, debris and wildlife – was still pouring into people’s homes on Tuesday. Social media overflowed with images of people being rescued via jet ski, canoe and fishing boat.
Twelve hospitals in the Houston area were evacuated by Tuesday. Some emergency medical services were coming back online in Corpus Christi and Victoria.
“Aside from just the general public health functions, we also help coordinate medical transportation, assisting and coordinating and evacuating hospitals,” said Van Deusen. “We have been moving ambulances, ambulance buses, and we’re staging some helicopters,” he said.
Public health researchers who focus on long-range impacts watched the catastrophe with disbelief. Before the flood, Houston already had problems. Flood waters seemed to only exacerbate potential dangers.
“Words just can’t describe it,” said Garrett Sansom, an environmental health scientist at Texas A&M University. “We’ve been trying to wrap our heads around a unified response as researchers, but also the communities we work with have been hit the hardest.”
Houston was already affected by inequality and healthcare disparities. The Manchester neighborhood in Houston is what Sansom described as a “classic environmental justice” area – a Latino neighborhood on the Houston shipping channel where petrochemical plants surround houses and most people speak Spanish.
“Barium is ubiquitous in the area because of refineries, as well as arsenic and mercury,” said Sansom. “All of that is going to be in potential of coming into contact with humans. There’s sort of the complex chemical mixture.”
Wildlife can also become a sudden danger. Standing water left after the flood recedes will be an ideal breeding ground for mosquitoes – which were already a pest in Houston.
“I can’t emphasize the vector-borne disease issue,” said Dr Gerald Parker, an associate dean at Texas A&M who served for 36 years in disaster response for the federal government. Mosquito-borne diseases, he said, are “just something I’m really concerned about”.
Zika captures the most headlines of any mosquito-spread disease, but it’s far from the only one. The same Houston-endemic mosquitoes transmit dengue and chikungunya, infections characterized by fever. Other mosquito species spread West Nile virus, which can be dangerous for the elderly and those with compromised health.
Flood waters have also delivered fire ants to front doors, and Sansom warned flooded houses can become a home for venomous snakes such as water moccasins.
Water-borne and person-to-person infections can also easily spread after a disaster. Overwhelmed sewer systems bring people into contact with disease-spreading bacteria. Stomach illnesses are common following floods, public health officials said.
“The main thing that people have to watch out for is gastrointestinal infections,” said Dr Rick Watkins, an infectious disease doctor who studied waterborne diseases following Hurricane Katrina. “Those are going to be because of the disruption of the sewage systems and, unfortunately, the drinking water is going to be contaminated.”
About 300 public water systems were in the path of tropical storm Harvey, according to the US Environmental Protection Agency. Texas environmental protection officials said it is still unclear which systems might be contaminated.
“We do not have that information yet,” said a secretary at the Texas Commission on Environmental Quality, before hanging up.
Close quarters in shelters can also create an ideal environment for the spread of illnesses such as norovirus, a stomach bug that causes severe diarrhea. Because of poor sanitary conditions, hepatitis A and hepatitis E have the potential to spread.
“It’s really unprecedented the flooding and the people displaced,” said Parker. “We still don’t have a number on that yet.”
The most important, and perhaps most difficult, recommendation from Sansom was to avoid the flood water. “Everything, unfortunately, is contaminated and has been exposed,” he said. “Returning to the home, you really need to exercise a lot [of] precautions.”
How does clean coal work?
Capturing carbon dioxide and sending it below the Earth's surface, explained.
• "Clean coal" usually means capturing carbon emissions from burning coal and storing them under the earth.
• Carbon capture and storage works, but is expensive to build or to retrofit onto old plants.
• We'd need to do a lot more capturing and storing to make a difference in battling worldwide carbon emissions.
• There are other dangers associated with burning coal, too.
For decades now, "clean coal" has been a political pipe dream. It's the idea that coal—our oldest, dirtiest energy source—could be reshaped in a way that lets us keep using it without doing so much harm to the environment. But what is it, exactly?
The term "clean coal" has been applied to many technologies, ranging from wet scrubbers, which remove sulfur dioxide from coal-generated gas, to coal washing, which removes soil and rock from coal before it's sent to a factory. Hypothetically, the term could be applied to anything that makes coal plants more efficient, like digitization. However, when people talk about clean coal these days, they're typically talking about something called carbon capture and storage (CCS).
CCS technology has been around since the 1980s. While the other technologies mentioned above cut down on sulfur dioxide and coal ash (which are important), CCS is meant to handle the big environmental nightmare, the heat-trapping gas largely responsible for global warming, carbon dioxide (CO2).
Coal plants today typically use what is called pulverized coal. That means the coal is ground up and burned, and the heat from that burning produces steam that drives turbines. There's also newer integrated gasification combined cycle (IGCC) technology, which converts the coal into a gas, sends that gas through a combustion turbine to generate electricity, and then routes excess heat from that process to generate even more electricity through a traditional steam turbine.
In either process there are multiple points at which CCS technology could intervene. One such point is called pre-combustion. At this stage, an air separation unit produces a stream of almost-pure oxygen, which flows into a coal gasifier. Gasifiers are essentially tanks that produce synthetic gas mixtures known as syngas. The oxygen in this coal gasifier reacts with fuel to create a syngas made up of hydrogen, carbon monoxide, water, and CO2. (This form of syngas is nothing new: invented in the 1790s by William Murdoch, in the 19th century it was used to power gas lights in many towns and gained the nickname "town gas.")
CCS technology sends the syngas to a shift reactor, where it encounters steam. That steam transforms the carbon monoxide that's present into hydrogen and even more CO2. The CO2 is then captured from the gas stream, compressed, and dehydrated. That leaves it ready for transport and storage.
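The chemistry inside the shift reactor is the standard water-gas shift reaction (textbook chemistry, not specific to any one plant design):

$$\mathrm{CO + H_2O \rightarrow CO_2 + H_2}$$

The effect is to concentrate the fuel's carbon into a single CO2 stream while leaving behind hydrogen, which burns without producing carbon dioxide. The CO2-rich stream is then ready for the transport-and-storage step described next.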
"Transport and storage," when applied to CCS technology, basically means sending the CO2 in a pipeline several kilometers below the earth and into rock, the idea being that it's stored there rather than released into the atmosphere to contribute to climate change. Ideal spots for this include old oil and gas fields, which have already dug into the earth, but any deep saline formation, filled with porous rock and salty water, will do. Ships could also send the CO2 to refineries in the ocean. The idea is that the CO2 stays there for millions of years and eventually chemically binds with the surrounding rock.
The technology can also be used post-combustion and with oxygen-based fuel. Post-combustion capture passes the flue gas through a solvent that binds the CO2; the gas is later driven off the solvent and sent to storage. Oxygen-based CCS burns the coal in nearly pure oxygen, yielding a flue gas concentrated enough in CO2 to capture directly. All three work—CCS has been praised by everyone from the Intergovernmental Panel on Climate Change to the International Energy Agency. The Obama Administration invested $84 million in the technology, and in a rare act of bipartisanship, one of those investments has been trumpeted by current Energy Secretary Rick Perry. That plant, Petra Nova, is the world's largest post-combustion carbon capture facility; it sits about 30 miles southwest of Houston, where it captures 1.6 million tons of carbon dioxide each year.
The question is mostly one of cost and efficiency. CCS plants are expensive to build and maintain, and retrofitting the technology onto older plants consumes some of the plant's power output and raises costs. That's why there are only 21 CCS plants across the globe, according to the Carbon Capture & Storage Association. That number goes up to 38 if you include projects under construction or in developmental planning. A report from the Global CCS Institute estimates it could cost "$100 billion annually" to develop CCS, and that the technology represents "a classic catch-22 scenario. The only way costs can decrease is by installing a large number of CCS projects worldwide. However, the high cost of CCS is challenging project development."
The technology itself has its critics, too. While CCS can effectively capture around 90 percent of the CO2 produced at power plants, some people point to the fact that coal has so many pollutants that no singular technology can capture all of them. They point to mercury, nitrogen oxide, and other poisonous contaminants that coal plants could still produce even if they're not pumping out CO2. The IEA estimates that the global storage rate could exceed 7 gigatons of carbon dioxide per year, which works out to a staggering 15 trillion pounds.
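That pound figure is straightforward unit conversion (one metric ton is about 2,205 pounds):

$$7\ \mathrm{Gt} = 7 \times 10^{9}\ \mathrm{t} \times 2{,}205\ \mathrm{lb/t} \approx 1.5 \times 10^{13}\ \mathrm{lb}$$

or roughly 15 trillion pounds per year.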
Still, CCS offers a viable option for today's coal plants: even as global demand for fossil fuels shifts, every day the industry exists is one where it could be polluting less. The IEA "has found that the world needs to capture and store almost 4,000 million tonnes per annum (MTPA) of CO2 in 2040 to meet" a scenario where the Earth's temperature rises only 2 degrees C, according to last year's Global Status of CCS report. "Current carbon capture capacity for projects in operation or under construction sits at approximately 40 MTPA," a paltry figure in comparison.
"The numbers speak for themselves," the report says. "It means that there is a lot of ground to make up."
Washington state officials troubled by oilpatch secrets.
By Stanley Tromp | August 14, 2017
A panoramic view of Burnaby Mountain, with Kinder Morgan’s oil storage tanks on its southern slope. Across the border, Washington State officials have questions about the makeup of tar sands oil and the diluent that makes it mobile. Photo by Zack Embree
Washington State officials have privately complained about a lack of information — vital for an oil spill response — on the ingredients of the diluent used to help Alberta bitumen flow through Kinder Morgan’s Trans Mountain oil pipeline.
The data is crucial for spill response planning as the company proceeds with a proposed $7.4 billion Trans Mountain pipeline expansion that would triple the daily flow between Edmonton, Alberta and Burnaby, B.C. to 890,000 barrels. From the company’s Burnaby site, the oil would be shipped to Asian markets in tankers through Vancouver Harbour and then through the waters of the Juan de Fuca Strait shared by British Columbia and Washington State.
The pipeline company has suggested in responses to National Observer that it has been transparent enough, publishing a list of 52 products that Transport Canada has approved for the pipeline, as well as components listed on crudemonitor.ca for various types of oil. It has told Canada’s National Energy Board (NEB) it would quickly disclose ingredients in the event of a spill.
Yet officials in Washington State’s Department of Natural Resources voiced grave doubts in internal memos dated January 2017. “What is frustrating is ... tar sand oil manufacturers’ lack of transparency on what is used for diluents and those diluent properties, which in my mind (alludes) to dishonesty,” wrote the state’s oil spill response coordinator.
The memos were obtained by National Observer through a request under the state’s freedom-of-information law, asking about the potential environmental impacts of the Trans Mountain project. The officials who wrote the memos did not respond to requests for comment on Kinder Morgan’s responses to National Observer.
Bitumen petroleum is too thick to flow in a pipeline at ground temperature, so it needs to be thinned with a light, volatile petroleum product called diluent. In general, diluents are either mixtures of light hydrocarbons, synthetic crude oil, or both. Typically, diluted bitumen (or dilbit) is 70 to 80 percent bitumen, and 20 to 30 percent diluent.
Canada’s spill response regime “a couple of decades behind”
The federal government supports the Trans Mountain expansion and has pledged a new “world class” spill response regime. B.C.’s New Democratic Party premier John Horgan has vowed to block Trans Mountain through “every means at his disposal” with his Green Party partners.
Despite the provincial government's opposition, Trans Mountain spokesperson Ali Hounsell told National Observer construction is proceeding this fall and “we congratulate” Horgan on his election win.
South of the border, worries date back to at least 2004, preceding Kinder Morgan’s expansion plan, when a study by the Washington State Department of Ecology concluded that a major oil spill would cost the state 165,000 jobs and $10.8 billion in economic impacts.
State ecology officials in the spill response section wrote to the Washington State governor in 2013 that, “B.C. lacks authority over marine waters, and their federal regime is probably a couple of decades behind the system currently in place in Washington State. When it is spilled, we are concerned that dilbit oil may be considerably more toxic and damaging, and far more difficult to clean up, than conventional crude from Alaska.”
Hounsell, of Trans Mountain, told National Observer that detailed investigations by government researchers, academics and industry have found dilbit just as safe to transport as other types of crude oil. She cited a company fact sheet called, Mythbusting: The Three Most Common Misconceptions About Diluted Bitumen.
The January 2017 internal memos from Washington State Natural Resources officials express a very different view.
“There is definitely a need for full disclosure and transparency regarding the products used to create bitumen and other crude oil diluents and their properties,” wrote the department’s habitat stewardship specialist. “Policy makers have a long way to go to require (let alone enforce) adequate mitigation.”
The Washington State oil spill response coordinator expressed acceptance of the need for oil in the current economy but added that “without unbiased research” governments cannot have an honest debate on many questions. Among the questions: "how fast the diluent will evaporate in real life conditions, how explosive is the air in an oil spill due to properties of diluent and how does this affect a response, how soon will the oil sink, how well will sinking oil be addressed if at all, how will sunken oil be tracked, what will be the impact of that oil on ecosystem(s), how will it be monitored and recorded, and how will we gauge mitigation plans proposed to repair or at least compensate for damage?"
The documents connect the spill questions to the 2010 British Petroleum oil spill in the Gulf of Mexico. The same oil spill response official asks, “Without these questions being answered to an exhaustive degree, how can the public be asked to accept these risks? ... How can we honestly say that we are ‘prepared’?
"The times of oil companies asking the public to trust them are over, as we still seek to understand the full implications of the BP oil spill.”
The official said those who propose "oil handling facilities" need to be held to "the highest bar": “The interest in all these projects will soon be invigorated and there will most likely be a push to build and seek answers later. We need to work together to hold proponents to answer these difficult questions before we concede to risks.”
Companies may hide behind commercial secrecy, officials said
The same official again raised the problem of commercial secrecy to a colleague:
“Decision makers do not have, nor seek, the level of information needed to make decisions based on cumulative impact review. Those that propose these projects provide the bare minimum and hide behind proprietary protection measures to keep discussion vague. Stovepiping allows for high-impact projects to move through review process even when clear significant harm is forecasted.” Stovepiping is a term used to describe the presentation of raw information without proper context.
Hounsell indicated Trans Mountain is transparent. “As you can see on the crudemonitor.ca website, there is a full listing of components for each type of synthetic, light and heavy crude products, and each varies.” When National Observer asked for a list of diluent ingredients to be used in the Trans Mountain pipeline, Hounsell sent a link to a list on its website of the 52 products the government approved for the pipeline. These range from “super lights” such as regular gasoline and Peace River Condensate to diluted bitumen products such as Borealis Heavy Crude, Access Western Blend, Cold Lake Blend, and Seal Heavy, each with their own formula.
A spokesperson for Natural Resources Canada said data on the chemical properties of dilbit can be found in the public domain by Googling the Material Safety Data Sheet (MSDS) for diluted bitumen, and by reading chapter eight of the National Energy Board’s 553-page report on the Trans Mountain project. He also cited www.crudemonitor.ca.
“I think it's fair to say Kinder Morgan is being transparent enough with the products they will be shipping,” said Peter McCartney, climate campaigner for the Wilderness Committee, referring to B.C. cities with the same concerns about Trans Mountain. “Where cities may be running into trouble it might be that they are unable to figure out exactly what's coming through at any time.”
Trans Mountain disclosures incomplete, say environmental groups
Other Canadian environmental groups said disclosures about Trans Mountain product are incomplete.
“Crudemonitor does not separate out the constituents - so Cold Lake Blend, which is a diluted bitumen, does not have one content list for bitumen and one for diluent - it's just a single list for dilbit,” said Kate Logan, an independent toxicologist with the Raincoast Conservation Foundation. “Diluent acts differently from crude oil. For one thing, it weathers off quite fast because it’s so light.”
Keith Stewart, head of Greenpeace’s climate and energy campaign, said in an email that “Kinder Morgan is obscuring the issue by saying that there are lists of things that could be in dilbit, but avoiding the issue of what precisely is in any given batch (and it will vary, depending on price and availability).”
Stewart added that concerns are about the polycyclic aromatic hydrocarbons, or PAHs, in dilbit, along with volatile aromatics such as benzene (a carcinogen), toluene, ethyl benzene and xylene – and that it appears all dilbit would contain these compounds to varying levels. (Naphtha, butane, hexane, hydrogen sulfide, sulfur and nitrogen have also been noted in dilbit.) “It’s tough to say exactly what will be in the pipeline at any given time, which may be the cities’ frustration.”
In its Gainford study, Trans Mountain provided PAH levels for two dilbit blends — Cold Lake and Access Western — and this data was summarized in the company report to the NEB. Logan counters that this data is inadequate because “these are just two of the dozens of products the pipeline is approved to carry, and again, there is no separate information for diluent and bitumen.”
Should trade secrets trump public health and safety?
The Washington State officials’ e-mails obtained by National Observer include a link to a recent study of the impacts on oceans of spilled bitumen and diluents.
“Even if we know the diluent contents, the quantity and formulas are still largely private,” said lawyer Eugene Kung of West Coast Environmental Law. “This is a big problem with fracking also, where companies claim their information is proprietary. There is certainly a place for trade secrets, but to what extent does that trump public health and safety? And should governments override that?”
“I do think that more transparency on the exact constituents of dilbit would help (perhaps a range for each constituent), but to me all of the diluent is dangerous, for it's all flammable and volatile, even though the exact amount of benzene in it may vary,” said Angela Brooks-Wilson, a physiology professor at Simon Fraser University. “To have the properties that make it a thinner for the bitumen, the diluent must be made up of smaller molecules, which because they are smaller give it the properties of volatility and flammability.”
Trans Mountain did not respond to followup questions on the specific contents of diluent as separate from bitumen, and the lack of PAH data for diluents.
The NEB’s final report on Trans Mountain noted that Environment Canada recommended that Kinder Morgan commit to providing spill responders “a specific suite of test data for all types of hydrocarbon products to be shipped” — before shipping — to help them plan a good response.
The NEB overrode that advice because the company had committed to give those parties “timely information on the physical and chemical characteristics of any product spilled” and it trains its own and external workers in spill response. The company also declined Environment Canada’s advice, because it was awaiting more research on the behavior of dilbit in water.
Specific diluent content typically not available to public
A 2015 U.S. National Academy of Sciences (NAS) report called Spills of Diluted Bitumen from Pipelines said specific content of diluents is typically not publicly available. "The individual selection of diluents varies depending on the desired outcome, the current cost of acquiring and transporting the diluent to the bitumen source, and other internal considerations of pipeline operators."
Does diluted bitumen differ importantly from other crude oils? In its submission to the NEB, Trans Mountain said that dilbit is “a stable, homogenous mixture that behaves similar to other natural crude oils when exposed to similar conditions and undergoes a weathering process."
By contrast, the National Academy of Sciences study said that many dilbit properties “are found to differ substantially from the other crude oils,” the key differences being the high density, viscosity and adhesion properties of the bitumen portion that affects how oil behaves in water under various weather conditions.
Trans Mountain told the NEB that the diluent and bitumen of dilbit should be considered as one blended product, not separately. But the NAS study disagrees, saying that after a spill, weather conditions alter the dilbit and “the net effect is a reversion toward properties of the initial bitumen.”
Governments have raised concerns about diluted bitumen secrecy before.
In July 2010 a pipeline operated by Enbridge burst and spilled over three million litres of diluted bitumen into the Kalamazoo River in Michigan. After several days, the volatile hydrocarbon diluents evaporated, leaving the heavier bitumen to sink in the water. The spill cost over $1.2 billion to clean up, with heavy environmental impacts. (On the website cited above, Trans Mountain calls it a “myth” that dilbit would sink in B.C. waters.)
Nine days before the Kalamazoo accident, the U.S. Environmental Protection Agency had warned that the proprietary nature of the diluent found in dilbit could complicate cleanup efforts. (The agency was commenting on the proposed Keystone XL dilbit pipeline.)
“Without more information on the chemical characteristics of the diluent or the synthetic crude, it is difficult to determine the fate and transport of any spilled oil in the aquatic environment,” EPA officials wrote. "For example, the chemical nature of diluent may have significant implications for response as it may negatively impact the efficacy of traditional floating oil spill response equipment or response strategies.”
At the NEB hearings on Trans Mountain's proposal, the City of Vancouver and others asserted that the evaporation of diluents, especially benzene, from a dilbit spill would be a health risk to spill responders. Kinder Morgan denied that in its 440-page final submission to the NEB, writing that critics supplied “misstated and misleading estimates about vapour concentrations (specifically, benzene) that are available for evaporation that may be encountered by people in the area.”
The B.C. NDP government declined to comment for now. B.C. Green Party leader Andrew Weaver said, “I concur with the Washington State memos that state that spill responders cannot adequately respond to a spill without knowing the ingredients or formula of the bitumen and diluents.”
Editor's note: This article was updated at 10:50 p.m. ET to correct that diluted bitumen was spilled into the Kalamazoo River following the 2010 Enbridge pipeline rupture.
My depressing summers in Belize.
I spend the hot months in the water, studying ocean ecosystems. What I see happening to our coral reefs is deeply alarming.
By John Bruno | July 6, 2017
Entrance to the famous Great Blue Hole, part of the Belize Barrier Reef Reserve System. Credit Michele Westmorland/Corbis, via Getty Images
When summer arrives, my friends and family inevitably roll their eyes when I tell them I’m packing for my fieldwork in the Caribbean. They picture a book and a white-sand beach. I do get a tan. But it’s no vacation.
I study ocean ecosystems. The work is chronically underfunded, so food and housing are basic or worse. When we’re in Belize monitoring the health of coral reefs, about half the nights we sleep under the stars on a dock. When I can afford a roach- and gecko-infested room, it’s often so rustic that it’s preferable to sleep outside.
There are also the tropical diseases we acquire (dengue, for instance), the insects that lay eggs under our skin (bot flies), stinging jellyfish, scorpions hiding in our shoes and, of course, feisty sea turtles (on one trip an enormous loggerhead turtle bit one of my graduate students on the rear). It’s also physical work, made harder by the intense heat and humidity. One former undergrad in my lab was in the National Guard. After she was deployed to Kuwait, she emailed us to say that the assignment was easier than fieldwork with us.
The Belize Barrier Reef. Credit Pete Oxford/Minden Pictures, via Getty Images
Still, I love all of it. One of the big rewards is the wonders you stumble into by just spending so much time in nature, the kind of things you see in BBC documentaries narrated by David Attenborough. Last summer I woke up in the middle of the night, looked over the dock and saw a dozen spotted eagle rays slowly circling beneath me. It looked like a mobile you’d hang over a baby’s crib. We’ve also come across mating leatherback turtles (awesome, but not so sexy), orcas and manta rays in the Galápagos Islands, a huge tiger shark in Moorea and fields of tiny eels peeking out of their holes on the sandy seafloor in Palau.
Like many of my peers, I’ve walked away from the type of purely basic academic science I was trained to do to focus on trying to understand and slow the rapid changes underway in ocean ecosystems. My team has been working on determining whether protection from fishing and pollution in well-policed marine reserves can moderate or reverse the loss of Caribbean corals, the small invertebrate animals that build up reefs over thousands of years.
Since 2009 we’ve been annually surveying 16 reefs across the Belizean Barrier Reef, half of which are inside a protected reserve. We typically survey two reefs a day, filming the seafloor with video cameras and counting and identifying every fish in 100-foot-long bands.
Unfortunately, we’ve found local conservation is ineffective in stopping coral loss. Dozens of other studies around the world have reported the same finding. The most striking example is probably mass bleaching and coral mortality on Australia’s Great Barrier Reef in 2016 and again this year. This well-protected reef, relatively isolated from human activities, is nevertheless susceptible to global warming. I was a co-author of a paper last year that found (to my surprise) that the world’s most isolated reefs were no healthier than those adjacent to coastal cities. Even the most remote marine ecosystems in the Central Pacific and the North Atlantic and around Antarctica are being radically altered as oceans warm and become more acidic.
A diver observing bleached coral at Heron Island on Australia’s Great Barrier Reef last year. Credit Xl Catlin Seaview Survey/Agence France-Presse — Getty Images
The Caribbean has warmed by about two degrees Fahrenheit during my lifetime. Carbon dioxide and other greenhouse gases act as a sort of blanket around the earth, trapping heat that would otherwise be lost to space. Incredibly, 94 percent of this extra heat is going into the oceans, and it’s not just coral reefs that are being affected. Thousands of species are rapidly migrating away from the Equator, trying to stay cool. This is creating new mixtures of plants and animals that are interacting in new and unpredictable ways.
Our goal as scientists isn’t to save only endangered invertebrates like coral but to preserve the reefs that hundreds of millions of people depend on. Food, jobs, tourism revenue, recreation and buffers from coastal storms are just some of the value coastal communities get from healthy reefs.
I grew up in South Florida in the 1970s, when the reefs of the Florida Keys were still relatively healthy. Snorkeling just a foot or two above acres of golden elkhorn corals was like flying over fields of wheat. That is what inspired me to spend my life learning and teaching about the oceans. I was about 10 years old then.
By the time I graduated from high school, most of that coral splendor was gone. A disease linked to ocean warming wiped out about 99 percent of elkhorn coral colonies across the entire Caribbean — literally hundreds of millions of corals disappeared in a matter of months. This species and closely related staghorn corals had dominated Caribbean coral reefs for at least 5,000 years.
Things aren’t getting any better. A few days ago, a colleague, Bill Precht, a coral reef scientist with an environmental consulting firm, sent me a note describing what he saw on a recent dive at Florida Keys National Marine Sanctuary. It’s typical of my summer correspondence from fellow scientists. Depressing.
“This reef is a coral graveyard,” he wrote. “Lots of recently dead colonies now covered with a thin veil of sediment and turf algae.”
So what can be done to protect corals and other marine animals from ocean warming? The obvious solution is to switch to solar and wind energy, now a cheaper source of electricity than coal. Although our economy is already making this shift, it’s happening too slowly to avoid catastrophic warming. A revenue-neutral carbon tax is one effective mechanism to promote renewable energy sources. This solution has been championed by a bipartisan patchwork that includes the former NASA scientist James Hansen; the Republican elders James A. Baker, George P. Shultz and Henry Paulson; and my dad.
Despite all the loss and the looming threats, there is still so much left to conserve. Like the amazingly healthy Orbicella coral reefs I saw in the crystal-clear waters of the Bay of Pigs, Cuba, a few years ago, and the staghorn coral reefs within swimming distance of the beachfront hotels of Fort Lauderdale that are now threatened by an Army Corps of Engineers dredging project. There are also a few reefs at higher latitudes or in other lucky locations that are warming much more slowly and could hold out for decades or centuries.
I really don’t know how this will all turn out. Corals and other creatures could adapt to their changing environments. People could radically reduce their carbon emissions. Yet both outcomes are unlikely, and reality is draining my ocean optimism. It isn’t too late, but we need to act very soon.
John Bruno is a marine ecologist at the University of North Carolina, Chapel Hill.