Monsanto’s driverless car: Is gene editing driving seed consolidation?
Gene editing technology is being heralded as a game-changer, but it raises serious questions as five of the Big Six agriculture and chemical companies seek to merge.
BY TWILIGHT GREENAWAY | Business, Food Policy, GMOs
04.10.17
When the CEOs of both Monsanto and Bayer met with Donald Trump to talk about their potential merger just three days before the inauguration, they made some big promises. If the union between the world’s largest seed company and the German multinational chemical, pharmaceutical, and life-sciences company is blessed by antitrust regulators, the companies have pledged to add 3,000 high-tech American jobs and to combine—rather than consolidate and trim—their R&D budgets to the tune of $16 billion over the next six years, or $2.7 billion a year.
The two companies have been locked in a dance since May 2016, when Monsanto rejected Bayer’s initial $62 billion offer. Then, last fall, the merger reappeared in the news in a noteworthy chain of events.
On September 14, Bayer upped its offer to $66 billion and Monsanto accepted, putting a third major seed company merger on the table, beside ChemChina’s $43 billion takeover of Syngenta and Dow Chemical’s intended merger with DuPont. On the day it was announced, the Washington Post called the Bayer-Monsanto deal the “mega-deal that could reshape [the] world’s food supply.”
Less than a week later, spokespeople for the companies behind all three mergers were asked to testify before the Senate Judiciary Committee, on what Senator Chuck Grassley (R-Iowa) called a “merger tsunami.” Then, just two days later, Monsanto announced it had licensed the rights to use CRISPR/Cas9 gene editing—a technology that has been called the “Model T of genetics” for its power to change the way we live.
This rapid-fire timing may have been a coincidence, but it also may be a sign of what’s to come. And it’s just one of many indications that CRISPR/Cas9 and other next-generation gene editing technologies will likely be at the forefront of the seed industry in the years ahead. Some even see gene editing, which is said to be simpler, less expensive, and more consumer-friendly than traditional genetic engineering, as one factor driving the mergers. And while that’s up for debate, it’s clearly an important part of the strategy for companies looking to control, and profit from, the world’s seeds.
Last week the European Union cleared the way for the ChemChina–Syngenta takeover, suggesting the other two mergers may be imminent. If that happens, the resulting three companies would control nearly 60 percent of global seedstocks (including as much as 80 percent of U.S. corn seeds) and 70 percent of the global pesticide market. And these companies are also making a bid to control much more than seeds and pesticides. Monsanto, for example, is already making a play to control many other facets of modern agriculture—including tools for precision planting and high-tech weather prediction.
So while much of the media coverage of gene editing has pointed to its potential to break molds and change the genetic playing field, in agriculture it will likely follow a more familiar path: scientists will most likely use CRISPR and similar technologies mainly to continue developing seeds that withstand consistent doses of pesticides and herbicides on large, industrialized farms.
“Monsanto has been conducting research with genome-editing techniques for years, and we are excited to be integrating additional technology from licensing partners into this body of work,” Tom Adams, biotechnology lead for Monsanto, said in an email. Over the past year, the company has announced several licensing agreements that will allow it to access gene-editing technologies, such as CRISPR/Cas9 and CRISPR-Cpf1 (which is said to be more precise), as well as a tool from Dow AgroScience called EXZACT™ Precision Technology® Platform, among others.
Although Adams said this work is still in its early days, he added, “we believe that genome-editing techniques have great potential to improve and unlock capabilities across our leading germplasm and genome libraries to enable a wide variety of improvements across crop systems.” However, he added, “We do not view it as a replacement for plant biotechnology.”
SEED: The Untold Story will premiere on the PBS series Independent Lens on April 17, 2017 at 10PM.
Gene Editing vs. Genetic Engineering
Monsanto has released a series of genetically engineered herbicide-resistant seeds, beginning with Roundup Ready soybeans in 1996 and moving on to corn, cotton, sugar beets, canola, and more. Today, Roundup Ready crops account for over 94 percent of the soybeans and 89 percent of the corn grown in the United States.
As these products have come to dominate the farm landscape, weeds have also become resistant to Roundup. According to the Weed Science Society of America, “overreliance on herbicides with a single mechanism of action to control certain weeds has led to the selection of weeds resistant to that mechanism of action.” Similarly, incidents of pesticide resistance have also been on the rise.
As a result, farmers often find themselves on what critics call a pesticide treadmill, where each new form of resistance requires a more powerful solution. Companies have spent the last several years working on pesticides and herbicides with “stacked traits,” and the seeds that are resistant to them. Gene stacking involves combining two or more genes of interest into a single plant. In the case of Monsanto, that has meant, for instance, breeding seeds that tolerate both glyphosate and a second herbicide called dicamba.
The main difference between gene editing and classical genetic engineering is that the former allows scientists to manipulate the genetic makeup of an organism—by changing or “knocking out” the function of a gene—without introducing genes from other organisms. This last part is key, because it’s often the combination of parts of various organisms—such as genes from bacteria added to corn to create herbicide resistance or genes from an arctic flounder added to strawberries to make them able to withstand cold weather—that has made the public wary of GMOs in their food.
But the image of CRISPR/Cas9 and other gene editing tools as an “entirely pristine” technology that rules out all foreign DNA isn’t entirely accurate, says Maywa Montenegro, a Ph.D. candidate in Environmental Science, Policy, and Management at University of California, Berkeley.
“They aren’t wrong in saying CRISPR doesn’t need to introduce foreign DNA, but it absolutely can. That’s what it’s very good at,” she said. “But it’s also important for people to understand that you can create huge, impactful changes in a plant’s functioning without introducing anything foreign.” De-activating, or knocking out, a gene function can significantly change the plants and animals involved.
In the case of the mildew-resistant wheat developed in China, for instance, scientists were able to introduce “targeted mutations” using CRISPR/Cas9 without inserting new genes. In another example, Cibus, a San Diego-based startup, has produced (and commercialized) an herbicide-resistant canola using another early gene-editing technique called Rapid Trait Development System (RTDS).
The company also says it has other crops, such as herbicide-resistant rice and flax seeds, in the pipeline. DuPont is also working with the Berkeley-based start-up Caribou Biosciences (founded by Jennifer Doudna, one of the founders and patent-holders of CRISPR/Cas9 technology) to develop gene-edited, drought-resistant corn and wheat varieties.
The most widely discussed food produced using gene editing today is a non-browning mushroom developed using CRISPR/Cas9 at Pennsylvania State University. The mushroom received a great deal of media attention last spring, when the Penn State scientists received a letter from the U.S. Department of Agriculture (USDA) informing them that the agency would not be regulating its field testing.
At the time, a number of media outlets reported that the mushroom had “escaped regulation,” suggesting that gene editing was not only remarkably different from traditional genetic engineering in crucial ways, but that it also might be the key to avoiding government oversight. On both counts, the reality may be less cut and dried.
Will Gene-Edited Seeds be Regulated?
Doug Gurian-Sherman, director of sustainable agriculture and senior scientist for the Center for Food Safety (CFS), says the letter USDA sent to Penn State about the non-browning mushroom was just one of over 30 that went out at that time in response to requests by a variety of entities working with gene-editing technology.
While the USDA did clearly mention the fact that the mushroom didn’t contain any foreign DNA in its response, that wasn’t the only reason it declined to exercise its regulatory authority. Just as important, it seems, is the fact that the mushroom was not in any way considered a “plant pest.” (Think Bt corn, which is engineered to express an insecticide in the field.)
You see, when it comes to regulating GMO crops, plant pests have been at the heart of the USDA’s regulation approach; all other genetically engineered products fall under the auspices of either the U.S. Food and Drug Administration (FDA) or the Environmental Protection Agency (EPA). (A document from the Pew Charitable Trusts includes a handy chart detailing which agency is supposed to regulate what types of organisms.)
In fact, the letter sent to Penn State concluded with this sentence: “Please be advised that the white button mushroom variety described in your letter may still be subject to other regulatory authorities such as the FDA or EPA.”
So gene editing was by no means the only factor at hand. “As soon as they put genes in from any plant pest they would immediately become regulated by USDA,” said Gurian-Sherman.
Of course, exactly how the FDA plans to regulate gene editing is yet to be seen. Since January, the agency has been seeking public input on the topic in both medical research and agriculture. One core question at hand is whether gene editing will be considered “genetic engineering.” And at a time when a growing number of consumers want to know exactly what’s in their food—and around 90 percent of Americans say they want to see genetically engineered ingredients in food labeled—this is as much a question of consumer demand as it is a question of regulation.
“We already see lots of people who are supportive of genetic engineering, calling [gene editing] ‘advanced breeding,’” said Gurian-Sherman. But, he added, “In terms of most of the legal definitions of genetic engineering that are out there right now, it applies. I think it is a legitimate area for argument whether this is generally safer or not or more acceptable, but they clearly don’t want to label it genetic engineering.”
According to Michael Hansen, senior staff scientist at Consumers Union, “the FDA’s documents now clearly say their definition of bioengineering is the same as the definition of modern biotechnology held by Codex Alimentarius.” That’s the “Food Code” established by the U.N. Food and Agriculture Organization and the World Health Organization. The Codex definition refers to any organism made using “the application of in vitro nucleic acid techniques.” And since gene editing does precisely that, Hansen believes the answer is clear: Gene editing should be seen as genetic engineering.
But not everyone agrees. In an editorial last January, for instance, the editors of Nature endorsed “the principle of transparency in the production of genome-edited crops and livestock…with no further need for regulation or distinction of these goods from the products of traditional breeding.”
U.C. Berkeley’s Montenegro describes CRISPR/Cas9 as a kind of Swiss army knife with the potential to be paradigm-shifting. But, she adds, for that reason it calls for a lot more scrutiny and regulatory oversight.
Hansen agrees. Using gene editing, he said, “You can identify a key sequence you want to cut. But wherever that sequence occurs in the genome, you would get a cut. And you will also get a cut at sequences that look similar.”
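Hansen’s point about similar sequences can be illustrated with a toy string scan. The sketch below is purely illustrative (real guide-RNA design tools also account for PAM sites and position-weighted mismatch scoring, none of which is modeled here): it flags every position where a guide sequence matches the genome exactly or within one substitution, with the near-misses standing in for potential off-target cuts.

```python
def find_cut_sites(genome, guide, max_mismatches=1):
    """Return (position, mismatches) for every window of the genome that
    matches the guide sequence exactly or within `max_mismatches`
    substitutions -- the near-misses stand in for off-target cut sites."""
    hits = []
    k = len(guide)
    for i in range(len(genome) - k + 1):
        window = genome[i:i + k]
        mismatches = sum(a != b for a, b in zip(window, guide))
        if mismatches <= max_mismatches:
            hits.append((i, mismatches))
    return hits

# One exact target site ("ACGTACGTT") and one site differing by a
# single base ("ACGTACGAT") -- both get flagged as cut candidates.
genome = "TTACGTACGTTGGACGTACGATCC"
guide = "ACGTACGTT"
print(find_cut_sites(genome, guide))  # [(2, 0), (13, 1)]
```

Even in this toy model, allowing more mismatches multiplies the number of candidate sites, which is the crux of the off-target concern.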
Hansen also points to the fact that scientists have experienced at least some off-target effects with most gene-editing technology to date. He cites the case of an effort to destroy HIV with CRISPR/Cas9: although scientists engineered T-cells with CRISPR to recognize and destroy HIV, he said, “it took the HIV just a couple of weeks to evolve resistance to CRISPR.”
And in a recent effort to artificially synthesize a new genome for E. coli, a group of scientists decided not to use gene editing because, they wrote, “these strategies…likely would introduce off-target mutations.”
Despite these concerns, CFS’s Gurian-Sherman says there are big questions about how regulatory bodies under the Trump administration will choose to respond to the technology. For one, he says, gene editing could be much harder to test for.
“Detecting [transgenic] engineered changes, for a molecular biologist, is really, really easy,” he said. “But some of these [gene edits] are not going to leave much of a fingerprint, if at all, and they’re going to be very hard to trace. So something like the kind of testing the Non-GMO Project does probably wouldn’t be possible in foods edited with CRISPR.”
Ultimately, Monsanto appears to be preparing for the possibility of regulation. When Lux Research, an independent technology research and advisory firm, looked into the Monsanto-Bayer merger in December, they surmised that gene editing was an important part of Monsanto’s appeal to Bayer, but that it was by no means the only technology they’re banking on.
“Monsanto’s advantage in the space is that they’re super diverse and they have their hands in all the cookie jars,” said Laura Lee, the author of Lux’s report. “So they’d be able to advance traits using CRISPR, or if the regulatory bodies step in and decide to classify CRISPR as genetic modification or put a harmful label on it, they’ll have other options.”
“More Accessible” Technology
While CRISPR and other gene editing tools are seen as more affordable and more efficient, they’re also being touted as more accessible than traditional genetic engineering—and they are already being used in small private laboratories.
“We think the fact that this science is accessible to and being explored by many researchers across the public and private sectors is exciting—and will only improve the types of products that will ultimately be accessible to farmers,” said Monsanto’s Tom Adams.
Indeed, most traditional genetically engineered traits take years and cost millions to produce (an average of $136 million). So bringing that number down could bring more constituents into the fold, despite the consolidation at the top.
But seeds produced this way will still be subject to strict intellectual property restrictions, says Gurian-Sherman. “[CRISPR] won’t be as controllable by the big companies, but the patenting (or lack thereof) could really be a limiting factor for smaller companies,” he said. Case in point: a non-exclusive license to use CRISPR/Cas9 is valued at $265 million.
Of course, if that license is used to create a handful of seed traits, it could be more than worth the investment for a company like Monsanto—particularly if it can deliver on sought-after traits such as drought tolerance. And it might lead one to deduce that a newly merged company such as Monsanto-Bayer would use gene editing to bring down its overall R&D budget.
But that’s not necessarily the case, says Montenegro. In addition to facing pressure from the Trump Administration to spend mightily in the U.S., she points to an economic phenomenon called Jevons paradox, wherein technology makes a process more efficient, but that efficiency ends up leading to increasing demand. (Jevons first identified the phenomenon while studying the coal industry of the 19th century.)
Another important question is whether this more accessible technology will be put to use to create seeds designed for alternative or more sustainable farming systems.
Montenegro says she has heard from one plant breeder at the University of Minnesota who was interested in using CRISPR/Cas9 for participatory plant breeding—a tactic involving farmers that is often used in the developing world—and to breed plants that could be amenable to diversified organic farming systems. But she says it’s not likely that a wider playing field will change the basic premise of the bulk of the work done using gene-editing technology—which is to engineer seeds used on large-scale industrial farms.
“While I don’t want to foreclose the possibility of using CRISPR for agroecology, [companies and institutions] are underinvesting and undercutting basic agroecology research to such a large degree that even the lower-hanging fruit hasn’t yet been picked,” Montenegro said. This “massive asymmetry” makes her doubtful that the technology will help researchers tread new paths when it comes to sustainable practices.
Gurian-Sherman is no more optimistic. “There are ways you can breed or adapt crops for sustainable agricultural systems that don’t rely on inputs like fertilizers and pesticides as much,” he said. “You can breed crops that attract natural enemies, or take advantage of the slower release of organic nutrients from cover crops and manure. I can go on and on about traits that are valuable to sustainable farming. But that’s not going to be of interest to these companies because they’re actually antithetical to their business models.”
Consumers Union’s Hansen says the current excitement about gene editing reminds him of the very early days of genetic engineering. “In the late 80s and early 90s, they were saying they’d be able to do everything with GE. Thirty years later, all you have is herbicide-tolerant plants and Bt plants. Or that’s the vast majority.”
This story was created in partnership with ITVS.
Scorched-Earth campaign season may end with warm, tranquil election day weather
A woman relaxes next to Tempe Town Lake in Arizona. IMAGE: AP PHOTO/ROSS D. FRANKLIN
BY ANDREW FREEDMAN
2016-10-28 17:54:08 UTC
Hundreds of warm temperature records will fall across large portions of the Central and Southern U.S. during the next two weeks, as the jet stream lifts north to a summer-like position across Canada, allowing unusually warm weather to move into the U.S. for an extended period of time. The warmth could last straight through election day.
The weather maps for the next 10 days look more like mid-to-late August than late October and early November, with storm systems zipping across Canada, and few, if any, outbreaks of cold air moving southward into the U.S.
This unusual weather pattern has implications for the presidential election, since a warmer-than-average, relatively tranquil election day would likely be ideal for maximizing voter turnout.
Temperatures may be high enough through November 10 to threaten not only daily records but monthly records as well, with highs running 10 to 20-plus degrees Fahrenheit above average from the end of October into the beginning of November in cities including Minneapolis, Omaha, Chicago, Dallas, Atlanta, Phoenix, Denver and Oklahoma City.
Temperature outlook for November 4 to 10, 2016, showing milder than average conditions across the U.S., including Alaska. This includes election day.
IMAGE: NOAA/CPC
Many places across the U.S. will also be unseasonably warm during the run-up to Nov. 8, and many states, like Florida and Ohio, have already started early voting.
While this forecast could change significantly, given the amount of uncertainty associated with predictions so far in advance, computer model projections are in agreement that nearly the entire country will be warmer than average and relatively storm-free on Nov. 8.
Two exceptions may be the Mid-Atlantic and Northeast, where an area of low pressure, with cooler temperatures, may develop, as well as the Pacific Northwest.
Computer model projection of temperature anomalies on Nov. 8, 2016. Reddish hues correspond to unusually mild temperatures.
IMAGE: WEATHERBELL ANALYTICS
Prior to voting day, however, record warmth will envelop areas outside the Northeast, California and Pacific Northwest. Thousands of people will go trick-or-treating during what is likely to be the warmest Halloween on record.
In fact, you may want to rethink your costume if you're currently planning to go as a character or concept that requires multiple layers of clothing, given that it may feel like summer outside.
This is especially the case if you are heading to a Halloween party in the Midwest, Plains or South.
For example, current forecasts call for a high temperature of at least 80 degrees Fahrenheit in St. Louis on Monday, which would be 20 degrees Fahrenheit above average for the date. Even Minneapolis, which typically can see snow on Halloween, may flirt with the 70-degree mark.
CFS ensemble mean temperature departures from average from Oct. 28 to Nov. 7.
IMAGE: WEATHERBELL ANALYTICS
The warm finish to October could help ensure that many locations in the Plains and South will set a monthly high temperature record.
The warm weather in early November does not mean that the U.S. will have a mild winter, however. In fact, there are indications the winter could be colder and snowier than average for many, particularly in the Upper Midwest, Northeast and Mid-Atlantic states.
Here's what the science really says about Fort McMurray and climate change
By Mike De Souza in Analysis, Energy | June 3rd 2016
Smoke continues to smoulder in the forests near Fort McMurray on May 27, 2016. Photo courtesy of Mike Flannigan.
A few days ago Mike Flannigan travelled to Fort McMurray as part of a special scientific research team studying the raging wildfire. He got a bird’s eye view from above and a close look at the ruins on the ground. With smoke still smouldering in the forests, not many people had come back, and yet nature had begun its nascent process of recovery and return. “It was kind of eerie to go around neighbourhoods," Flannigan tells National Observer. "Birds were building nests on front lawns. Wildlife was starting to take over."
Now Fort McMurray residents have begun the difficult journey home weeks after that terrifying day in May when an unprecedented inferno, fueled by unusually hot and dry spring weather, caused them to flee and led to the largest evacuation in Alberta’s history.
Those unusual weather conditions have been widely attributed to El Nino, a naturally occurring phenomenon linked to warm ocean water that disrupts the weather.
But Flannigan, a professor of wildland fire from the University of Alberta, and many other climate change scientists agree that while the Fort McMurray fires cannot be directly linked to the carbon pollution produced by humans, Canadian wildfire activity of the past few years is well above average. And it's connected to the warming climate.
In terms of the total areas destroyed by fires, there's an unmistakable escalation, they say.
They see these fires as vivid markers of dangers to come for the forests and for the people and wildlife that live in them and around them.
As temperatures warm, they say, the likelihood that more out of control infernos will consume more trees and human infrastructure will increase as well.
“Climate change is here," Flannigan says. "And we’re seeing more fires and arguably more intense fires because of it. Our area burned in Canada has doubled since the seventies. I and others say that this increase in area burned is related to temperature, which is related to human caused climate change.”
Government assessment shows increased forest fire activity in last 40 years
Provincial officials deployed four air tankers, helicopters, and a team of firefighters on Sunday, May 1, soon after 4 p.m. local time, when a patrol first spotted the flames in a very small area of forest near Fort McMurray.
Two hours later, the fire had grown thirty-fold, to about 60 hectares.
“It goes to show how hot and dry it has been here with no rain in the area for the last two months," Alberta government wildfire manager Chad Morrison told the media on May 7. "Any fire that starts can grow very quickly and move very fast.”
Alberta officials describe the rapid spread of the Fort McMurray wildfire after it was first discovered on May 1. Government of Alberta video.
A week after it started, the wildfire had covered an area of about 200,000 hectares. As of today, it has consumed 600,000 hectares, an area the size of the state of Delaware, according to a Bloomberg report, and it continues to burn.
The inferno leapt over rivers and whipped across highways before attacking Fort McMurray on May 3. More than 80,000 people abandoned their homes to seek safety.
It was so powerful that it created its own weather: fierce winds and lightning emerged from the monster-sized blaze.
Flannigan has seen cases where forest fires generated their own thunderstorms, but he had never heard of a fire generating its own lightning that sparked a new blaze to grow and spread. That's a first, and he knows what he's talking about.
The prolific researcher has degrees in physics, atmospheric science and plant sciences, and he has published more than 100 research papers in peer-reviewed journals, including 30 papers related to forests and climate change.
A Mountie stands in the middle of a hard-hit Fort McMurray neighbourhood affected by the wildfires of May 2016. RCMP photo.
Flannigan is also among a number of the scientists whose research was featured in a climate change assessment of Canadian forests published by the federal government’s Natural Resources Department in 2009.
The assessment highlighted a number of different peer-reviewed studies showing that forest fire activity has increased significantly over the last 40 years.
The numbers and facts show how rising temperatures are providing a dangerous cocktail of flammable ingredients. One study, quoted in the assessment, found that snow packs were melting one to four weeks earlier than they did 50 years ago in the United States. The U.S. wildfire season is also 78 days longer than it used to be.
And when a wildfire strikes in the U.S., it lasts an average of 37 days, up from 7.5 days, the government’s assessment said.
All in all, the frequency of large fire years and the area burned in the North American boreal region has more than doubled since the 1960s, with most of the activity occurring in the western part of the boreal forest.
The Fort McMurray wildfires are consistent with the trend, most climate scientists say.
Climate change doubters question whether there is any real pattern of connection, since there have been some annual fluctuations in the amount of area burned. But statistics from the last three years suggest an alarming pattern.
About four million hectares of land have been affected by forest fires in Canada in each of the last three years. This is well above the average of about two million hectares a year over the past decade. It’s even larger than the average from the 1960s, when about one million hectares of forest burned every year in Canada from wildfires.
Research shows the wildfires will get worse, says Yan Boulanger
Taking a step back, over the last 2,000 years, forest fire activity has actually been decreasing due to increases in summer precipitation, says Yan Boulanger, a Quebec City-based research scientist from the Canadian Forest Service.
But he said something different has been happening over the past 40 years.
“We can in fact say that the rise of temperature is now starting to counterbalance the trend in precipitation seen in previous decades,” Boulanger told National Observer. “Another thing that we can say is that most aggressive forcing scenarios are pushing fire activity by 2100 close to or beyond what was observed during the past.”
It's no time for blame, say environmentalists
Most Canadian media have not reported extensively on the climate change connection to the Fort McMurray wildfires. The Financial Post printed stories questioning whether there was a trend, including a graph of annual statistics going back to the 1970s that, it said, suggests no pattern because of the year-to-year fluctuations.
The city of Fort McMurray is at the heart of Alberta's oilsands industry, which holds the world's third largest reserves of crude, behind Saudi Arabia and Venezuela.
The oilsands industry is the country's fastest growing source of greenhouse gas emissions and the government knows that the sector's rising carbon pollution must be capped if Canada is serious about meeting international commitments to reduce its climate-warming emissions.
So Fort McMurray comes up in all discussions about climate change in Canada.
For more than a decade, the oilsands industry has had a big target on its back from environmental groups, backed by climate scientists and governments from around the world who say that we must phase out fossil fuels and keep much of the remaining reserves in the ground to avoid more dangerous changes to the atmosphere.
But many environmental groups in Canada have been cautious about repeating this message about the oilsands since the wildfires struck the industry's hub.
The directors of ten major Canadian environmental groups, including Greenpeace Canada, the David Suzuki Foundation and Environmental Defence, waited a few days after the May 3 evacuation to collectively express their concerns, carefully drawing a link to the planet’s climate change crisis.
“This is not a time for blame. It is time to stand together and make sure the people are safe and well cared for,” the environmental leaders said in a joint statement.
They also said that rebuilding would eventually begin in Fort McMurray and that this would include taking steps to prevent this from happening to any other communities.
“We need to figure out how can we adapt our communities to deal with the risk of fires and we need to take action to reduce the risk of climate change,” the environmentalists said in their statement. “This is part of the rebuilding process and this is the responsibility of all Canadians.”
The federal government, at the same time, says it has been working on improving its national forest fire prevention strategy.
La Nina is coming
Looking at short-term trends, Kerry Anderson, an Edmonton-based research scientist and meteorologist with the Canadian Forest Service, said that he and his colleagues had predicted a rough wildfire season in western Canada as early as March, due to the strong El Nino.
But the government experts expect the risks of extreme fires in the current season to diminish later this summer as El Nino gets replaced by La Nina, which generally brings colder and wetter weather to Canada, he said.
“The El Nino that we experienced is certainly collapsing,” Anderson said. “We’re into a neutral situation and it will likely evolve into a La Nina, probably in late June, early July.”
This won't mean wildfire activity will stop, he says, but it will likely reduce the severity of the conditions that kindle them.