Electric in the Forest

Bernie Sanders Becomes Fastest Candidate In History To Reach One Million Individual Contributions

With just over 12 hours left before the end-of-quarter fundraising deadline, the Bernie Sanders campaign announced at 11:30 AM that it had reached the amazing milestone of one million individual contributions: “We’ve just reached our goal of one million online contributions. There’s still 12 hours to send a powerful message about the strength of our movement before tonight’s midnight ET deadline.”

Bernie Sanders is not only the first candidate running for the 2016 presidency to reach one million contributions, but his campaign has reached this milestone earlier than any presidential candidate in history; the mark was last achieved by President Obama in October 2011.

The Sanders campaign has been fed by a diet of small donations, with this week’s donations averaging just under $25. Sanders has consistently refused to take money from any super PACs, and his campaign is treating the milestone as a chance to drive that point home. “This deadline is an opportunity to send a powerful message to the political media and the super PACs attacking us about the strength of our campaign,” the campaign posted in a fundraising message on Facebook. “Let’s make sure they hear us loud and clear.”

Last quarter the Sanders campaign raised $15 million in only two months, with an average contribution of $34 and 99% of contributions under $250. Sanders makes use of the popular site Reddit, a collection of forums, to raise money. His subreddit is the largest of any presidential candidate and boasts more than 110,000 members. To show their enthusiasm as today’s deadline approached, members of his subreddit posted screenshots of their donations to encourage others to do the same.

Bernie Sanders’ campaign is truly a grassroots movement founded on the powerful message of economic populism: the idea that establishment economics is not working for the poor and the middle class. As Bernie Sanders gets the message out, more and more people from Maine to California are ‘feeling the Bern,’ as evidenced by the truly amazing milestone the Sanders campaign reached today.

 

Could Plastic-Eating Worms Be Enough to Overcome Mounting Waste?

Mealworms munch on Styrofoam, a hopeful sign that solutions to plastics pollution exist. Wei-Min Wu, a senior research engineer in the Department of Civil and Environmental Engineering, discovered the larvae can live on polystyrene.

Consider the plastic foam cup. Every year, Americans throw away 2.5 billion of them. And yet, that waste is just a fraction of the 33 million tons of plastic Americans discard every year. Less than 10 percent of that total gets recycled, and the remainder presents challenges ranging from water contamination to animal poisoning.

Enter the mighty mealworm. The tiny worm, the larval form of the darkling beetle, can subsist on a diet of Styrofoam and other forms of polystyrene, according to two companion studies co-authored by Wei-Min Wu, a senior research engineer in the Department of Civil and Environmental Engineering at Stanford. Microorganisms in the worms’ guts biodegrade the plastic in the process – a surprising and hopeful finding.

“Our findings have opened a new door to solve the global plastic pollution problem,” Wu said.

The papers, published in Environmental Science and Technology, are the first to provide detailed evidence of bacterial degradation of plastic in an animal’s gut. Understanding how bacteria within mealworms carry out this feat could potentially enable new options for safe management of plastic waste.

“There’s a possibility of really important research coming out of bizarre places,” said Craig Criddle, a professor of civil and environmental engineering who supervises plastics research by Wu and others at Stanford. “Sometimes, science surprises us. This is a shock.”

Plastic for dinner

In the lab, 100 mealworms ate between 34 and 39 milligrams of Styrofoam – about the weight of a small pill – per day. The worms converted about half of the Styrofoam into carbon dioxide, as they would with any food source.

Within 24 hours, they excreted the bulk of the remaining plastic as biodegraded fragments that look similar to tiny rabbit droppings. Mealworms fed a steady diet of Styrofoam were as healthy as those eating a normal diet, Wu said, and their waste appeared to be safe to use as soil for crops.

Researchers, including Wu, have shown in earlier research that waxworms, the larvae of Indian mealmoths, have microorganisms in their guts that can biodegrade polyethylene, a plastic used in filmy products such as trash bags. The new research on mealworms is significant, however, because Styrofoam was thought to have been non-biodegradable and more problematic for the environment.

Researchers led by Criddle, a senior fellow at the Stanford Woods Institute for the Environment, are collaborating on ongoing studies with the project leader and papers’ lead author, Jun Yang of Beihang University in China, and other Chinese researchers. Together, they plan to study whether microorganisms within mealworms and other insects can biodegrade plastics such as polypropylene (used in products ranging from textiles to automotive components), microbeads (tiny bits used as exfoliants) and bioplastics (derived from renewable biomass sources such as corn or biogas methane).

As part of a “cradle-to-cradle” approach, the researchers will explore the fate of these materials when consumed by small animals, which are, in turn, consumed by other animals.

Marine diners sought

Another area of research could involve searching for a marine equivalent of the mealworm to digest plastics, Criddle said. Plastic waste is a particular concern in the ocean, where it fouls habitat and kills countless seabirds, fish, turtles and other marine life.

More research is needed, however, to understand conditions favorable to plastic degradation and the enzymes that break down polymers. This, in turn, could help scientists engineer more powerful enzymes for plastic degradation, and guide manufacturers in the design of polymers that do not accumulate in the environment or in food chains.

Criddle’s plastics research was originally inspired by a 2004 project to evaluate the feasibility of biodegradable building materials. That investigation was funded by the Stanford Woods Institute’s Environmental Venture Projects seed grant program. It led to the launch of a company that is developing economically competitive, nontoxic bioplastics.

More information: “Biodegradation and Mineralization of Polystyrene by Plastic-Eating Mealworms. 1. Chemical and Physical Characterization and Isotopic Tests.” Environ. Sci. Technol., Just Accepted Manuscript DOI: 10.1021/acs.est.5b02661

“Biodegradation and Mineralization of Polystyrene by Plastic-Eating Mealworms. 2. Role of Gut Microorganisms.” Environ. Sci. Technol., Just Accepted Manuscript DOI: 10.1021/acs.est.5b02663

Provided by: Stanford University

Antibiotic Overuse Might Lead To Antibiotic Resistant Bacteria and Heightened Allergies

Scientists have warned for decades that the overuse of antibiotics leads to the development of drug-resistant bacteria, making it harder to fight infectious disease. The Centers for Disease Control and Prevention estimates that drug-resistant bacteria cause 23,000 deaths and two million illnesses each year.

But when we think of antibiotic overuse, we don’t generally think of allergies. Research is beginning to suggest that maybe we should.

Allergies are getting more and more common

In the last two to three decades, immunologists and allergists have noted a dramatic increase in the prevalence of allergies. The American Academy of Allergy, Asthma & Immunology reports that some 40%-50% of schoolchildren worldwide are sensitized to one or more allergens. The most common of these are skin allergies such as eczema (10%-17%), respiratory allergies such as asthma and rhinitis (~10%), and food allergies such as those to peanuts (~8%).

This isn’t just happening in the US. Other industrialized countries have seen increases as well.

This rise has mirrored the increased use of antibiotics, particularly in children for common viral infections such as colds and sore throats. Recent studies show that they may be connected.

Antibiotics can disrupt the gut microbiome

Why would antibiotics, which we use to fight harmful bacteria, wind up making someone more susceptible to an allergy? While antibiotics fight infections, they also reduce the normal bacteria in our gastrointestinal system, the so-called gut microbiome.

Because of the interplay between gut bacteria and the normal equilibrium of cells of the immune system, the gut microbiome plays an important role in the maturation of the immune response. When this interaction between bacteria and immune cells does not happen, the immune system responds inappropriately to innocuous substances such as food or components of dust. This can result in the development of potentially fatal allergies.

Exposure to the microbes at an early age is important for full maturation of our immune systems. Reducing those microbes may make us feel cleaner, but our immune systems may suffer.

Do more microbes mean fewer allergies?

Research done in Europe has shown that children who grow up on farms have a wider diversity of microbes in their gut, and have up to 70% reduced prevalence of allergies and asthma compared to children who did not grow up on farms. This is because exposure to such a wide range of microbes allows our immune systems to undergo balanced maturation, thus providing protection against inappropriate immune responses.

In our attempts to prevent infections, we may be setting the stage for our children to develop life-threatening allergies and asthma.

For instance, a study from 2005 found that infants exposed to antibiotics in the first 4-6 months have a 1.3- to 5-fold higher risk of developing allergy. And infants with reduced bacterial diversity, which can occur with antibiotic use, have increased risk of developing eczema.

And it’s not the just the antibiotics kids take that can make a difference. It’s also the antibiotics their mothers take. The Copenhagen Prospective Study on Asthma in Childhood Cohort, a major longitudinal study of infants born to asthmatic mothers in Denmark, reported that children whose mothers took antibiotics during pregnancy were almost twice as likely to develop asthma compared to children whose mothers did not take antibiotics during pregnancy.

Finally, in mice studies, offspring of mice treated with antibiotics were shown to have an increased likelihood of developing allergies and asthma.

Why are antibiotics overused?

Physicians and patients know that overusing antibiotics can cause big problems. It seems that a relatively small number of physicians are driving overprescription of antibiotics. A recent study of physician prescribing practices reported that 10% of physicians prescribed antibiotics to 95% of their patients with upper respiratory tract infections.

Health care professionals should be concerned not only about the development of antibiotic resistance, but also about the fact that we may be creating another health problem in our patients, and possibly in their children too.

Parents should think carefully about asking physicians for antibiotics in an attempt to treat their children’s common colds and sore throats (or their own), which are often caused by viral infections that don’t respond to them anyway. And doctors should think twice about prescribing antibiotics to treat these illnesses, too.

As we develop new antibiotics, we need to address overuse

As resistant bacteria become a greater problem, we desperately need to develop new antibiotics. The development process for a new antibiotic takes a considerable amount of time (up to 10 years), and drug companies have previously neglected this area of drug development.

Congress has recognized that antibiotic overuse is a major problem and recently passed the 21st Century Cures bill. This bill includes provisions that would create payment incentives from Medicare for hospitals that use new antibiotics.

But this approach would have the perverse effect of increasing the use of any new antibiotics in our arsenal without regard for whether bacterial resistance has developed. This would not only exacerbate the problem of resistance, but potentially lead to more people developing allergies.

Congress should consider more than just supporting increased development of new antibiotics, but also address the core problem of overuse.

This may stave off the further development of antibiotic resistant bacteria and reduce the trend of increasing development of allergies.

We May Have Just Bought Ourselves An Extra Decade To Avoid Catastrophic Climate Change


The world appears to have bought itself a little time in the fight to avoid climate catastrophe, according to a new analysis.

Virtually every major country has made pledges to limit or reduce carbon pollution in advance of the Paris climate talks this December. These pledges generally end in 2025 or 2030, and so they only matter if the world keeps ratcheting down its greenhouse gas emissions in future agreements until we get near zero by century’s end. Otherwise we will blow past the 2°C line of defense against very dangerous-to-catastrophic global warming, and hit 3.6°C warming by 2100.

That’s the key finding of a new analysis from Climate Interactive and the MIT Sloan School of Management, tallying up the global pledges to limit carbon pollution leading up to the big Paris climate talks later this year.

Those pledges, called intended nationally determined contributions (INDCs), include the European Union cutting total emissions 40 percent below 1990 levels by 2030, the U.S. cutting net greenhouse gas emissions 26 to 28 percent below 2005 levels by 2025 (including land use change and forestry), and China’s pledge to peak its CO2 emissions by 2030.

The good news is that the INDCs have bought us another five to 10 years of staying close to the 2°C path. I asked Andrew Jones, one of the systems-thinking savants behind Climate Interactive, if that was correct and he said, “Yep, about seven years.” By “staying close” I mean staying close enough to the 2°C path that it remains plausibly achievable – though (obviously) politically still very, very challenging.

Of course, like all emissions models, the Climate Interactive model makes assumptions about what is a plausibly achievable 2°C path given how long we have delayed acting. And that involves deciding how fast the world could plausibly cut its greenhouse gas emissions each year – sustained for many decades. They use 3.5 to 4 percent a year. That is mostly a political-economic judgment, since there is no real way of knowing how fast humanity could act once we become truly desperate to avoid multiple simultaneous catastrophes that are irreversible on a timescale of many centuries.
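To get a feel for what those rates imply, here is a back-of-the-envelope sketch in Python. The 3.5 to 4 percent annual cuts come from the Climate Interactive assumptions described above; the starting level of roughly 36 billion tons of CO2 per year is my own round-number assumption for scale, not a figure from their model.

```python
# Back-of-the-envelope: how long does a sustained annual cut take to push
# emissions toward zero? The 3.5-4 percent rates are from the article; the
# ~36 GtCO2/yr starting point is an assumed round number for scale.

START_YEAR = 2015
START_EMISSIONS = 36.0  # GtCO2 per year, assumed

for rate in (0.035, 0.04):
    level = START_EMISSIONS
    year = START_YEAR
    # count the years until emissions fall below 10% of the starting level
    while level > 0.1 * START_EMISSIONS:
        level *= 1 - rate
        year += 1
    print(f"At {rate:.1%} per year, a ~90% cut arrives around {year}")
```

With those assumptions, a 90 percent cut arrives somewhere in the 2070s or 2080s, which is roughly what “near zero by century’s end” looks like in practice.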

The point is that a successful outcome of Paris will not “solve the climate problem” and indeed won’t give us a 2°C world, as anyone who is paying attention understands. (Sadly, a lot of folks in the media aren’t paying attention.)

The bad news, of course, is that since about 2007 leading climate experts have been explaining we only have five to 10 years to act. I debunked the myth that they’ve “always” been saying that in my May post, “The Really Awful Truth About Climate Change.”

So what Paris can accomplish is to give us another five to 10 years of … having five to 10 years to act!!! Woo-hoo.

In reality, international climate talks can never buy us more than five to 10 years at a time – until and unless countries are willing to make long-term multi-decade CO2 reduction commitments as the United States tried to do with the 2009 climate bill that was killed in the Senate. Stabilizing at 2°C requires taking global emissions down to near zero steadily by century’s end. Most Paris CO2 commitments are for 2025 or 2030.

Still, this would be an important accomplishment – and one that mirrors the incremental approach the world took to save the ozone layer. As NASA’s Gavin Schmidt told the New York Times, “By the time people get 10, 15 years of actually trying to do something, that’s going to lead to greater expertise, better technology, more experience.” Schmidt, who heads the same climate team James Hansen once did, added, “People will then say, ‘Oh, you know what? We can commit to do more.'”

NASA confirms that liquid water flows on Mars

Liquid water likely exists on the surface of Mars during the planet’s warmer seasons, according to new research published in Nature Geoscience. This revelation comes from new spectral data gathered by NASA’s Mars Reconnaissance Orbiter (MRO), a spacecraft that studies the planet from orbit. The orbiter analyzed the chemistry of weird dark streaks that have been known to appear and disappear seasonally on the Martian surface. The analysis confirms that these streaks are formed by briny – or salty – water flowing downhill on Mars.

NASA has advertised these findings as the solution to a major Mars mystery: does the Red Planet truly have liquid water on its surface? Researchers have known that water exists in ice form on Mars, but it’s never been confirmed if water can remain in a liquid state. The space agency is claiming that we now have that answer.


This isn’t the first study to suggest liquid water is present in some form on Mars. Scientists have theorized for years that Mars was once home to a large ocean more than 4 billion years ago. And recent findings from the Mars Curiosity rover suggest that liquid water exists just underneath the Martian surface. The discovery of water on Mars has almost become a joke among planetary scientists. Alfred McEwen, a planetary geologist at the University of Arizona who also worked on this research, wrote in Scientific American that the studies have become extremely commonplace: “Congratulations – you’ve discovered water on Mars for the 1,000th time!” he joked.

Today’s findings seem to offer more direct evidence of liquid water than most, though the study only confirms what NASA has long suspected – that flowing liquid water forms the strange, dark streaks that have been observed on Mars. These streaks – called recurring slope lineae – were first observed by the MRO spacecraft in 2010. The lines are blackish and narrow, at less than 16 feet across. During the warmer seasons, the streaks grow thicker and longer; they then fade and shrink at times when Mars is colder.

This led scientists to believe years ago that perhaps water and salt were involved in the creation of these lines. “[The streaks] loved forming at temperatures that were right for liquid water to exist,” study author Lujendra Ojha, a graduate student at Georgia Tech, told The Verge.

The average temperature on Mars is a frigid -80 degrees Fahrenheit, but on a summer day near the equator, the temperature can reach up to 70 degrees Fahrenheit. Ojha and his team speculated that when conditions are warm enough, liquid water filled with perchlorates – a type of salt – flows downhill on the planet’s sloping geological features. Together, water and perchlorates form a brine solution, which has a much lower freezing point than water alone. This allows the brine to stay in a liquid state even when temperatures grow colder. Ultimately, the streaks are the leftover salt deposits from these briny flows, Ojha said.

The new study published today offers direct evidence that liquid water is indeed involved. Using the MRO’s imaging spectrometer, the researchers studied the chemical makeup of the recurring slope lineae. The visible-infrared spectrometer, which can determine the composition of minerals by observing them in different light wavelengths, showed that the dark streaks were indeed composed of hydrated salts that have molecular water in their crystal structure. “What that seems to be telling us is that water plays a key role in the formation mechanism of these features,” said Ojha.

Water strengthens the possibility of finding microbial life on the Red Planet

As for where this water is coming from, Ojha noted there are three possible sources. The perchlorates may be pulling water out of the Martian atmosphere when the air grows particularly humid. The water also may be from a subsurface reservoir of ice that turns into liquid when it comes in contact with the salts. There’s even the possibility of an aquifer that is generating the water needed for the briny flows.

Whatever the source, Ojha said the evidence is unambiguous proof that liquid water exists on Mars. And if so, that strengthens the possibility of finding microbial life on the Red Planet. The presence of liquid water on Earth is intimately linked with the formation of life, so the odds are better than ever that extraterrestrial organisms are nearby in our Solar System.

Except we kind of already knew that. But now, we’re really, really sure.

France bolsters ban on genetically modified crops

France, the European Union’s largest grain grower and exporter, has asked the European Commission to be excluded from GM maize crop cultivation under the bloc’s new opt-out scheme, the farm and environment ministries said in a joint statement.

As part of the opt-out process, France also passed legislation in the National Assembly that would enable it to oppose the cultivation of GM crops, even if approved at EU level, on the basis of certain criteria including environment and farm policy, land use, economic impact or civil order, the environment ministry added.

Widely grown in the Americas and Asia, GM crops have divided opinion in Europe. France had already banned cultivation of U.S. group Monsanto’s GM maize, saying it had serious doubts that it is safe for the environment.

Monsanto says its maize (corn) is harmless to humans and wildlife.

The EU opt-out, agreed in March, allows individual countries to seek exclusion from any approval request for GM cultivation in the 28-member bloc, or from varieties already cleared as safe by the EU.

Monsanto’s MON810 maize is the only GM crop grown in Europe, where it has been cultivated in Spain and Portugal for a decade, but other maize crops are in the process of being approved at EU level.

One of them is an insect-resistant maize known as 1507. Its developers, DuPont Pioneer and Dow Chemical, have been waiting nearly 15 years for the EU executive to authorize its cultivation in the bloc.

The French request concerns nine GM maize strains. Producers also include Switzerland’s Syngenta, a spokesman for the environment ministry said.

Germany also intends to make use of the new EU rules to stop the growing of GM crops, documents seen by Reuters showed last month.

The European Commission is responsible for approvals, but under the new rules requests for opt-outs also have to be submitted to the company making the application.

Monsanto has said it will abide by requests from Latvia and Greece to be excluded from its application to grow a GM crop in the EU but accused them of ignoring science.

(Reporting by Sybille de La Hamaide, Gus Trompiz and Valerie Parent; Editing by Jane Merriman and David Goodman)

Can Battery Technology Overcome the Last Hurdle for Sustainable Energy?

“The worldwide transition from fossil fuels to renewable sources of energy is under way …” according to the Earth Policy Institute’s new book, The Great Transition.

Between 2006 and 2012, global solar photovoltaic (PV) annual capacity grew 190 percent, while wind energy’s annual capacity grew 40 percent, reported the International Renewable Energy Agency. The agency projects that by 2030, solar PV capacity will be nine times what it was in 2013; wind power could increase five-fold.

Electric vehicle (EV) sales have risen 128 percent since 2012, though they made up less than 1 percent of total U.S. vehicle sales in 2014. Although today’s most affordable EVs still travel less than 100 miles on a full battery charge (the Tesla Model S 70D, priced starting at $75,000, has a 240-mile range), the plug-in market is projected to grow between 14.7 and 18.6 percent annually through 2024.

The upward trend for renewables is being driven by concerns about climate change and energy security, decreasing solar PV and wind prices, rising retail electricity prices, favorable governmental incentives for renewable energy, the desire for energy self-sufficiency and the declining cost of batteries. Growing EV sales, also benefitting from incentives, are affecting economies of scale in battery manufacturing, helping to drive down prices.

Sun and wind energy are free, but because they are not constant sources of power, renewable energy is considered “variable”: it is affected by location, weather and time of day. Utilities need to deliver reliable and steady energy by balancing supply and demand. While today they can usually handle the fluctuations that solar and wind power present to the grid by adjusting their operations, as the amount of energy supplied by renewables grows, better battery storage is crucial.

Batteries convert electricity into chemical potential energy for storage and back into electrical energy as needed. They can perform different functions at various points along the electric grid. At the site of solar PV or wind turbines, batteries can smooth out the variability of flow and store excess energy when demand is low to release it when demand is high. Currently, fluctuations are handled by drawing power from natural gas, nuclear or coal-fired power plants; but whereas fossil-fuel plants can take many hours to ramp up, batteries respond quickly and when used to replace fossil-fuel power plants, they cut CO2 emissions. Batteries can store output from renewables when it exceeds a local substation’s capacity and release the power when the flow is less or store energy when prices are low so it can be sold back to the grid when prices rise. For households, batteries can store energy for use anytime and provide back-up power in case of blackouts.
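For readers who like to see the balancing act in miniature, here is a rough Python sketch of the store-when-plentiful, discharge-when-scarce logic described above. Every number in it is an illustrative assumption, not data from any utility or battery vendor; with one-hour steps, kW and kWh line up numerically.

```python
# A toy dispatch loop: store excess solar when supply beats demand,
# discharge the battery when demand beats supply. All numbers are
# illustrative assumptions. With one-hour steps, kW and kWh match up.

solar_output_kw = [0, 2, 6, 9, 7, 3, 0, 0]   # hypothetical hourly PV output
demand_kw       = [4, 4, 4, 4, 4, 4, 4, 4]   # hypothetical flat demand

capacity_kwh = 12.0   # assumed battery size
charge_kwh = 6.0      # assumed starting state of charge

for hour, (solar, demand) in enumerate(zip(solar_output_kw, demand_kw)):
    surplus = solar - demand
    if surplus > 0:
        # store excess generation, up to the battery's remaining capacity
        stored = min(surplus, capacity_kwh - charge_kwh)
        charge_kwh += stored
        exported = surplus - stored          # any remainder goes to the grid
        print(f"hour {hour}: stored {stored:.1f} kWh, exported {exported:.1f} kWh")
    else:
        # discharge to cover the shortfall, up to the stored energy
        needed = -surplus
        drawn = min(needed, charge_kwh)
        charge_kwh -= drawn
        imported = needed - drawn            # any remainder comes from the grid
        print(f"hour {hour}: discharged {drawn:.1f} kWh, imported {imported:.1f} kWh")
```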

Batteries have not been fully integrated into the mainstream power system because of performance and safety issues, regulatory barriers, the resistance of utilities and cost. But researchers around the world are working on developing better and cheaper batteries.

Shell says it will abandon oil exploration in Alaska Arctic

Update 4 a.m. Monday:

In midday trading in London, Shell’s share price was down 1.7 percent in a weak overall market, after the company announced that it will cease exploration in the Alaska Arctic.

But an industry analyst told The Wall Street Journal he believes Shell’s investors will generally be happy with the development.

“Investors don’t want Shell to deliver more [capital expenditures] into Alaska,”  Bernstein research analyst Oswald Clint told WSJ. “I imagine investors will be OK with a $1 billion hit versus tens of billions in the future.”

————

Sunday night story:

Royal Dutch Shell will cease exploration in Arctic waters off Alaska’s coast following disappointing results from an exploratory well backed by billions in investment and years of work.

The announcement was a huge blow to Shell, which was counting on offshore drilling in Alaska to help it drive future revenue. Environmentalists, however, had tried repeatedly to block the project and welcomed the news.

Shell has spent upward of $7 billion on Arctic offshore exploration, including $2.1 billion in 2008 for leases in the Chukchi Sea off Alaska’s northwest coast, where an exploratory well about 80 miles off shore drilled to 6,800 feet but yielded disappointing results. Backed by a 28-vessel flotilla, drillers found indications of oil and gas but not in sufficient quantities to warrant more exploration at the site.

“Shell continues to see important exploration potential in the basin, and the area is likely to ultimately be of strategic importance to Alaska and the U.S.,” Marvin Odum, president of Shell USA, said in The Hague, Netherlands. “However, this is a clearly disappointing exploration outcome for this part of the basin.”

Shell will end exploration off Alaska “for the foreseeable future,” the company said, because of the well results and because of the “challenging and unpredictable federal regulatory environment in offshore Alaska.”

The Burger J well drilled this summer will be plugged and abandoned, Shell spokeswoman Megan Baldino said. The two rigs mobilized by Shell for its Chukchi work, the Polar Pioneer and the Noble Discoverer, will be heading south, along with the support vessels, she said. “We’ll begin demobilizing now,” she said late Sunday. That process will take “as long as it takes to do it safely,” she said.

Baldino said no decisions have been made yet about Shell’s workforce, and she did not have figures on Sunday night for total employees and contractors. But the pullout will mean “lower staff” numbers, she said.

Shell’s decision to cease offshore Arctic exploration affects both the Chukchi and the Beaufort. Its leases in the Chukchi are scheduled to expire in 2020 and most of its Beaufort leases are scheduled to expire in 2017, according to a status report from the Bureau of Ocean Energy Management.

Margaret Williams of the World Wildlife Fund in Anchorage called the news of Shell’s withdrawal stunning.

“That’s incredible. That’s huge,” she said. “All along the conservation community has been pointing to the challenging and unpredictable environmental conditions. We always thought the risk was tremendously great.”

Environmental groups said oil exploration in the ecologically fragile Arctic could lead to increased greenhouse gases, crude oil spills and a disaster for polar bears, walrus and ice seals. Production rigs extracting oil would be subject to punishing storms, shifting ice and months of operating in the cold and dark. Over the summer, protesters in kayaks unsuccessfully tried to block Arctic-bound Shell vessels in Seattle and Portland, Oregon.

“Polar bears, Alaska’s Arctic and our climate just caught a huge break,” said Miyoko Sakashita, oceans program director for the Center for Biological Diversity, in a statement. “Here’s hoping Shell leaves the Arctic forever.”

Monday was Shell’s final day to drill this year in petroleum-bearing rock under its federal permit. Regulators required Shell to stop a month before sea ice is expected to re-form in the lease area.

The U.S. Geological Survey estimates U.S. Arctic waters in the Chukchi and Beaufort seas contain 23 billion barrels or more of recoverable oil in total. Shell officials had called the Chukchi basin “a potential game-changer,” a vast untapped reservoir that could add to America’s energy supply for 50 years.

Shell had planned at least one more year of exploration with up to six wells drilled.

A transition to production could have taken a decade or longer.

Shell had the strong backing of Alaska officials and business leaders who want a new source of crude oil filling the trans-Alaska pipeline, now running at less than one-quarter capacity.

Charles Ebinger, senior fellow for the Brookings Institution Energy Security and Climate Initiative, said in an interview that a successful well by Shell would have been “a terribly big deal,” opening an area that U.S. officials say contains 15 billion barrels of oil.

While oil prices have dropped significantly in recent years and nations have pushed for cleaner energy sources, analysts predict that the world between 2030 and 2040 will need another 10 million barrels a day to meet growing demand, especially in developing countries, Ebinger said.

“Areas like the Arctic are one of the areas that, if we’re going to be able to do this, we need to examine,” he said.

Shell in 2012 sent drill rigs to the Chukchi and Beaufort seas but was not allowed to drill into oil-bearing rock because the containment dome had been damaged in testing.

The company’s vessels suffered serious setbacks getting to and from the Arctic.

One drill vessel broke loose from its towline in the Gulf of Alaska and ran aground near Kodiak Island. Owners of the leased Noble Discoverer, which drilled in the Chukchi and is back this year, pleaded guilty to eight felony maritime safety counts and paid a $12.2 million fine.

That was proof of Shell’s Arctic incompetence, critics said.

Odum called drilling off Alaska’s coast the most scrutinized and analyzed oil and gas project in the world and said he was confident Shell could drill safely.

Alaska Dispatch News reporter Yereth Rosen contributed to this report.

How One Commitment Can Change Your Life

Welcome to The Academy. Full disclosure: I didn’t join Superhero Academy through the normal means. The instructor, Marc Angelo Coppola, held my interest for other reasons, but I’ll get into those later. Regardless, now that I’m in it, I wouldn’t have changed this part of my life for any other option. Ultimately, what got me …


Norway rewards Brazil with $1 billion for keeping the Amazon full of trees

Much of South America’s Amazon rainforest will continue to be clean, lush, and green, thanks in part to a country on the other side of the world.

In 2008, when the Amazon was facing a severe deforestation crisis, Norway, a country made rich from oil and gas production (and the biggest donor to protect tropical rainforests), pledged $1 billion to the government of Brazil if it could slow down the destruction. Doing so would protect the forest’s wildlife and also enormously reduce climate-harming greenhouse gas emissions, which are produced when forests are burned to make way for human development.

Brazil has more than risen to the task. By enforcing strict protection laws, promoting education efforts, and withholding loans to local counties that clear too much of the forest, the country has scaled back its forest destruction rate by 75%. It’s estimated that Brazilian farmers and ranchers have saved more than 33,000 square miles (roughly 85,500 square kilometers) of forest, equivalent to 14.3 million soccer fields, from being cut down.

This week, an applauding Norwegian government said it will pay out the country’s final $100 million-rounding out its $1 billion promise-to Brazil at a December UN summit on climate change. In a statement, UN Secretary-General Ban Ki-moon called the deal an “outstanding example” of international collaboration on sustainability.

Brazil’s blazing success in deforestation reduction is, indeed, a model for other countries-particularly the others that occupy the Amazon rainforest. Its Norway-subsidized efforts have translated into the largest emissions cut in the world, preventing roughly 3.2 billion tons of carbon dioxide emissions. That’s, by the way, how much America would save by taking all the cars off its roads for three years.

Veil Nebula Supernova Remnant

NASA’s Hubble Space Telescope has unveiled in stunning detail a small section of the expanding remains of a massive star that exploded about 8,000 years ago.

Called the Veil Nebula, the debris is one of the best-known supernova remnants, deriving its name from its delicate, draped filamentary structures. The entire nebula is 110 light-years across, covering six full moons on the sky as seen from Earth, and resides about 2,100 light-years away in the constellation Cygnus, the Swan.

This view is a mosaic of six Hubble pictures of a small area roughly two light-years across, covering only a tiny fraction of the nebula’s vast structure.

This close-up look unveils wisps of gas, which are all that remain of what was once a star 20 times more massive than our sun. The fast-moving blast wave from the ancient explosion is plowing into a wall of cool, denser interstellar gas, emitting light. The nebula lies along the edge of a large bubble of low-density gas that was blown into space by the dying star prior to its self-detonation.

Image Credit: NASA/ESA/Hubble Heritage Team


Sturdy Solidwool furniture is formed from wool and bio-resins

We’re mostly familiar with wool as a material for sweaters and socks (or perhaps for eco-friendly biobricks), but English designers Hannah and Justin Floyd of Solidwool are transforming this soft material by combining it with bio-resins, and turning locally sourced wool into sturdy, handcrafted furniture.


The Floyds’ intention was to revive their hometown of Buckfastleigh, which has traditionally been known for its production of wool. Their product, Solidwool, is meant to be an alternative to injection-moulded plastics and fibreglass, using wool as the reinforcing material and bio-resins as the binder. They use the fleeces of a particular local breed of sheep, the Herdwick, which were once widely used by the UK carpet industry. However, demand has fallen so far that this “wiry, dark and hard” wool has unfortunately been devalued, but the duo hope to change that by finding new ways to utilize and sustain the local sheep-based economy:

This wool is something special, but along the way, something has gone wrong and its perceived value has been lost. It is currently one of the lowest value wools in the UK. Once this wool was a major part of a shepherd’s income. Now the wool from one sheep sells for around 40p. But we see a beauty in this natural material and want to help see its value increase. The Herdwick flock and their shepherds are custodians of their wild landscape. We want to help them stay that way.

Working with bio-resins and wool in the past few years, the pair have come up with a composite material that they believe is a viable alternative to petrochemically based plastics found in a lot of mass-made furnishings. With a bio-based renewable content of about 30 percent, the bio-resins are diverted from the waste-streams of other manufacturing processes like wood pulping and bio-fuel production, meaning carbon emissions are halved compared to conventional resin production and no resources are diverted from agricultural crops.

Solidwool isn’t just for the chairs and tables seen in their Hembury collection; it could be used for other products, such as eyeglasses and knife handles. According to Design Milk, the Floyds are collaborating with other companies like Blok Knives, Artifact Uprising and Fan Optics to use Solidwool in their products.

We like how wool is used here in a surprising new way to create durable and eco-friendly furniture and accessories, beyond its conventional role in fashion. You can see more or shop around over at Solidwool.


First comprehensive Tree of Life illustrates relationships between 2.3 million species

In what’s being called the “first real attempt to connect the dots and put it all together,” this open-access project aims to link “all biodiversity through a shared evolutionary history.”

How did life on Earth go from simple single-celled organisms to the incredibly complex human body? A number of attempts have been made to build an evolutionary ‘tree of life’ that connects the organisms on the planet, but until now, there has been no single comprehensive tree of life assembled. However, thanks to a multi-year grant from the U.S. National Science Foundation, a collaborative effort from researchers at 11 institutions has produced an initial draft of this audacious project, which includes some 2.3 million species, called the Open Tree of Life.

The Open Tree of Life builds on the work of previous researchers, who have created some tens of thousands of smaller ‘trees’ for individual branches, and the result is a massive digital resource that aims to connect the threads of millions of species on Earth. The project is open-access and editable, which means that not only can anyone view the data, but anyone can also edit or add to it, somewhat like a Wikipedia for evolutionary relationships.

“Evolutionary trees, branching diagrams that often look like a cross between a candelabra and a subway map, aren’t just for figuring out whether aardvarks are more closely related to moles or manatees, or pinpointing a slime mold’s closest cousins. Understanding how the millions of species on Earth are related to one another helps scientists discover new drugs, increase crop and livestock yields, and trace the origins and spread of infectious diseases such as HIV, Ebola and influenza.” – Duke University

Led by principal investigator Karen Cranston of Duke University, the Open Tree of Life project is based on almost 500 previously published trees, and it is significant for a couple of reasons, not least its sheer size and scope: while many other evolutionary trees have been created, many of them have not previously been available in a digital format for download or analysis.

By making this data readily accessible and editable, it is hoped that this work will help researchers to “fill in the gaps” between what we know and what we don’t know, and to clarify and resolve conflicts in certain branches of phylogeny. It will also serve as a starting point for adding new species as they are discovered and named.

“Twenty five years ago people said this goal of huge trees was impossible. The Open Tree of Life is an important starting point that other investigators can now refine and improve for decades to come.” – Douglas Soltis of the University of Florida, co-author

The Open Tree of Life is free to browse and/or download at https://tree.opentreeoflife.org, and the source code is available on GitHub. An article on the project was recently published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS): Synthesis of phylogeny and taxonomy into a comprehensive tree of life. Six of the authors also took part in an AMA (Ask Me Anything) event on Reddit yesterday, fielding many questions about the project.
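For the programmatically inclined, here is a minimal sketch of how one might query the project from Python. The API host and endpoint paths are assumptions based on my reading of the Open Tree of Life’s public v3 web API documentation, so check the project’s GitHub pages for the current interface before relying on them.

```python
# Minimal sketch: resolve a species name against the Open Tree Taxonomy and
# fetch basic facts about the current synthetic tree. The host and paths
# below are assumptions based on the project's public v3 API docs; verify
# them against the Open Tree of Life GitHub documentation before use.

import json
import requests

API = "https://api.opentreeoflife.org/v3"

# Try to match a scientific name to an Open Tree Taxonomy (OTT) id
resp = requests.post(f"{API}/tnrs/match_names",
                     json={"names": ["Danaus plexippus"]})  # the monarch butterfly
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))

# General information about the current synthetic tree of life
about = requests.post(f"{API}/tree_of_life/about", json={})
about.raise_for_status()
print(json.dumps(about.json(), indent=2))
```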


The Associated Press wants reporters to stop calling people “climate deniers”

The Associated Press’ style guide is a kind of Bible for many in the media, and plenty of journalists will tell you they’ve become so indoctrinated in its rules that they write everything from emails to texts in AP style. So, when the AP makes changes to its recommendations about a highly covered topic like climate change, plenty of people will be talking about it.

Yesterday, the AP made a change to how it recommends its reporters and editors describe the people we’ve been calling “climate change deniers” or “climate change skeptics.” In fact, the AP doesn’t recommend either of those terms, but instead favors either “climate change doubters” or “those who reject mainstream climate science.”

It turns out that the phrase “climate change skeptics” was basically offensive to scientists who consider themselves skeptics. The AP explains:

“Scientists who consider themselves real skeptics – who debunk mysticism, ESP and other pseudoscience, such as those who are part of the Committee for Skeptical Inquiry – complain that non-scientists who reject mainstream climate science have usurped the phrase skeptic.”

But, it turns out that those same skeptical scientists, as well as other groups that work on the issue of climate change, aren’t that enthusiastic about the AP’s recommendation to use “climate change doubter” instead. 350.org’s spokesperson Karthik Ganapathy told the Huffington Post that “doubt seems to imply a lack of clarity – and there is a lack of clarity on some things, like what the ideal solution to climate change is, but there’s zero lack of clarity on whether or not it’s happening.”

The Committee for Skeptical Inquiry, which lobbied against “climate change skeptic,” likewise doesn’t like the term “doubter” but endorses “those who reject mainstream climate science.” The latter phrase is clear, but sadly not concise.

Meanwhile, the AP also doesn’t recommend “climate change denier” because it has “the pejorative ring of Holocaust denier.” This recommended change has been met with more resistance.

One can make the case that “denier” has the right corrective sting to deal with people confronted with an overwhelming body of evidence that they are wrong. So often, we associate “denying” with opposing the truth, as someone who is in denial is someone who fails to see reality.

So, how are other publications handling the AP announcement? Grist says they’ll stick with the term “deniers.” Erik Wemple at the Washington Post finds the “argument that the term ‘denier’ can’t be paired with another term without tinging it with Holocaust implications” to be “specious,” and thinks it seems like “a dicey precedent.”

Of course, the debate also offers a delicious opportunity to propose other alternatives. Personally, I could go for “unhelpful climate womp womps” or “lying hypocrites who make money from the fossil fuel industry.” Feel free to add yours in the comments!


The Divestment Movement Has Grown 50-Fold In Just One Year


The divestment movement is really gaining steam – non-coal, non-fossil-fuel powered steam.

Investors representing $2.6 trillion in assets have pledged to cut fossil fuels from their portfolios, a fifty-fold increase from last year. At least 436 institutions have pledged to stop investing in fossil fuels – for moral or financial reasons. Large pension funds and private companies make up 95 percent of the assets, according to analysis released Tuesday by Arabella Advisors.

“If these numbers tell us anything, it’s that the divestment movement is catching fire,” said May Boeve, executive director of campaigners 350.org.

Actor Leonardo DiCaprio, who established a fund for conservation projects in 1998, also announced that he would join the movement by divesting his assets and those of the Leonardo DiCaprio Foundation.

“Mainstream financial views of fossil fuels will never be the same,” Ellen Dorsey, executive director of the Wallace Global Fund, said at a press conference Tuesday. “It is increasingly clear that it is neither OK nor smart to be invested in fossil fuels.”

The divestment movement has two primary components: The idea that owning fossil fuel investments is tantamount to funding climate change, and the idea that the fossil fuel industry itself is poised to lose value over the long term.

“The movement has exposed the embedded vulnerabilities in the fossil fuel industries, from carbon reserves that can never be burned to wasting of company funds on continued exploration for new fossil fuels that can never be used,” Dorsey said. “You are increasingly risking the value of your portfolio if you stay invested in fossil fuels.”

Another analysis found that Massachusetts’ pension plan lost half a billion dollars in the last fiscal year through its fossil fuel investments, ThinkProgress reported Monday.

Perhaps unsurprisingly, as the divestment movement’s reach expands, its geographic footprint does, too. Since last year, the share of divesting institutions based outside the United States has increased from 20 to 34 percent, according to Tuesday’s report. Likewise, the number of universities has expanded from 14 to 40, now representing $130 billion in assets.

The diversity of organizations – from religious institutions to universities to giant public pension funds – suggests that divestment is trumping public or corporate pressures.

“When an organization divests, there’s an acknowledgment of the seriousness of climate change and an acknowledgment that some of these [fossil fuel] companies bear some of the responsibility and could be viewed as part of the problem,” Will Lana, a partner at Trillium Asset Management, told ThinkProgress. “It takes a lot of courage for an institution to recognize that, even if it’s clearly the case,” he added.

That’s a far cry from divesting. But the smaller move appears to have been spurred by Pope Francis’ visit to the United States this week. The pope has been outspoken in the need to act on climate change, which has alienated some U.S. Catholics. Georgetown University, another Catholic institution, has already voted to divest from coal.

In addition to DiCaprio, 2,039 other individual investors have pledged to withdraw from fossil fuel investments. And it is becoming easier for people concerned about climate change to track their money. A website launched last week, Fossil Free Funds, allows people to check their mutual funds and retirement plans for fossil fuel investments.

The Axiom House is a flatpack prefab net-zero “game-changing concept home”

Being an architect can be so frustrating: the established ways are so entrenched and residential building technology so primitive that it’s all still done by hand in the field, where the biggest innovation in thirty years was the nail gun replacing the hammer. Meanwhile, in architects’ offices there are tools that we never dreamed of thirty years ago: computers instead of drafting boards, 3D renderings that spring out of our drawings like magic, and perhaps most importantly of all, the Internet, which changes how architects can market what they do.

That’s why the Axiom House, being developed in Kansas City by Acre Designs, is so interesting. Jennifer Dickson is an architect; Andrew Dickson is an industrial designer; together they are trying to turn a house into an industrial product that can be delivered anywhere in a shipping container, for a price that is competitive with conventional construction. They are not thinking like designers, but like a tech startup:

Acre is the very definition of a technology company. We apply scientific knowledge from the fields of architecture, engineering, environmental design, and material and construction science in the most practical way imaginable. We’ve used these practices to create homes that take half the time to build, use a fraction of the resources, and have as little as half the lifetime cost of traditional homes.

The house itself is a flexible 1800 square feet, designed to adapt to its occupants’ life cycles. It’s described as Net Zero energy, producing as much energy as it consumes; it achieves this by being built to near passive house standards so that very little energy is required to operate it in the first place. As they note,

Our homes are 90% more efficient than standard construction to begin with, and we make up the difference with a small solar panel (PV) array. We start with an efficient floor plan and a tight building envelope to prevent air from getting in or out. We use high-efficiency doors, windows, and appliances, take advantage of natural (passive) heating from the sun, and utilize unique heating and cooling solutions.


The heating and cooling system is indeed unique; I had to ask for an explanation. In the early days of Passive Houses, many had what are called Earth Tubes, or big pipes buried in the ground that were used as ducts to pre-cool or pre-warm air to the ground temperature, which is about 55°F in Kansas City. But earth tubes proved hugely problematic, delivering condensation, mold, radon and other wonderful things as well as air. Instead, the Axiom house has what they call Passive Geothermal (PGX), a riff on what others have called brine loops or glycol ground loops. There is a grid of pipes buried in the ground that delivers water at near 55°F to a heat exchanger built into the Heat Recovery Ventilator (HRV) that is required in a house that is so tight. So one gets all the benefits of an earth tube, preheating or precooling the air, without the problems and at a much lower cost than a fancy ground source heat pump. They appear to work well; in an article by Martin Holladay in Green Building Advisor, a passive house builder called them “amazingly effective.” However, Martin, always the skeptic, writes:

Of course, just because a ground loop works, doesn’t mean the system is cost-effective. Many energy experts have speculated that the pump needed to circulate the glycol solution uses almost as much energy as the system collects. The results of one monitoring study indicate that these experts may be right; data gathered in Vermont suggest that the simple payback period for this type of system may be as much as 4,400 years.
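To make that skepticism concrete, here is a toy simple-payback calculation in Python. Every figure in it is a made-up assumption for illustration, not data from Acre or from the Vermont monitoring study Holladay cites.

```python
# Toy simple-payback arithmetic for a glycol ground loop. Every number is a
# made-up assumption for illustration, not data from Acre or the study.

loop_cost_usd = 3000.0           # assumed installed cost of the ground loop
heat_collected_kwh = 1200.0      # assumed useful pre-heating/pre-cooling per year
pump_electricity_kwh = 1000.0    # assumed circulation-pump consumption per year
electricity_price_usd_per_kwh = 0.15  # assumed retail electricity price

net_savings_kwh = heat_collected_kwh - pump_electricity_kwh
annual_savings_usd = net_savings_kwh * electricity_price_usd_per_kwh

if annual_savings_usd <= 0:
    print("The pump uses as much energy as the loop collects: no payback, ever.")
else:
    print(f"Simple payback: about {loop_cost_usd / annual_savings_usd:.0f} years")
```

With those invented numbers the payback already runs to about a century; let the pump’s consumption creep up toward the energy collected, as the Vermont data apparently suggested, and the payback balloons toward the absurd.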


They are also delivering the tempered water to the radiant floor, and in addition to this system, the house has a mini-split air source heat pump. Given the near-passive-house amount of insulation, tight construction and careful siting, I suspect it won’t get a lot of use.


The structure is a flatpack of SIPs, or structural insulated panels. These are sandwich panels of plywood or OSB and expanded polystyrene insulation, 10″ thick for the roof and 8″ for the walls. They claim that it can be built at prices competitive with other houses in Kansas City, which now run at $110 to $135 per square foot. How do they do it?

It’s not any one thing, but a combination of strategies, that allows us to achieve this. A few examples:

By offering fixed plans, we can build hours of engineering and design into the base cost of the home. Just like with your car or phone, focused product development helps us deliver a refined, high-performance home that can be repeated again and again. Starting with a right-sized, efficient floor plan has a domino effect: reducing up-front costs, energy demands and system sizes throughout the house. With a lighter load, we can eliminate ductwork, wiring, plumbing runs, and the expensive labor associated with these. With streamlined, repeatable construction, we shave months of labor costs out of each job.


Jennifer Dickson tells Metropolis:

We see no reason why architect-designed, highly efficient housing should not be attainable at a reasonable price point. To do that, we are treating this more like a car than a house. With cars, the design effort goes in at the front end, and at the purchase end, the customers do not get a custom product, but they get access to high-end finishes and their choice of features. We think we can leverage buying power by providing a set of well-designed packages.

Having used these same arguments for a decade when I was working in prefab, I am a bit skeptical that they can do that. I found again and again that designs are rarely repeatable, everyone wants to customize, and that customers don’t care about right-sizing, they care about price per square foot. And it’s just so hard to compete with conventional construction, the guy in a pickup with a magnetic sign and a nail gun.

But it is so exciting to see architects and designers trying to innovate in the design of homes and the way that their services and the product are delivered. I am really rooting for them and hope it works. Read more on the website; like any startup looking for money, attention and validation, they are crowdfunding on Indiegogo.


Too bad NASA’s plan for space-based solar never happened

It’s always irksome when tech companies talk about their latest “moonshot.” The actual moonshot was one of the most incredible accomplishments of humankind. In 1961, President John F. Kennedy challenged NASA to put someone on the moon by the end of the decade, and NASA, which hadn’t even put someone in orbit yet, was like, “On it, boss,” and then sent three astronauts to the moon eight years later. So sorry, Google, even if Google Glass hadn’t flopped, it wouldn’t have been a moonshot, and neither will anything else that comes out of the “moonshot factory.”

So it’s a real bummer to find out that the agency that today’s most powerful engineers and entrepreneurs so desperately want to emulate had a mind-blowingly awesome plan for a space-based solar factory back in the 70s that never came to fruition. Here’s the scoop from Motherboard:

At the height of the oil crisis in the 1970s, the US government considered building a network of 60 orbiting solar power stations that would beam energy down to Earth. Each geosynchronous satellite, according to this 1981 NASA memo, was to weigh around 35,000 to 50,000 metric tons. The Satellite Power System (SPS) project envisaged building two satellites a year for 30 years.

To get said power stations into orbit, the once-powerful aerospace manufacturing company Rockwell International designed something called a Star-Raker, which, in addition to sounding like something from a sci-fi movie, also would have acted like one:

… The proposed Star-Raker would load its cargo at a regular airport, fly to a spaceport near the equator, fuel up on liquid oxygen and hydrogen, and take off horizontally using its ten supersonic ramjet engines. A 1979 technical paper lays out its potential flight plan: At a cruising altitude of 45,000 feet, the craft would then dive to 37,000 feet to break the sound barrier. At speeds of up to Mach 6, the Star-Raker would jet to an altitude of 29km before the rockets kicked in, propelling it into orbit.

Just to recap: The Star-Raker would have broken the speed of sound by diving 8,000 feet. And the spacecraft would have been making so many regular trips to orbit that it would have essentially been a 747 for space, Motherboard reports.

In terms of feasibility, here’s how one scientist put it at the time:

“The SPS is an attractive, challenging, worthy project, which the aerospace community is well prepared and able to address,” physicist Robert G. Jahn wrote in the foreword to a 1980 SPS feasibility report. “The mature confidence and authority of…[the working groups]…left the clear impression that if some persuasive constellation of purposes…should assign this particular energy strategy a high priority, it could be accomplished.”

Putting solar plants in space would’ve been hard, sure, but this proposal came just ten years after NASA landed Apollo 11 on the moon, so doing seemingly impossible things was kind of their thing. Even if SPS hadn’t happened as planned (and for more details on what exactly that plan was, check out this in-depth look from Wired), there’s no doubt that with the right amount of support and funding, NASA could’ve done something incredible in the clean tech arena.

Today, NASA remains an indispensable source of climate change research. Unfortunately, politicians aren’t as eager to throw money at the agency now that we’re no longer trying to show up the Soviet Union (in fact, the U.S. government is now relying on Russia to take U.S. astronauts up to the International Space Station). And some members of Congress (lookin’ at you, Ted Cruz) have it in their heads that NASA shouldn’t even be doing Earth sciences research in the first place.

We know from the landing of the Curiosity Rover on Mars back in 2012 that NASA still has the ability to inspire and astonish. People geeked out hard over those “seven minutes of terror” and for good reason. Getting that same kind of support behind something that addresses climate change would be exactly what this world needs. If only the one organization proven capable of doing moonshots wasn’t beholden to a bunch of science-hating idiots.

Save the bees with seed bombs

Seed bombs began as a fun and friendly tactic for greening abandoned lots in urban spaces. “Guerrilla gardeners” throw balls of seeds and fertilizer into fenced-off spaces that are otherwise neglected, such as brownfields or land in zoning limbo.

Now, a California company is using seed bombs as a strategy to fight the disappearance of bees. Ei Ei Khin and Chris Burley started Seedles with the aim of spreading bee-friendly wildflowers in neighborhoods around the country. Their goal is to grow 1 billion wildflowers with the help of colorful seed balls, a project they call “Grow the Rainbow.”

Bee populations have been dropping for about a decade. Scientists think there are a number of contributing factors to colony collapse, including the proliferation of certain pesticides, parasites, and even stress. But a decline in natural habitat, along with the loss of bees’ preferred wildflowers, is also a big factor. That’s how Seedles hopes to help, by encouraging people to plant more flowers.


Seedles creates seed balls with wildflowers native to six different regions of the United States. For example, the Midwest mix may include wild perennial lupine, lemon mint and butterfly weed. The seeds are rolled up with organic compost for fertilizer and non-toxic color powders to add a bit of fun. The balls can be tossed anywhere you want flowers to grow, and with the help of some rain and sun they will start to sprout.

For Khin and Burley, helping the bees is part of building a more sustainable food system, which is dependent on pollinators for many foods. Burley told Bay Area Bites that the company is partnering with like-minded local food companies, to give away seed balls and raise awareness about the connection between bees and food.

A pack of 20 seedballs sells for $13.00 on the Seedles website. Or if you’re feeling crafty, check out this DIY tutorial on Gardenista.

Naturally occurring ‘GM’ butterflies have wasp genes

Things are getting freaky in the critter world. Researchers from Spain and France have discovered genes from parasitic wasps present in the genomes of many butterflies. The results of their study reveal that even the iconic monarch contains naturally produced GMOs.

Say what?!

It all seems to have started with the particular habits of parasitic braconid wasps. These guys (well, females actually) lay their eggs inside caterpillars and inject a “giant virus” named bracovirus to trip up the caterpillars’ immune response. Proving once and for all that truth really is stranger than fiction, this nifty trick allows the virus to integrate into the DNA of the caterpillars and control caterpillar development, allowing the wasp larvae overlords to colonize their host.

The bracovirus genes were found in the genomes of several species of butterfly and moth in addition to monarchs, including silkworms and pests such as the Fall Armyworm (Spodoptera frugiperda) and the Beet Armyworm (Spodoptera exigua).

And the genes found within are not just remnants; it appears that they in fact play a protective role against other viruses known as baculoviruses. Remarkably, the genes weren’t exclusive to the wasp virus; some of them originated from the wasp itself. In the armyworm species of moths, the researchers found genes that are closely related to genes from hymenoptera, including the honey bee.

Proponents of producing GM insects might latch on to this as an argument in favor of their work – that GM insects already exist in nature, so it’s a natural thing to do. But in showing fluidity of genes between species, the study really provides more ammunition for those opposed. For example, if insecticide resistance genes were to be artificially introduced into wasp species for biological control of other pests, it could lead to accidental transmission of this resistance to the target pests. And then what? I know, let’s not try it and see what happens.

