Monday, December 9, 2013

Reverse engineering for fuel

Making carbon dioxide by burning hydrocarbons is easy. A pair of novel catalysts recently made by researchers at the University of Illinois at Chicago could make it far more practical to do the reverse, converting carbon dioxide and water into fuel.

Because running this reaction normally requires large amounts of energy, it has been economical only in rare cases. But if the process could be done commercially, liquid fuels could be made from the exhaust gases of fossil-fuel power plants.


The new work, described this week in the journal Nature Communications, improves on a pair of catalysts discovered last year that more efficiently turn carbon dioxide into carbon monoxide, which can then be made into gasoline and other products. Those catalysts produce carbon monoxide slowly, however, and one is made of silver, so it’s expensive.

But the Illinois researchers have demonstrated that it’s possible to replace the silver with relatively inexpensive carbon fibers while maintaining about the same efficiency. And the technique produces carbon monoxide about 10 times faster. It may be possible to incorporate the catalysts into an “artificial leaf.” Right now, if the process were to run on sunlight, it would require at least two pieces of equipment: a solar panel to generate electricity, and then a reactor to form the carbon monoxide. A leaf-inspired system would absorb energy from the sun and use it to drive the chemical reactions directly, rather than making electricity first!

Monday, December 2, 2013

The piggy bank is never full!

The world is losing its forests at a rate of 13 million hectares (32 million acres) a year, contributing one-third of the world's atmospheric carbon dioxide emissions. The share is higher in developing countries, where forests are being razed to make way for agriculture, a pressure that is inevitable as demand for crops keeps increasing. The challenge has been to get the developed world to compensate in ways that could limit the deforestation.
REDD+ finance, the money needed to set up and implement a system that pays countries to leave forests standing, has followed a long road since the 2007 U.N. Framework Convention on Climate Change meeting in Bali, Indonesia, where nations pledged to take meaningful action to reduce emissions from deforestation. A 2008 study found it would cost between $17.2 billion and $28 billion per year to cut the global rate of deforestation in half.
According to a recent policy brief from the Overseas Development Institute, $2.72 billion has been pledged for REDD+ since 2007 through five multilateral funds and two bilateral funds, more than half of it to Indonesia and Brazil. About one-tenth of the pledges have been disbursed to projects on the ground. But it has just not been substantial enough.

At the Warsaw meet, the U.S. State Department pledged $25 million last week as part of a major new $280 million funding initiative aimed at slowing deforestation and stemming its effect on world carbon emissions. The United States joined Norway, the United Kingdom and the World Bank in launching the "BioCarbon Fund Initiative for Sustainable Forest Landscapes." The fund will provide incentives to developing countries that are taking steps to limit the chopping and razing of trees under the United Nations' Reducing Emissions from Deforestation and Forest Degradation program, or REDD+. The United States will be the new fund's smallest national donor, compared with Norway's $135 million and the United Kingdom's $120 million. The fund will be administered by the World Bank's BioCarbon Fund, a public-private initiative aimed at finding ways to sequester carbon.
But some observers expressed disappointment that the U.S. and its partners didn't put forward a more substantial sum.
Since the 2009 climate conference in Copenhagen, Denmark, there haven't been any substantial pledges to fund REDD+ past 2012. Although Norway has said it will fund REDD+ through 2020, a concrete commitment has been absent.
REDD+ negotiators expect diplomats in Warsaw this week to approve text on five scientific and technical decisions that will lay the groundwork for finance. These include: human rights and environmental safeguards; the definitions of drivers of deforestation; ways for measuring countries' reference levels, or the base line upon which to measure forest loss; monitoring, reporting and verification of emissions reduction; and the creation of a national forest monitoring system.

Besides the BioCarbon Fund, the World Bank houses two other major coffers for REDD+, the Forest Carbon Partnership Facility and the Forest Investment Fund. The FCPF is divided into two funds, one to help countries get ready to implement a REDD+ program and another to pay for verified emissions reductions.

Forests constitute what is known as the planet's lungs, and when they disappear in chunks, it is the health of the whole biosphere that suffers. Not limited to the area or nation that applies the axe, deforestation affects everyone. That is why it is imperative that initiatives go beyond rhetoric and mere symbolism.

Making energy truly accessible

A creative way of selling solar energy is gaining traction in sub-Saharan Africa: customers can pay as they go.
Only one in six rural inhabitants in sub-Saharan Africa has access to electricity. For households living off the grid, kerosene lamps are the primary lighting source. The World Bank estimates that breathing kerosene fumes is the equivalent of smoking two packs of cigarettes a day, and two-thirds of adult females with lung cancer in developing nations are nonsmokers.
Some observers rightly point out that the poorest people in the world are not just paying a bit more for their energy; they're paying a disproportionate amount! Across the U.S. and U.K., electricity from a utility costs between 10 and 15 cents per kilowatt-hour (kWh). A villager in rural Kenya or Rwanda, however, pays an equivalent cost of $8 per kWh for kerosene lighting. Often 30 percent or more of a family's income is spent on kerosene. Charging a mobile phone is even more expensive: that same villager would pay nearly 400 times more to charge a mobile phone in rural Kenya than in the U.S.
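The scale of that disparity is easy to check with a back-of-envelope calculation using the figures above; the lamp wattage and daily usage hours below are illustrative assumptions, not data from any study.

```python
# Annual lighting cost at the energy prices quoted above.
# The 5 W lamp and 4 hours/day of use are assumed figures.

GRID_COST_PER_KWH = 0.125     # midpoint of the 10-15 US cent range
KEROSENE_COST_PER_KWH = 8.0   # equivalent cost quoted for rural Kenya/Rwanda

def annual_lighting_cost(cost_per_kwh, watts=5, hours_per_day=4):
    """Cost in dollars of running a small light for one year."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return cost_per_kwh * kwh_per_year

grid = annual_lighting_cost(GRID_COST_PER_KWH)
kerosene = annual_lighting_cost(KEROSENE_COST_PER_KWH)
print(f"grid: ${grid:.2f}/yr, kerosene: ${kerosene:.2f}/yr, "
      f"ratio: {kerosene / grid:.0f}x")
```

Under these assumptions the villager pays about 64 times more per year for the same small amount of light.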

Solar-powered charger kits are a promising alternative, but many rural families cannot afford the up-front cost of these systems, which start at $50.
With a Pay-As-You-Go model (PAYG) for solar kits, on the other hand, customers can instead pay an up-front fee of around $10 for a solar charger kit that includes a two- to five-watt solar panel and a control unit that powers LED lights and charges devices like mobile phones. Then they pay for energy when they need it—frequently in advance each week—or when they can (say, after a successful harvest). In practice, kits are paid off after about 18 months and subsequent electricity is free to the new owner. PAYG customers are finding that instead of paying $2 to $3 a week for kerosene, they pay less than half that for solar energy.
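A rough sketch of the arithmetic behind that 18-month payoff: the $10 deposit and the payoff period come from the figures above, while the $1.25 weekly payment is an assumption (less than half of the quoted $2-3 weekly kerosene spend). The implied financed price comes out well above the $50 hardware cost, which hints at why working capital is such a strain for PAYG companies.

```python
# Hypothetical PAYG payment schedule. The weekly payment is an assumed
# figure; the deposit and 18-month payoff come from the article.

DEPOSIT = 10.0        # up-front fee the customer pays
WEEKLY_PAYMENT = 1.25 # assumed: less than half the $2-3 kerosene spend
PAYOFF_MONTHS = 18
WEEKS_PER_MONTH = 52 / 12

weekly_payments = PAYOFF_MONTHS * WEEKS_PER_MONTH   # ~78 payments
implied_total = DEPOSIT + WEEKLY_PAYMENT * weekly_payments
print(f"about {weekly_payments:.0f} weekly payments, "
      f"implied financed price about ${implied_total:.0f}")
```

The gap between the $50 sticker price and the roughly $108 financed total is the margin and financing cost the company must carry until the kit is paid off.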
A US company has integrated an analog modem into their solar charger that “talks” with the customer’s mobile phone to authenticate a transaction.  Companies report that the PAYG business model replicates well from country to country. They reach rural communities by working with local distribution partners within each country, who also make money from each solar kit sale.

Yet challenges remain. Many PAYG start-ups are running into working-capital limits: companies front the initial cost of these solar kits and are not fully reimbursed for 18 months. The resulting cash-flow constraints intensify when customers default.

But for now, the advantages outweigh the drawbacks, in terms of energy accessibility and reduced pollution from burning kerosene and firewood!

Monday, November 25, 2013

Upping sea level rise

Sea-level rise in this century is likely to be 70-120 centimeters if greenhouse-gas emissions are not mitigated, a broad survey of the scientists publishing most actively on the topic has revealed. The 90 experts participating in the survey anticipate a median sea-level rise of 200-300 centimeters by the year 2300 for a scenario with unmitigated emissions.

In contrast, for a scenario with strong emissions reductions, experts expect a sea-level rise of 40-60 centimeters by 2100 and 60-100 centimeters by 2300. The survey was conducted by a team of scientists from the USA and Germany.

"While the results for the scenario with climate mitigation suggest a good chance of limiting future sea-level rise to one meter, the high emissions scenario would threaten the survival of some coastal cities and low-lying islands," says Stefan Rahmstorf from the Potsdam Institute for Climate Impact Research. "From a risk management perspective, projections of future sea-level rise are of major importance for coastal planning, and for weighing options of different levels of ambition in reducing greenhouse-gas emissions."

Projecting sea-level rise, however, comes with large uncertainties, since the physical processes causing the rise are complex. They include the expansion of ocean water as it warms, the melting of mountain glaciers and ice caps and of the two large ice sheets in Greenland and Antarctica, and the pumping of ground water for irrigation purposes. Different modeling approaches yield widely differing answers. The recently published IPCC report had to revise its projections upwards by about 60 percent compared to the previous report published in 2007.

Wednesday, November 20, 2013

Can individual action save the day?

At a major United Nations climate summit in Warsaw this week, a plan is being hammered out (in the 19th annual effort) for negotiations on a new climate treaty to be finalized in Paris in two years’ time. Delegates from 195 nations are also seeking to obtain commitments from countries to limit their greenhouse-gas emissions between now and 2020.

The moot question: will it make any difference? The path ahead is rife with disputes between rich and poor countries over funding, and over how to allocate and enforce emissions reductions. The conference aims to outline the schedule and set parameters for negotiations ahead of the next major climate summit in Paris in 2015, when countries hope to forge a treaty to follow the 2009 agreement settled on in Copenhagen. At Copenhagen, negotiations over a formal treaty broke down, but eventually resulted in a set of non-binding pledges — the Copenhagen Accord — for emissions reductions until 2020. The accord also blurred the distinction between developed countries, which were bound by the 1997 Kyoto Protocol to reduce emissions, and developing countries, which were not.

The Warsaw talks are split into two main tracks. One focuses on the architecture of a new global climate treaty that would take effect after 2020, when the current Copenhagen commitments expire. The second examines what can be done to strengthen commitments between now and 2020 to increase the chance of limiting global warming to a target of 2 °C above pre-industrial temperatures.

Indigenous leaders from across North America met half a world away and offered a prophecy: The solution will never come via the UN talks. Tribal elders from the United States, Greenland and Mexico spoke of the need for individual action rather than government edicts, and of the difficulty – and urgency – of replacing economic questions with moral ones.

A return to the "old values" – respect, concern for the future, and sharing – alone can help the world, they believe. But as one elder pointed out, it is a colossal task to get people to change. "How do you instruct 7 billion people as to their relationship to the Earth?" he asked. "It's very difficult – when you're struggling to protect your people and you're hanging by a thread – to instruct other people."

Friday, November 15, 2013

Solar bonanza

A new solar cell material has properties that might lead to solar cells more than twice as efficient as the best on the market today. The material, a modified form of a class of compounds called perovskites, promises to be a strong candidate, though it has not yet been used in an experimental cell.

Researchers are making new perovskites using combinations of elements and molecules not seen in nature; many researchers see the materials as the next great hope for making solar power cheap enough to compete with fossil fuels. Perovskite-based solar cells have been improving at a remarkable pace. It took a decade or more for the major solar cell materials used today—silicon and cadmium telluride—to reach efficiency levels that have been demonstrated with perovskites in just four years.

The perovskite material described in the latest issue of Nature has properties that could lead to solar cells that convert over half of the energy in sunlight directly into electricity, according to the Center for Energy Innovation at the University of Pennsylvania.

That’s more than twice as efficient as conventional solar cells. Such high efficiency would cut in half the number of solar cells needed to produce a given amount of power. Besides reducing the cost of solar panels, this would greatly reduce installation costs, which now account for most of the cost of a new solar installation.
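A minimal sizing sketch shows how efficiency translates into hardware; the 5 kW target, the 25% conventional baseline, and the standard 1,000 W/m² test irradiance are illustrative assumptions, not figures from the Nature paper.

```python
# Collector area needed for a fixed power target at two cell efficiencies.
# 1000 W/m² is the standard test irradiance; the targets are illustrative.

IRRADIANCE = 1000.0  # W/m²

def panel_area_m2(target_watts, efficiency):
    """Area needed to produce target_watts at peak irradiance."""
    return target_watts / (IRRADIANCE * efficiency)

conventional = panel_area_m2(5000, 0.25)  # ~25% best conventional cells
perovskite = panel_area_m2(5000, 0.50)    # >50% hoped for the new material
print(f"25% cells: {conventional:.0f} m2, 50% cells: {perovskite:.0f} m2")
```

Halving the area halves the number of panels to mount and wire, which is where the installation savings would come from.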

Unlike conventional solar cell materials, the new material doesn’t require an electric field to produce an electrical current. This reduces the amount of material needed and produces higher voltages, which can help increase power output. While other materials have been shown to produce current without the aid of an electric field, the new material is the first to also respond well to visible light, making it relevant for solar cells.


The researchers also showed that it is relatively easy to modify the material so that it efficiently converts different wavelengths of light into electricity. It could be possible to form a solar cell with different layers, each designed for a specific part of the solar spectrum, something that could greatly improve efficiency compared to conventional solar cells.

Friday, November 8, 2013

Aluminum to the rescue

When it comes to fuel cells, the challenge of storing the hydrogen has been vexing. Lightweight interstitial hydrides -- compounds in which hydrogen atoms occupy the interstices (spaces) between metal atoms -- have now been proposed as a safe and efficient means for storing hydrogen for fuel cell vehicles. And fuel cells are what many see as the future.

Hydrides using magnesium, sodium and boron have been manufactured, but so far, none have proven practical as a hydrogen repository. An aluminum-based alloy hydride offers a more viable candidate because it has the desired traits of light weight, no toxicity to plants and animals, and absence of volatile gas products except for hydrogen.

Until now, however, only complex aluminum hydrides -- unsuitable for use as a hydrogen storage system -- have been created. In a recent paper in the AIP Publishing journal APL Materials, a joint research group with members from the Japan Atomic Energy Agency (Hyogo, Japan) and Tohoku University (Sendai, Japan) announced that it had achieved the long-sought goal of a simple-structured, aluminum-based interstitial alloy.

“Although its synthesis requires very extreme conditions and its hydrogen content is low, our new compound showed that an aluminum-based alloy hydride is achievable," said Hiroyuki Saitoh, lead author of the APL Materials paper. "Based on what we've learned from this first step, we plan to synthesize similar materials at more moderate conditions -- products that hopefully will prove to be very effective at storing hydrogen."

Wednesday, November 6, 2013

Glaring truth!

Energy consumption continues to grow. The costs of generating and transmitting energy must come down for the increased consumption to be sustainable. Energy must be generated without depleting resources, without causing pollution, and without incurring waste. Transmission, too, must be efficient. A big challenge. But experts insist there is a ready solution: onsite generation of electricity using the photovoltaic (PV) method of converting solar energy directly into electrical energy.

Nothing new in that, but more and more scientists are focusing on the advantages of solar PV instead of disadvantages like intermittency and storage. For instance, silicon is the second most abundant element in the earth’s crust. Then consider the power saved. Local DC power grids can avoid the power lost in transmission and in the unnecessary conversion from DC to alternating current (AC) and back to DC. Most electronic appliances and electric loads operate on DC; when power is transmitted as AC and converted to DC, about 30% of the total power generated is lost.
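A minimal sketch of that loss argument; the individual stage efficiencies below are assumptions chosen to roughly match the 30% figure, not measured values.

```python
# Power delivered to a DC load through a chain of conversion stages,
# comparing AC transmission with a local DC microgrid. Stage
# efficiencies are illustrative assumptions.

def delivered(power_watts, efficiencies):
    """Power remaining after each conversion/transmission stage."""
    for eff in efficiencies:
        power_watts *= eff
    return power_watts

via_ac = delivered(100.0, [0.90, 0.92, 0.85])  # DC->AC, lines, AC->DC
local_dc = delivered(100.0, [0.95])            # single local DC-DC stage
print(f"via AC grid: {via_ac:.1f} W, local DC: {local_dc:.1f} W")
```

The point is structural: every stage multiplies in another loss, so cutting out the AC round trip saves most of the difference.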

The use of thin films of semiconductors such as cadmium telluride, amorphous silicon and copper indium gallium selenide is yet to make a major commercial impact. PV modules comprising organic and dye-sensitized solar cells are unlikely to play a role in bulk power generation without fundamental breakthroughs in material synthesis and performance.

Researchers at the University of Pennsylvania have proposed a new multi-terminal multi-junction architecture for inexpensive PV electricity generation, with efficiency expected to exceed the currently feasible 25%. The proposed architecture is based on currently commercial crystalline solar cells combined with thin-film solar cells made of materials (such as copper oxide) that are abundant in Earth's crust. However, the additional manufacturing costs this would incur remain unknown, according to the researchers.


Empa scientists have developed a new technique for manufacturing high-efficiency, flexible, thin film solar cells from CIGS (copper indium gallium di-selenide) semiconductors. This has enabled them to achieve an efficiency of 20.4% for the conversion of sunlight into electrical energy. As the solar cells are deposited onto plastic foils, they could be produced on an industrial scale using cost-effective roll-to-roll manufacturing. The researchers are presenting a new manufacturing technique for CIGS solar cells, in which tiny quantities of sodium and potassium are incorporated into the CIGS layer.

All the research points to the sun as the future source of energy. More reason why we should be thinking of local micro grids rather than centrally generated power with potential for huge losses in transmission!

Revenue from pricing carbon emissions can exceed loss for plant owners

Stabilizing global warming at around 2 degrees Celsius by cutting greenhouse-gas emissions from fossil fuels would mean leaving much of the world's coal, gas and oil unused underground. Yet pricing global CO2 emissions could generate revenue of 32 trillion US dollars over the 21st century, far exceeding the 12 trillion US dollar reduction in fossil fuel owners' profits, according to a study now published by scientists of the Potsdam Institute for Climate Impact Research.

"Implementing ambitious climate targets would certainly scale down fossil fuel consumption, so with reduced demand their prices would drop," says Nico Bauer, lead-author of the study. "The resulting profit loss would be overcompensated by revenues from auctioning emissions permits or taxing CO2, which are two of the possible instruments of climate policy."

The distribution of revenues from emissions pricing depends on how climate policies are implemented on a national and international level. "Moreover, revenues from pricing carbon cannot be simply seen as a compensatory fund for the loss of income from fossil fuels," says Bauer. "This is because climate policy results in higher energy prices for households and companies, which lead to a -- rather small -- reduction of economic output. So there might be many appetites for the money raised from CO2 pricing."


"We know that fossil fuel owners will lose out on profits, but the big question is who will benefit from the new revenues generated by climate policy? It will fall to policy makers and society at large to decide this," adds Elmar Kriegler, project leader and co-author of the study. "It would be interesting to ask for the effect of using the revenues from carbon pricing to finance infrastructure investments in developing countries."

Tuesday, November 5, 2013

Powered from Space

India’s ambitious Mars Mission saw a successful launch today. The 1,350 kg satellite was placed in an elliptical earth orbit, from where it will be transferred into a heliocentric one; the last leap from there will be to Mars. The 400-million-km odyssey will take around a year, if everything goes smoothly.

The craft carries 850 kilograms of propellant and oxidiser.
Propellant is the chemical mixture burned to produce thrust in rockets and consists of a fuel and an oxidizer. By controlling the flow of propellant to the combustion chamber, the engine can be throttled, stopped, or restarted. The main engine uses the bipropellant combination monomethyl hydrazine and dinitrogen tetroxide for orbit insertion and other manoeuvres. But the craft is largely powered by solar cells.

Some of Nasa’s deep space probes have relied mostly on a certain type of plutonium, plutonium-238. It powers these spacecraft with the heat of its natural decay. But plutonium-238 isn't found in nature; it's a byproduct of nuclear weaponry and tough to lay hands on! Solar power is preferable to plutonium because it is cheaper and has fewer safety concerns, but obviously will not work as the craft moves away from the sun.
Fuel cells, devices that transform the chemical energy of hydrogen into electrical energy through its reaction with oxygen and feed the electricity to an electric motor, were first employed in space missions in the 1960s. Owing to their high efficiency and their emissions of nothing but water vapor (no CO2), hydrogen fuel cells have triggered global research efforts to reduce greenhouse gas and air pollutant emissions. But they are costly, and global research at present focuses on the automobile sector.

That is all about fuel for man’s ambitious space ventures. However, a by-product of the space missions throws up energy potential for the energy-starved earth. For instance, as space launches become more routine and less risky, we could think of setting up solar arrays in space.
Without the obstacles like rain, clouds and nighttime, these would receive more concentrated solar rays than they would on Earth. The panels also wouldn't be subject to the seasonal fluctuations that are unavoidable on Earth. Solar energy becomes ever present!

Solar panels would either be attached to orbiting satellites or stationed on the moon, and the electricity created would be converted into microwaves and beamed down to Earth. Rectifying antennas on the ground would collect the microwaves and convert them back into electricity. Communications satellites already do something very similar when they transmit your cell phone conversations. Some people have even suggested that the solar panels could piggyback on communications satellites. Space-based solar power is a hot favourite because all of the necessary equipment and technology is already developed and understood.

Recent proposals talk of small satellites fitted with solar arrays circling the Earth continuously. They would be more manageable than huge ones and still produce considerable energy output. A satellite less than 1,000 feet (300 meters) across orbiting 300 miles (480 kilometers) above Earth could potentially power 1,000 homes. The major obstacle right now, as with any new technology, is cost. Launching, setting up and maintaining a solar farm in orbit or on the moon would require vast amounts of manpower and money.
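That 1,000-home figure can be sanity-checked with a back-of-envelope estimate. Every efficiency and consumption value below is an assumed guess, not a number from any actual proposal; under these guesses the claim looks, if anything, conservative.

```python
# Rough power budget for a 300 m orbiting solar array.
# Cell efficiency, beaming efficiency and household demand are
# assumed values for illustration.
import math

DIAMETER_M = 300         # array diameter from the proposal above
SOLAR_CONSTANT = 1366.0  # W/m2 in Earth orbit, above the atmosphere
CELL_EFF = 0.20          # assumed photovoltaic efficiency
BEAM_EFF = 0.50          # assumed microwave beaming + rectenna efficiency
HOME_AVG_W = 1200.0      # assumed average household draw in watts

area_m2 = math.pi * (DIAMETER_M / 2) ** 2
delivered_w = area_m2 * SOLAR_CONSTANT * CELL_EFF * BEAM_EFF
homes = delivered_w / HOME_AVG_W
print(f"{delivered_w / 1e6:.1f} MW delivered, enough for ~{homes:,.0f} homes")
```

The estimate lands near 10 MW delivered, several thousand homes' worth; the attraction of orbit is the uninterrupted, unattenuated sunlight that makes such numbers possible around the clock.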


But just as space missions were once the subject of fiction, so also any new technology will seem tough. Not impossible. And energy is what the Blue Planet needs desperately, after food and water.

Tuesday, October 29, 2013

Making the search for gas easier

Gas and oil deposits in shale have no place to hide from an Oak Ridge National Laboratory technique that provides an inside look at pores and reveals structural information potentially vital to the nation's energy needs. Researchers were able to describe a small-angle neutron scattering technique that, combined with electron microscopy and theory, can be used to examine the function of pore sizes.

Using their technique at the General Purpose SANS instrument at the High Flux Isotope Reactor, scientists showed there is significantly higher local structural order than previously believed in nanoporous carbons. This is important because it allows scientists to develop modeling methods based on local structure of carbon atoms. Researchers also probed distribution of adsorbed gas molecules at unprecedented smaller length scales, allowing them to devise models of the pores.

While traditional methods provide general information about adsorption averaged over an entire sample, they do not provide insight into how pores of different sizes contribute to the total adsorption capacity of a material. Unlike absorption, a process involving the uptake of a gas or liquid in some bulk porous material, adsorption involves the adhesion of atoms, ions or molecules to a surface.

This research, in conjunction with previous work, allows scientists to analyze two-dimensional images to understand how local structures can affect the accessibility of shale pores to natural gas. Together, the application of neutron scattering, electron microscopy and theory can lead to new design concepts for building novel nanoporous materials with properties tailored for the environment and energy storage-related technologies. These include capture and sequestration of human-made greenhouse gases, hydrogen storage, membrane gas separation, environmental remediation and catalysis.


Meanwhile, after 10 years of production, shale gas in the United States cannot be considered commercially viable, according to several scientists presenting at the Geological Society of America meeting in Denver on Monday. They argue that while the use of hydraulic fracturing and horizontal drilling for "tight oil" is an important contributor to U.S. energy supply, it is not going to result in long-term sustainable production or allow the U.S. to become a net oil exporter.

Wednesday, October 23, 2013

Pollution kills more than accidents!

Automobile pollution kills more people than automobile collisions do. A recent study on the subject says that the 34,080 American lives lost to automobile collisions in 2012 are completely eclipsed by the number of people who died as a result of the pollution from those same automobiles — 58,050. Authored by five researchers at the Massachusetts Institute of Technology, the study found an estimated 200,400 premature deaths attributable to combustion emissions in the US last year. Of those, a bare majority were due to either road transportation or electric power generation.
The study primarily focused on fine particulate matter, or particles with a diameter of 2.5 micrometers or less. These minuscule particles are most likely to cause illnesses like lung cancer and premature deaths more generally.
The researchers found 52,800 yearly premature deaths attributable to emissions related to road transportation, with a similar number — 52,200 — due to electric power generation. They also looked at ozone exposure, but found much lower numbers: 5,250 due to motor vehicles, and another 1,700 caused by electricity production. These represented just more than half of all premature deaths caused by fine particulate matter, with other large contributors being industry (40,800 deaths in 2005) and commercial and residential buildings (41,800 deaths).
The new research was just published in the journal Atmospheric Environment.


E Asia cities at risk from rising sea levels

About 12 million people in 23 East Asian cities are at risk from rising sea levels, severe storms, and more intense drought caused by climate change that could jeopardize $864 billion in assets, a new report from the Asian Development Bank (ADB) warns.

Economics of Climate Change in East Asia notes that while climate adaptation investments can be large, the aggregate cost to protect the most vulnerable sectors -- infrastructure, coastal protection, and agriculture -- would be less than 0.3% of East Asia's gross domestic product every year between 2010 and 2050. The report recommends that the People's Republic of China (PRC), Japan, the Republic of Korea, and Mongolia together invest an annual average of $22.9 billion for climate-proofing in the infrastructure sector, $4.2 billion for coastal protection, and $9.5 billion for the agriculture sector.


The report projects that severe weather related to climate change will intensify, with one-in-20-year flooding predicted to occur as frequently as every four years by 2050. When combined with rising sea levels, this is expected to cause massive swaths of land to disappear, forcing millions to migrate, and wreaking havoc on infrastructure and agriculture. Since 1970, economic losses to the four countries from climate-related natural disasters have amounted to more than $340 billion.

Monday, October 21, 2013

Why is CCS not taking off?

Four large-scale carbon capture projects were launched this year, but regulatory and cost barriers for the technology threaten the world's ability to prevent temperatures from rising to dangerous levels, a new report warns. The annual report of the Australia-based Global CCS Institute cited a few signs of progress in 2013 -- the four new projects, along with eight existing ones in operation, are preventing 25 million metric tons of greenhouse gases from reaching the atmosphere annually. Yet all of the world's existing and new projects are on natural gas processing plants or other facilities that separate CO2 as part of a normal industrial procedure.
There still are no carbon capture projects operating in the power sector, and there is little movement toward implementing the technology on big industrial emitters like cement manufacturers. Since last year's report, 12 projects were either canceled or put on hold, largely because of the high cost of the technology.
The report noted, for example, that the current CO2 pipeline network will need to be expanded 100 times to carry enough captured greenhouse gas to hold global temperatures to 2 degrees Celsius above preindustrial levels by the end of the century. Countries that are not members of the Organisation for Economic Co-operation and Development (OECD) will account for most of the growth in primary energy demand through 2035, according to the IEA, but there are few projects far along in the planning stage in many of those countries.

Meanwhile, funding support for CCS globally has fallen by more than $7 billion from 2009, "reflecting either changing government priorities or a reliance on carbon price support that has subsequently collapsed," the institute said. In Europe, there have not been new operational projects since 2008.

Cost is not the only challenge. Siting new pipelines to carry CO2 is a "phenomenally difficult" task in many countries, including India. India emits roughly 6 percent of the world's carbon dioxide, according to U.S. EPA.


To boost the number of projects, the report recommends additional financial support for both construction and research, to reduce the cost of CO2 capture. It says there is no one-size-fits-all option -- capital grants, subsidies and ratepayer cost recovery agreements all have been used effectively to boost the technology.

Detractors of CCS say the technology is far too energy intensive to be feasible. The jury is still out on that!

Wednesday, October 16, 2013

What a waste

The world today dumps over 70 percent of food waste into landfills, rather than harnessing it for fuel and electricity. An average city in the developing world generates around 4,000 tonnes of waste daily! Over the next 25 years, global energy demand will grow by 50 percent, while global oil supply dwindles at a rapid pace. Waste-to-energy is an obvious solution to meet the world’s burgeoning energy demand, believe experts. The technology is well-known; the only problem is to organise collection, segregation and transportation.

A recent report “Waste-to-Energy Technology Markets”, which analyzes the global market opportunity for WTE, expects waste-to-energy to grow from its current market size of $6.2 billion to $29.2 billion by 2022.

Currently there are some 800 industrial-scale WTE plants in more than three dozen countries around the world, and likely thousands of smaller systems at individual sites. Most employ anaerobic digesters, which use microorganisms to break down organic waste and convert it into a fuel such as biogas, biodiesel or ethanol. With some 70 percent of food waste around the world still going into landfills, there is plenty of feedstock to keep this environmentally friendly, carbon-neutral fuel source coming. The waste from small slaughterhouses, breweries, dairy farms and coffee shops could power hundreds of typical homes each day if the infrastructure were in place to sort, collect and process the flow of organic material.


If we cannot control the waste we generate, especially food waste, the next best option is to use it effectively instead of letting it rot and pollute land and air.

Saturday, October 12, 2013

Chemicals in the e-soup!

Ever wonder what happens when you discard your one-year-old phone for a new model? It probably adds to the e-waste heap.
The consumer electronics industry is now a multibillion-dollar juggernaut that churns out new products year-round. In 2012, sales of electronics in the United States topped $200 billion, according to the Consumer Electronics Association, an industry group that represents 2,000 companies, including Sony, Samsung, and Apple. The average American household now owns 24 electronic products, many of which will be rendered obsolete within a few years.

In 2009, the most recent year for which the EPA has data, 2.37 million tons of electronics were ready for “end-of-life management,” yet only a quarter of them were collected for recycling.

Every year, heaps of American e-waste, from smartphones to computers to stereo systems, are shipped to India, China, Ghana, Pakistan, Peru, and other developing countries. By some estimates, 80 percent of the U.S. e-waste collected ends up on foreign shores, where regulations are lax and the incentive to take risks is high.

The goods are generally auctioned off in bulk to scrap companies and smelters. These companies pay locals—often including children—meager wages to strip smidgens of gold, copper, and palladium from the discarded devices. Sometimes, this involves concocting a noxious stew of cyanide and nitric acid, then burning the remaining plastic in crude firepits. Throughout the process, workers are exposed to lead, mercury, and cadmium, among other toxic substances.

From mining to manufacturing to recycling, consumers, corporations, and governments need to rethink the life of our devices from beginning to end.

The European Union is ahead of the game: a strict directive in force since last year requires that by 2019 member countries collect 65 percent of the weight of all electronics put on sale in the preceding three years, or 85 percent of all e-waste generated per year. Under the EU's policy, retailers will be required to take e-waste back from consumers. Companies—retailers, manufacturers, and recyclers—found to be in violation could be hit with stiff fines.

But some believe the first step is to make manufacturers publish a list of the chemicals used in manufacturing. Nobody knows how many chemicals go into electronic products; it's probably in the range of several thousand. Some are very standard, run-of-the-mill chemicals, but others are exotics … and many are extremely hazardous.

Sunday, October 6, 2013

Clean costs

Clean energy is the obvious choice for a planet faced with global warming. But going clean comes with a price – one that can be steep.

Because electricity and heat account for 41 percent of global carbon dioxide emissions, curbing climate change will require satisfying much of that demand with renewables rather than fossil fuels. But solar and wind come with their own up-front carbon costs. Photovoltaics require much more aluminum—for panel frames and other uses—than other technologies do, according to a 2011 study at Leiden University in the Netherlands.

Alloys for wind turbines demand lots of nickel. Those metals are carbon culprits because they are produced in large amounts by high-energy extracting and refining processes.

The demand for metals, and their already significant carbon footprint, may grow with a switch to green energy. Given all the resources needed for new infrastructure, an analysis last year found that large solar installations take one to seven years to “break even” with coal power on the greenhouse scorecard. Wind farms take from less than one year up to 12 years. All the more reason to make the switch sooner rather than later?

India stepping on the shale wagon

India cleared the way for shale-gas exploration last week. The country meets nearly three-fourths of its energy needs through imports and it has been working on its coal and shale-gas policies for more than two years. The country has the world's fourth largest coal deposits as well as significant untapped shale-gas and oil potential.
In the first phase, the country seeks to allow two state-run companies—Oil & Natural Gas Corp. and Oil India Ltd.—to explore for and produce shale oil and gas in blocks they already control. Later, the government will allow other state-backed companies as well as private companies into shale-gas exploration and production, said one cabinet minister who didn't wish to be named.
India has about 63 trillion cubic feet of recoverable shale-gas reserves—more than 20 times the size of India's biggest-ever gas discovery and enough, if proven, to run the country's gas-fired power stations for 20 years or more, analysts say.
This comes at a time when concerns are rising over the environmental impact of fracking – the process of fracturing shale deposits – in the US, where shale gas is now being harvested. Worries about triggering earthquakes and polluting groundwater have been major areas of debate. Can India handle the same?

The heat is on!

Sea levels are creeping up at the fastest rate in 2,000 years. Concentrations of CO2 in the atmosphere have reached "levels unprecedented in at least the last 800,000 years" (before modern humans evolved). Most importantly, "human influence on the climate system is clear" and "continued emissions of greenhouse gases will cause further warming." Those are some of the key messages in the "Summary for Policymakers" of the physical science of global warming from the Intergovernmental Panel on Climate Change, released on September 27.

"The planet is red" in a global map of the change in average surface temperatures, noted Swiss climate scientist Thomas Stocker, co-chair of IPCC Working Group I responsible for this summary at a press conference. "The world is warming."

Ice all over the world is melting, particularly in the Arctic, a trend that will continue unabated. Ocean circulation looks set to change, with unpredictable effects, and the oceans will become more acidic as well. Almost all of the world's coastlines will be affected by sea level rise. And developed countries and emerging economies have already burned through more than half of the fossil fuels that can be burned if total concentrations of CO2 in the atmosphere are to stay at a level that gives the world a chance to keep global warming below 2 degrees Celsius.
Interestingly, the IPCC has shifted from talking about concentrations in the atmosphere, like 400 parts per million, to a total carbon budget in gigatons. Since 1880, 531 gigatons of carbon have been emitted, and emissions should not exceed 800 gigatons of carbon for a better than 50-50 chance of keeping the global temperature rise below 2 degrees C.
"We cannot emit more than 1000 billion tons of carbon," Stocker says.
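The carbon-budget arithmetic in the summary can be checked directly; a quick sketch using the figures quoted above:

```python
# Cumulative carbon budget for a chance of staying below 2 degrees C,
# using the figures quoted above (gigatons of carbon, GtC).
TOTAL_BUDGET_GTC = 1000   # "We cannot emit more than 1000 billion tons of carbon"
EMITTED_SINCE_1880 = 531  # emitted between 1880 and the 2013 report

remaining = TOTAL_BUDGET_GTC - EMITTED_SINCE_1880
print(f"Remaining budget: {remaining} GtC")                                # 469 GtC
print(f"Share already used: {EMITTED_SINCE_1880 / TOTAL_BUDGET_GTC:.0%}")  # 53%
```

The 53 percent figure is what the summary means by "burned through more than half" of the allowable fossil fuels.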

In the time since the 2007 version of this report, the measured human effect on the climate has grown more than 40 percent stronger, thanks to continued emissions of greenhouse gases and more precise measurements, with carbon dioxide leading the charge. The good news is that extreme global warming by century's end, anything of 3 degrees C or more, seems "extremely unlikely," in the words of the IPCC.

The report notes that the current "pause" in new global average temperature records since 1998—a year that saw the second strongest El Nino on record and shattered warming records—does not reflect the long-term trend and may be explained by the oceans absorbing the majority of the extra heat trapped by greenhouse gases as well as the cooling contributions of volcanic eruptions.

Even if CO2 emissions stopped tomorrow, climate change would continue. In other words, humanity is in the process of setting the Earth's thermostat. The world has already warmed by roughly 0.85 degree C since 1880, and further heat extremes are "virtually certain." So the question is: how much hotter can we stand? Or as United Nations Secretary-General Ban Ki-moon put it in a video address to the IPCC press conference: "The heat is on. Now we must act."

Friday, September 20, 2013

Yes, it is due to global warming!

A glance through climate change news shows the growing gap between believers and deniers even today. Whether the climate disruption we are witnessing is linked to global warming is still contested. New research released yesterday links human-caused climate change to six of 12 extreme weather events from 2012. Teams of scientists from around the world examined the causes behind extreme weather events on five continents and in the Arctic. Their results were published as a special report in the Bulletin of the American Meteorological Society.

One of the stronger linkages between global warming and severe weather was found in an analysis of last year's high July temperatures in the northeastern and north-central United States. The Stanford team found that climate change had made such a heat wave four times more likely than in a world without elevated levels of greenhouse gases. They determined this by running models with current levels of greenhouse gases, as well as models reflecting preindustrial levels, and comparing the relative likelihood of the heat wave.
Others looked at 2012's hot spring temperatures over the eastern United States and also found that human influences contributed about 35 percent to late spring heat that year.
In some other parts of the world, climate change was linked, although in a small way, to extreme precipitation events. New Zealand experienced an extreme two-day rainfall in December 2011; researchers said 1 to 5 percent more moisture was available for that event due to climate change, which is increasing the amount of water vapor in the atmosphere.
Australia also experienced record rainfall in early 2012, and while La Niña, a natural variation, was behind much of that, researchers found that human-caused climate change increased the chance of the above-average rainfall by 5 to 15 percent.


This is the second time the Bulletin of the American Meteorological Society has collected information on the previous year's weather extremes and tried to tease out the role of climate change in those events. The researchers involved in the effort stressed that the science of attribution, or of linking specific events to climate change, is still young and evolving.

New AC technologies to cut power use

The U.S. expends roughly 185 billion kilowatt-hours of energy each year on home cooling, the most by any nation in the world. Air conditioner sales are growing globally by roughly 20 percent per year, with the newly affluent in China and India leading the way. How do we beat the heat without increasing that heat through global warming caused by burning fossil fuels to power the air-conditioner? The U.S. Advanced Research Projects Agency for Energy, ARPA–E, hopes to cut this hot forecast by reducing the energy required for air-conditioning.

Conventional air-conditioners employ refrigerants such as chlorofluorocarbons to absorb heat from the room to be cooled. That heat is then expelled outside, requiring electrically powered pumps and compressors. One idea to conserve energy is to replace coolant fluids and gases—which are often super-powered greenhouse gases capable of trapping more than 1,000 times more heat than CO2—with solid materials, such as bismuth telluride. 

A new device uses electricity to drive a thermoelectric solid that absorbs heat, and could lead to cheaper air-conditioners or refrigerators. Such refrigerators, which lack moving parts and are therefore less likely to break down, can be lifesavers in remote, rural areas for keeping medicines cool or food fresh.

Another approach is to employ specialty membranes to cool air by condensing water. These technologies are being developed by companies and now have acquired backing from the U.S. Navy, which requires efficient air-conditioners and dehumidifiers for both troops and equipment in hotspots such as Iraq and Afghanistan. "A 30 percent improvement in efficiency means 30 percent less fuel to drag to the front," Martin notes, adding that the Navy program aims for units that use 20 to 50 percent less fuel.


More efficient air-conditioners can provide cooling that could prove vital for people trying to adapt to more extreme heat waves in the future, whether in the U.S. or India. Meanwhile, a simple way to cut HVAC bills is to set the thermostat a few degrees higher instead of near-freezing – already the practice in many places!

Tuesday, September 10, 2013

Storing RE not always sensible

Renewable energy holds the promise of reducing carbon dioxide emissions. But there are times when solar and wind farms generate more electricity than is needed by consumers. Storing that surplus energy in batteries for later use seems like an obvious solution, but a new study from Stanford University suggests that might not always be the case. The costs involved in terms of energy are too high.

Grid-scale batteries make sense for storing surplus solar energy, but not wind energy, the researchers found. The study, which is supported by GCEP, is published in the online edition of the journal Energy and Environmental Science.

The Stanford team looked at several emerging technologies, including five battery types -- lead-acid, lithium-ion, sodium-sulfur, vanadium-redox and zinc-bromine. Batteries with high energetic cost consume more fossil fuels and therefore release more carbon dioxide over their lifetime. If a battery's energetic cost is too high, its overall contribution to global warming could negate the environmental benefits of the wind or solar farm it was supposed to support.
The researchers compared the energetic cost of curtailing solar and wind power, versus the energetic cost of grid-scale storage. Their calculations were based on a formula known as "energy return on investment" -- the amount of energy produced by a technology, divided by the amount of energy it takes to build and maintain it.

Using that formula, the researchers found that the amount of energy required to create a solar farm is comparable to the energy used to build each of the five battery technologies. The results were quite different for wind farms. The scientists found that curtailing wind power reduces the energy return on investment by 10 percent. But storing surplus wind-generated electricity in batteries results in even greater reductions -- from about 20 percent for lithium-ion batteries to more than 50 percent for lead-acid.
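The study's "energy return on investment" formula is easy to sketch. The wind-farm and battery numbers below are hypothetical, chosen only to illustrate why curtailment and storage reduce EROI differently; only the formula itself comes from the article:

```python
def eroi(energy_out, energy_in):
    """Energy return on investment: lifetime energy delivered divided by
    the energy needed to build and maintain the system."""
    return energy_out / energy_in

# Hypothetical wind farm (illustrative numbers, not from the study):
farm_out, farm_in = 1000.0, 50.0   # lifetime energy delivered / embodied energy
base = eroi(farm_out, farm_in)

# Curtailing 10% of output: less energy delivered, same embodied energy.
curtailed = eroi(farm_out * 0.9, farm_in)

# Storing the surplus instead: output largely preserved, but the battery's
# own energetic cost (hypothetical value) is added to the denominator.
battery_in = 30.0
stored = eroi(farm_out * 0.98, farm_in + battery_in)

for label, value in [("no storage", base), ("curtailed", curtailed), ("batteries", stored)]:
    print(f"{label:10s} EROI = {value:5.1f}  ({value / base - 1:+.0%} vs. base)")
```

With these illustrative inputs, curtailment trims the EROI by 10 percent while battery storage cuts it far more – the same qualitative pattern the Stanford team reports for wind.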


As the team notes, it is important for society to be energy-smart about implementing new technologies. Before plunging in, policymakers and investors need to consider the energetic cost as well as the financial cost of each technology.

Tuesday, September 3, 2013

Productivity to go down as the globe warms

A new NOAA study projects that heat-stress-related labor capacity losses will double globally by 2050 with a warming climate. Recent studies project a collapse in labor productivity from business-as-usual carbon emissions and warming — with a cost to society that may well exceed that of all other costs of climate change combined. A 2 percent drop in productivity per degree of warming is how the US study sees it. How about in hotter climes, then?
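The study's rule of thumb lends itself to a one-line model; a minimal sketch, assuming the 2-percent-per-degree relationship stays linear (that extrapolation is mine, for illustration only):

```python
# Rule of thumb quoted above: ~2% labor-capacity loss per degree C of warming.
def capacity_loss(delta_t_c, rate=0.02):
    """Fractional productivity loss for a given temperature rise (assumed linear)."""
    return rate * delta_t_c

for dt in (1, 2, 4):
    print(f"+{dt} C -> {capacity_loss(dt):.0%} productivity loss")
```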

Is it possible to reduce emissions 50 percent globally by the 2050s? Only one country, France, has ever reduced greenhouse emissions at the pace we’d have to keep up between now and 2050. Over a remarkable period of 30 years, France went from getting less than 1 percent of its power from nuclear power plants (which emit no carbon dioxide directly) to getting about 80 percent from them. During the period of the fastest nuclear build-out, France managed to reduce emissions at a rate of 2 percent per year, says David Victor, co-director of the Laboratory on International Law and Regulation at the University of California, San Diego. But the transition was tough.


This kind of transition for the rest of the world could be tough to almost improbable. Look at what’s happened in the United States. A major recession slowed energy consumption, and at the same time technological advances unlocked huge amounts of natural gas, leading utilities to shut down coal plants in favor of natural-gas plants that emit half as much carbon dioxide. In just one year, 2009, emissions dropped by an impressive 6.7 percent. But that was only across one year. If you look back to 2000, the average reduction was less than 1 percent a year, less than half of what’s needed to meet emissions goals! Looks like the human race better get set for tough times ahead.
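The arithmetic behind "less than half of what's needed" can be sketched quickly; this assumes emissions fall by a constant percentage each year (my simplification):

```python
# Annual cut needed to halve emissions between 2013 and 2050,
# assuming a constant percentage reduction each year.
years = 2050 - 2013
annual_cut = 1 - 0.5 ** (1 / years)
print(f"Required cut: {annual_cut:.2%} per year")   # about 1.9% per year

# At the post-2000 U.S. average of ~1% per year, emissions in 2050 would be:
print(f"At 1%/yr: {0.99 ** years:.0%} of today's level")
```

Halving by 2050 needs a sustained cut of roughly 2 percent a year, which is about France's best pace during its nuclear build-out; a 1 percent annual cut only gets emissions down to about two-thirds of today's level.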

Random wins over order!

The US electrical grid is in danger of breaking down, thanks to orderly networks!

A mathematical study of spatial networks by physicists in Israel and the U.S. builds on earlier work by incorporating a more explicit analysis of how the spatial nature of physical networks affects their fundamental stability. The upshot, published August 25 in Nature Physics, is that spatial networks are necessarily dependent on any number of critical nodes whose failure can lead to abrupt—and unpredictable—collapse. The electric grid, which operates as a series of networks that are defined by geography, is a prime example. Whenever you have such dependencies in the system, failure in one place leads to failure in another place, which cascades into collapse.

Focusing on idealized scenarios, the team found that randomly structured networks—such as social networks—degrade slowly as nodes are removed, which in the real world might mean there is time to diagnose and address a problem before a system collapses. By contrast, orderly lattice structures have more critical nodes, which increases their instability. The problem is that such orderly networks are always operating near an indefinable edge. To reduce the risk, the researchers recommend adding a small number of longer transmission lines that provide shortcuts to different parts of the grid.
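The contrast the researchers describe can be reproduced with a toy percolation experiment: remove a fraction of nodes from an orderly ring lattice and from a random network with the same number of edges, then compare the largest surviving connected component. This is an illustrative sketch, not the paper's actual model:

```python
import random

def largest_component(adj, alive):
    """Size of the largest connected component among surviving nodes."""
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nbr in adj[node]:
                if nbr in alive and nbr not in seen:
                    seen.add(nbr)
                    stack.append(nbr)
        best = max(best, comp)
    return best

def degrade(adj, n, frac, seed=0):
    """Remove a fraction of nodes at random; return the giant component's share of n."""
    rng = random.Random(seed)
    alive = set(rng.sample(range(n), int(n * (1 - frac))))
    return largest_component(adj, alive) / n

N, K = 500, 4  # nodes; average degree

# Orderly ring lattice: node i connects to its K nearest neighbors.
lattice = {i: {(i + d) % N for d in range(-K // 2, K // 2 + 1) if d} for i in range(N)}

# Random network with the same number of edges (N*K/2).
rng = random.Random(1)
randomg = {i: set() for i in range(N)}
edges = 0
while edges < N * K // 2:
    i, j = rng.randrange(N), rng.randrange(N)
    if i != j and j not in randomg[i]:
        randomg[i].add(j)
        randomg[j].add(i)
        edges += 1

for frac in (0.1, 0.3, 0.5):
    print(f"remove {frac:.0%}: lattice giant = {degrade(lattice, N, frac):.2f}, "
          f"random giant = {degrade(randomg, N, frac):.2f}")
```

Even at modest removal fractions, the ring lattice shatters into small arcs while the random network keeps a large connected core – the slow, diagnosable degradation the study attributes to random structures.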


The 2003 blackout stemmed from a combination of bad vegetation management—the first three lines tripped after sagging into trees but were all within their load rating—and a series of monitoring and communications breakdowns. Vegetation requirements have since been standardized, and a new generation of sensors is providing grid operators with more information about what is happening across the grid at any given moment.

Abundant and cheap source

University of Alberta researchers have found that abundant materials in Earth's crust can be used to make inexpensive and easily manufactured nanoparticle-based solar cells. At the university’s National Institute for Nanotechnology, the team has designed nanoparticles that absorb light and conduct electricity from two very common elements: phosphorus and zinc. Both materials are more plentiful than scarce materials such as cadmium and are free from manufacturing restrictions imposed on lead-based nanoparticles.

The research supports a promising approach of making solar cells cheaply using mass manufacturing methods like roll-to-roll printing (as with newspaper presses) or spray-coating (similar to automotive painting). Nanoparticle-based 'inks' could be used to literally paint or print solar cells of precise compositions, the scientists said. The team developed a synthetic method to make zinc phosphide nanoparticles and demonstrated that the particles can be dissolved to form an ink and processed into thin films that are responsive to light.


The team is now experimenting with the nanoparticles, spray-coating them onto large solar cells to test their efficiency. Research in this field is moving quickly, as the stream of published studies shows. The day is not far off when the planet will be truly living off its star!

Saturday, August 17, 2013

The surge gains strength

The Brazilian state of São Paulo — the economic and industrial heart of the country — is currently aiming to possess a total of at least 1 GW of solar energy capacity by the year 2020, a goal which is very achievable, according to a solar atlas of the region that was recently released by the state’s energy secretariat. The state of São Paulo possesses twice the maximum global solar irradiation of the solar powerhouse Germany.

São Paulo, which in addition to being the economic heart of the country is also the most populous state in Brazil, has a total solar power generation potential of 12 TWh per year in the areas with the absolute highest annual solar radiation, according to the new solar atlas. The areas in question total 732 square kilometers — 0.3% of the state’s total area of 248,209 square kilometers. It’s estimated that these areas could host at least 9,100 MW (9.1 GW) of installed capacity.


São Paulo is already well on its way to achieving its aforementioned goal of possessing 1 GW of solar energy capacity by 2020 — 207 MW of thermal solar capacity are already installed. The rest of the 1 GW target will be split up as follows: a further 592 MW of thermal solar capacity, 50 MW of photovoltaic solar capacity, 50 MW of concentrated solar power, and 100 MW set aside for passive solar energy exploitation in the form of solar architecture projects.
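The breakdown above can be tallied to confirm it adds up to the 1 GW goal:

```python
# Planned make-up of São Paulo's 1 GW solar target (MW), as quoted above.
capacity_mw = {
    "thermal (installed)": 207,
    "thermal (additional)": 592,
    "photovoltaic": 50,
    "concentrated solar": 50,
    "passive / solar architecture": 100,
}
total = sum(capacity_mw.values())
print(f"Total planned capacity: {total} MW")  # 999 MW, i.e. roughly the 1 GW goal
```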

The Indian government announced a $7.9 billion investment to double its transmission capacity, designed to increase access to power from wind and solar projects. India's installed solar capacity has jumped from a mere 17 megawatts in 2010, when India's National Solar Mission was announced, to over 1,200 megawatts today.
The second phase of the JNNSM programme envisages development of a cumulative capacity of 1,000 MW of off-grid solar power and targets 15 million sq m of collector area. The targets include improved energy access in remote areas; heating and cooling applications that would create employment; and the replacement of diesel and kerosene in applications such as telecom towers, solar cities, solar cookers and steam-generating systems.

Not only do these clean energy projects increase India’s energy supply, they also create much needed jobs. As India’s economy grows and develops, its energy consumption likewise is increasing rapidly: it increased 64 percent from 2001-02 to 2011-12 and is projected to grow an additional 72 percent by 2021-22, according to the Indian Planning Commission. To support India’s burgeoning renewable energy ecosystem, NRDC and the Council on Energy, Environment and Water (CEEW) are striving to bolster the case for clean energy by telling this story of job creation and economic benefits.

The surge should pick up likewise in all places that get good sunlight. 

Tuesday, August 13, 2013

New kid on the solar block

A new type of solar cell, made from a material that is dramatically cheaper to obtain and use than silicon, could generate as much power as today’s commodity solar cells. Solar cells can be made very cheaply but have the downside of being relatively inefficient. Lately, more researchers have focused on developing very high efficiency cells, even if they require more expensive manufacturing techniques. The new material could deliver solar cells that are highly efficient but also cheap to make.

Perovskites have been known for over a century, but no one thought to try them in solar cells until relatively recently. Very good at absorbing light, the new solar cells use less than one micrometer of material to capture the same amount of sunlight. The pigment is a semiconductor that is also good at transporting the electric charge created when light hits it.

One group has produced the most efficient perovskite solar cells so far—they convert 15 percent of the energy in sunlight into electricity, far more than other cheap-to-make solar cells. Based on its performance so far, and on its known light-conversion properties, researchers say its efficiency could easily rise as high as 20 to 25 percent, as good as the record efficiencies (typically achieved in labs) of the most common types of solar cells today. Perovskite in solar cells will likely prove to be a “forgiving” material that retains high efficiencies in mass production, since the manufacturing processes are simple.


Perovskites will have difficulty taking on silicon solar cells. The costs of silicon solar cells are falling, and some analysts think they could eventually fall as low as 25 cents per watt, which would eliminate most of the cost advantage of perovskites and lessen the incentive for investing in the new technology. But it might be possible to paint perovskites onto conventional silicon solar cells to improve their efficiency, and so lower the overall cost per watt for solar cells.

Monday, August 5, 2013

Combining solar PV & thermal could be the way

The Advanced Research Projects Agency–Energy in the US is devoting $30 million to several demonstration projects that will attempt to combine photovoltaics with solar thermal. The effort seeks to solve the important problem of intermittency of solar electricity.
Currently, storing electricity from solar panels is either prohibitively expensive or, in some areas, unfeasible. Solar thermal power, which concentrates sunlight to heat water and make steam for turbines, can store energy by keeping heat in insulated containers. But overall, solar thermal power is twice as expensive as power from solar panels.

According to ARPA-E, there are several ways the two types of solar power might be combined. 
Some solar power systems involve concentrating sunlight on tiny, super-efficient solar cells. As they’re currently configured, the heat from the concentrated sunlight is quickly extracted and allowed to dissipate into the atmosphere. If it could be collected instead, it could be stored and used to generate electricity later. The challenge is that this approach would require operating solar cells at much higher temperatures than is normal, and this can damage them. Researchers are looking at ways to make solar cells more resistant to high temperatures.
Another possibility is to split up the solar spectrum. Solar cells are very good at converting certain wavelengths of light into electricity—but not others. It may be possible to redirect wavelengths that can’t be used efficiently, and to use these to heat up water and produce steam.

Yet another approach is being developed by Todd Otanicar, a professor of mechanical engineering at the University of Tulsa. He uses nanoparticles suspended in a translucent fluid to absorb certain wavelengths but allow others to pass through to a solar cell. As the nanoparticles absorb sunlight, they heat up, and the fluid can be used to generate steam.


ARPA-E is also considering funding novel energy storage technologies that use both heat and electricity. Adding heat to electrolysis, for example, might improve the economics of splitting water to produce hydrogen. The hydrogen could then be run through a fuel cell to generate electricity. Heat could also aid other electrochemical reactions, such as those that can be used to make liquid fuels for vehicles.

Friday, August 2, 2013

Positives of fracking

"Geothermal is homegrown, reliable and clean," says Rohit Khanna, program manager at the World Bank for its Energy Sector Management Assistance Program. That is a big part of the reason it is being pursued in developing countries such as Chile, Indonesia, Kenya and the Philippines.
Australia's first enhanced geothermal system, spicily named Habanero, began producing power in May, and Europe has brought three such power plants online. A geothermal power plant in Larderello, Italy, has churned out electricity in Tuscany for more than a century, showing that big power plants can be built this way.
By some estimates, the U.S. could tap as much as 2,000 times the nation’s current annual energy use of roughly 100 exajoules (an exajoule equals a quintillion, or 10^18, joules) via enhanced geothermal technologies. With respect to electricity, the DoE concludes at least 500 gigawatts of electric capacity could be harvested from such EGS systems. Even better, hot rocks underlie every part of the country and the rest of the world. The Geysers in California alone can produce 850 megawatts of electricity.
The idea is simple: pump water or other fluids down to the hot rocks beneath the surface. Heat from the rocks turns the water to steam. The steam rises and turns a turbine that spins a magnet to make electricity.
Some places have the natural bounty of hot rocks with cracks already in them. But such sites are not plentiful. That's where fracking, the controversial practice of pumping fluid underground to shatter shale and release oil or gas, can help. Fracking “enhances” geothermal by making cracks in hot rocks where none existed, allowing heat to be harvested from Earth’s interior practically anywhere, although the need to pump water through the system reduces the total power produced.
Yet, geothermal’s abundant, renewable, clean potential for making electricity largely languishes, producing "less than 1 percent of global energy," according to a recent perspective in Science. Indeed, only 6 percent of naturally occurring geothermal resources have been tapped to date, according to Bloomberg New Energy Finance (BNEF).
The reason is simple: money. In addition to the $6-million to $8-million risk of drilling a dry hole or a well that does not produce steam as it should, there is the multimillion-dollar expense of building a power plant on top of the wells that do. That adds up to a total cost for a geothermal power plant of roughly $90 per megawatt-hour.
Gradient holes have to be drilled to explore a particular area. Explosions need to be set off at the surface to send seismic waves through the rock that allow for surveying the underground landscape—a technique familiar from the oil and gas industry. It can take years and millions of dollars to do this exploration with the prospect of earning that money back slowly via electricity sales—or all those funds could be lost.
BNEF puts the odds of successfully completing a geothermal well at 67 percent, meaning one-third of all geothermal projects fail. The analyst outfit has called for a "global geothermal exploration drilling fund" of some $500 million, provided by investment agencies like the World Bank.
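A back-of-envelope calculation shows what that failure rate does to costs; a quick sketch using the drilling figures quoted above (spreading dry-hole losses evenly over successful wells is my own simplification):

```python
# Expected drilling cost per producing well, from the figures above.
success_rate = 0.67            # BNEF's odds of completing a geothermal well
cost_per_attempt = (6e6, 8e6)  # $6-8 million per attempt, dry hole or not

wells_per_success = 1 / success_rate   # attempts needed, on average, per producer
for cost in cost_per_attempt:
    expected = cost * wells_per_success
    print(f"At ${cost/1e6:.0f}M per attempt: ~${expected/1e6:.1f}M per producing well")
```

In other words, the one-in-three failure rate inflates effective drilling costs to roughly $9 million to $12 million per producing well – the risk the proposed exploration fund is meant to absorb.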


Another problem: some EGS projects have been associated with small earthquakes, much like oil and gas drilling and wastewater disposal. That has caused some projects to be abandoned.