Arctic Summer

The condition of the Arctic sea ice is attracting a great deal of comment this year, all in sharp contrast to last year. Recent headlines note that the sea ice surrounding the pole itself is now one-year ice, and that this patch is huge. Of course this is an artifact of the winds. If the winds fail to change the configuration, it is safe to say that the pole will be ice free this August for a couple of hundred miles in all directions.

More importantly, for the past several weeks the areal extent of the ice has mirrored last year's chart. We can in fact expect a reduction over last year once we hit the end of the season. The winds are also clearly able to move material around much more freely than in past years, as is reasonable. This means that the long-term ice is breaking up and is free to intermingle with the younger ice, leading to a mechanical acceleration of the ice loss.

I have made the observation that last year's melt indicated a complete annual clearing of the Arctic as early as 2012. To my surprise, last winter has not changed this prognosis one bit. This is holding true despite the fact that this year's growing season is around two weeks late. The raspberries have just started to ripen in Vancouver when we should have been picking for two weeks. By the way, we need to be nervous about the fall grain harvest. We should be fine, but an early frost combined with a two-week-late ripening of the grain is quite possible.

An open question is whether we have a new climate regime of warm Arctic summers followed by chilly northern winters. It is way too soon to really know what the reconstructed northern climate will look like now that the sea ice is well on the way to being reconfigured.

And that is what really needs to be recognized. Possibly for the first time since the Medieval Optimum, the long-term Arctic sea ice is disappearing. What will remain is a winter sea ice regime that is likely fully removed every summer. Last year, strong westerly winds cleared the western Arctic. This year we seem to be getting the same treatment in the eastern Arctic. It is clearly very different.

So here we are, having passed through a very overdue cold winter, finding no effect on the ongoing sea ice removal in the Arctic. I would speculate that breakup conditions have been reached and that it would now require a powerful northern climate reversal to even stop this process.

From now on, the only remaining variable will be the frequency of warm northern winters, which can accelerate the summer ice removal. This year, I see no reason to expect much real change over last year, but had conditions mirrored the previous warm year we would have seen another sharp ice removal phase. Even without that, the annual surplus heat in the Northern Hemisphere, which has been impacting the Arctic for at least thirty years, will continue to do its work.

In the meantime, the summer cruise ships heading for the pole could well have a real treat this summer. An open sea at the pole is just too good to miss. And I imagine that the press will be there in force.

Solar Power Arrives

Over the past year I have alluded to the pending change taking over the solar energy business. You may wish to reread these posts.

http://globalwarming-arclein.blogspot.com/2007/10/solar-revolution-slowly-unfolding.html

http://globalwarming-arclein.blogspot.com/2008/01/industrial-solar-cells-and-scientific.html

The breakthrough is happening now. Nanosolar, which is partially funded by the founders of Google, has successfully built a printing tool that uses print technology to produce sheets of photovoltaic material on a metal substrate. This is ahead of my expectations, and the stage is now set for a rapid decline in the cost per watt of installed power. From now on, the remaining problems are merely a matter of perfecting what is clearly working.

The apparent conversion efficiency peaks at 13%, which suggests that the usual yield is comparable to that of the amorphous silicon we use on calculators, or close to 10%. As the technology advances, there are promising strategies that suggest 30% is possible.

Regardless, this is enough to fully justify making a tool to produce this product by the mile.

The tool is now doing 100 feet per minute and their objective is to hit 2000 feet per minute.

The real importance of printed solar panels is to drive the cost of solar energy from the current $6.00 per installed watt to below $0.50. This is absolutely huge. It makes all other forms of static energy, including coal, obsolete and simply more expensive. They are already shipping at $1.00 per watt.

The tool, which came in at under $2,000,000 and is working, generates enough product in a year to replace a nuclear power plant. All other forms of alternative static energy are thus toast. I really cannot say this any more strongly. The entire power industry is about to find out that their most valuable assets are the local distribution systems. These are about to be fed by local solar plants good for perhaps 5000 homes at a time.
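The "one nuclear plant per year" claim can be sanity checked with some back-of-envelope arithmetic. Only the 100 feet per minute web speed and the 13% efficiency come from the text; the web width, irradiance, and uptime figures below are my own assumptions, so treat this as a rough sketch rather than Nanosolar's specifications.

```python
# Back-of-envelope check of the "one nuclear plant per year" claim.
# Web speed and efficiency are from the post; width, irradiance, and
# uptime are assumptions for illustration only.
FT_TO_M = 0.3048

web_speed_ft_min = 100      # stated tool speed
web_width_m = 1.0           # assumed web width
efficiency = 0.13           # stated peak conversion efficiency
insolation_w_m2 = 1000      # standard test-condition irradiance
uptime = 0.75               # assumed fraction of the year in production

# Square meters of panel printed per year, then nameplate watts at peak sun.
m2_per_year = web_speed_ft_min * FT_TO_M * web_width_m * 60 * 24 * 365 * uptime
watts_peak = m2_per_year * insolation_w_m2 * efficiency

print(f"{m2_per_year / 1e6:.1f} million m^2/year, {watts_peak / 1e9:.2f} GW peak")
```

Under these assumptions the tool prints roughly 1.5 GW of nameplate capacity per year, which is indeed in the range of a large nuclear reactor. The caveat is that this is peak capacity; averaged over day and night the solar output would be a fraction of a reactor running flat out, so the claim holds in nameplate terms only.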

Right now, with the machine running, the product design team is picking the low hanging fruit. As costs drop, there will be many machines made and the new infrastructure will quickly emerge everywhere. Remember how India and China jumped over the need for land lines in the telecommunication industry? The rest of the world is about to be able to jump past the need for high voltage power systems and power plants.

This should be in full roll out mode within a couple of years as everyone understands the implications. And yes we are all going to be driving electric cars very soon. It suddenly became too easy and way too cheap.

The site that you need to visit can be googled under the word nanosolar. The enthusiasm is totally justified and these boys have already been shipping product for six months. It will soon be front page news.

Alkaline Soils and Charcoal Success

I had a brief correspondence with Dr N. Sai Bhaskar Reddy from India on his trials on alkaline soils which I am copying here.

I think that sizing is not a particularly critical factor, except for convenience. This is a bit of a surprise as I would have expected that fine grained powder would maximize the overall homogeneity of the soil. What appears to be happening is that the root system responds to the presence of the charcoal even at some tangible remove. This clearly suggests that while fines are likely preferred in special cases, they are not at all necessary.

My reading on the terra preta soils showed that the majority of the carbon was very fine grained, as would be expected from corn stover. This also accounts for the high percentage of carbon in the soil. There never was a reason to quit adding carbon.

On the other hand, using wood charcoal is obviously totally feasible with a preliminary screening to create an easy to spread product.

One of the concerns with fresh charcoal is that there is still sometimes a residue of acids that seem to slow integration into the soils. It seems that alkaline soils would naturally offset this particular temporary effect.

The important observation is that from fields that were essentially barren, he is abruptly getting good results. This gives us a simple protocol to reverse alkaline damage to soils. I do not know if the source of the alkalinity is actually removed over time but we now know that it may no longer matter.

The take home lesson is to take your poorest field and treat it with a dressing of charcoal, however obtained. You will quickly gain confidence and the results will be very pronounced. So far, absolutely no special nonconforming soils have been discovered, but the real test will come when someone tries this out on a salt-rich soil. I am not optimistic regarding those conditions, but it may still be possible to make something work even where common sense says no.

One thing that I should comment on, that is not too obvious, is that powdering causes the surface area in contact with the soil to climb steeply: for a fixed mass, total surface area scales inversely with particle size. A simple grind already hugely increases the available active contact. Taking the grind down to the nanometer scale generates a massive increase in effectiveness. The problem is usually that it is not cost effective to reach this scale, and I suspect that the biology quickly surrounds the particle, perhaps then inhibiting the effects.
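The scaling is easy to make concrete. If a fixed mass m of char of density rho is broken into spheres of radius r, the total surface area works out to A = 3m / (rho * r), so halving the particle size doubles the contact area. The density figure below is an assumption chosen for illustration.

```python
# Total surface area of a fixed mass of char ground into spheres of radius r.
# For n spheres: m = n * rho * (4/3) * pi * r^3 and A = n * 4 * pi * r^2,
# which gives A = 3 * m / (rho * r): area grows as 1 / r.

def total_area_m2(mass_kg, rho_kg_m3, radius_m):
    return 3.0 * mass_kg / (rho_kg_m3 * radius_m)

rho = 400.0   # assumed charcoal density, kg/m^3 (illustrative)
mass = 1.0    # 1 kg of char

for r in (1e-2, 1e-3, 1e-6, 1e-9):  # 1 cm lump down to 1 nm powder
    print(f"r = {r:9.0e} m  ->  area = {total_area_m2(mass, rho, r):12.3e} m^2")
```

Going from centimeter lumps to nanometer powder multiplies the contact area by ten million, which is why even a coarse grind pays off and why reaching the finest scales stops being cost effective long before the physics runs out.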

The corn char derived terra preta likely demonstrates this effect when compared to far coarser wood derived charcoals. A soil containing 15% finely powdered char, as in the terra preta, should do far better than our wood based soils of the same carbon percentage.

Dear Robert Klein,

In the alkaline soils, I got very good results for two seasons. The soil condition has improved and I expect to get better results in the next 4 years. The charcoal, located within a depth of 8 inches, breaks up during ploughing and other activities done in the farm. Regarding using powder charcoal / lumps of charcoal, I prefer lumps of charcoal for the following reasons, apart from the reason mentioned by you.

- The recent report clearly shows that the roots prosper about a piece of charcoal.

- Lump can provide better environment

- Lump is heavy so less chance of moving due to wind or water away from the field

- Adds texture to the soil

- Soil microbes and soil fungus would find convenient place in a lump of charcoal and can live as a community.

There could be many more reasons.

With regards,

Dr. N. Sai Bhaskar Reddy


On 6/24/08, Robert Klein <arclein@yahoo.com> wrote:

Hello Dr Reddy


I am curious how your experiments have been working out on alkaline soils.


As an aside, you are using wood charcoal obviously. Can you get it to break up in the soil when you hoe the ground?


I am thoroughly convinced that the Amazonians used corn stover primarily for their biochar and that would have yielded a finely powdered carbon. Check with me if you want to know how this was done.


The recent report clearly shows that the roots prosper about a piece of charcoal. I actually knew that from my own boyhood. Whoever thought that that would be important?

regards

bob klein(arclein)

Greenland Ice Cores

This is an important review of the Greenland ice core data. For some reason, and not the reason that I espouse today, in the early nineties, I saw fit to locate the raw ice core data and download it on my old 386. This was before browsers so that meant locating the appropriate bulletin board and getting the text file. Now you have to pay for it.

I was able to satisfy myself of the existence of a climate break about 12,900 years ago. I also realized that this was rapidly approaching the limits of data retention by the cores. Resolution was breaking down and becoming increasingly unreliable. That is worth remembering, though I anticipate that the number of cores has increased markedly and that is somewhat improving resolution.

We now have three apparently well defined climate breaks. The first, at 14,700 years ago, was a sharp global warming of around ten degrees or so that arrived overnight and then fairly quickly reversed and disappeared, restoring former conditions. This has been associated with a huge fresh water pulse out of the Antarctic, although I do not know how they know that. It does make sense that this was driven by events out of the Antarctic.

The second event, 12,900 years ago, we have already described as the Pleistocene Nonconformity, in which the crust shifted thirty degrees, placing the icecap out of the polar region. This obviously implied a climate shift that would be reflected as a precipitation change immediately upon the event's occurrence. The ice core record shows just that, and in fact we can split the record into before and after uniform precipitation averages. This shrouds the importance of the other two events and makes them easy to ignore, as I did when I first reviewed the data.

The third event, 11,700 years ago, must coincide with the flushing of the monster Lake Agassiz. This huge sea of melt water broke out and drained probably within a year, causing a worldwide coastal flooding event that may well be one of the sources of the cultural history of global floods.

It continues to amaze me how successfully oral histories were able to transmit the gist of real events over thousands of years. You would expect natural skepticism and storytelling insertions to wreck most of these transmittals. Yet this is not so at all. These ancient tales are constantly the inspiration of new explorations and scholarly investigations that keep bearing fruit. Even the tales of El Dorado have dug up a huge Indian civilization in the Amazon where our scholarship said such was impossible. My single regret is that we have inadvertently lost so much over the years that may have been instructive from this lode.

What I want to emphasize to my readers is that major climatic shifts all have reasonable apparent nonclimatic causation. In each case a significant shift of material took place. The Antarctic case sounds like a major ice sheet collapse that shifted the climatic circulation around. The second and third speak for themselves.

What is totally marvelous is that this has created a stunningly stable climatic system for the Northern Hemisphere which is not going to revisit the ice age for a likely million years or so. In fact, the only engine besides a super volcano that is capable of truly disturbing the climate is another ice sheet collapse and the evidence now suggests that could actually warm the climate rather than cool it. Both such events are ultimately temporary, but the victims will not be able to complain.

June 23, 2008

Ice Core Reveals How Quickly Climate Can Change

Weather patterns can permanently shift in as little as a year, according to the records preserved in an ice core from Greenland

By David Biello

Roughly 14,700 years ago the weather patterns that bring snow to Greenland shifted from one year to the next—a pattern of abrupt change that was repeated 12,900 years ago and 11,700 years ago when the earth’s climate became the one enjoyed today—according to records preserved in an ice core taken from the northern island. These speedy changes—transitions from warming to cooling and back again—in the absence of changes in greenhouse gas could presage abrupt, catastrophic climate change in our future.

"What made these abrupt climate changes were circulation changes, and these changes took place from one year to the next more or less," says glaciologist Sune Olander Rasmussen of the Centre for Ice and Climate at the University of Copenhagen, who was part of a team that analyzed annual data from ice tubes extracted from as deep as 10,000 feet (3,085 meters) beneath the ice sheet, which were collected by the North Greenland Ice Core Project, a drilling expedition.


The researchers looked at three variables in the core: the amount of dust, the kind of hydrogen and the kind of oxygen in the ice. The amount of dust from year to year reveals that less of the grit traveled all the way to Greenland from the deserts of Asia (where the dust that settles over Greenland originates) around the time these transitions began, the team reports in Science.


"If things are starting to change in the dust first then we are looking for a [climate change] trigger somewhere outside of Greenland," Rasmussen says. "That could be monsoon changes," since different rainfall patterns in Asia would affect dust levels in the atmosphere.

Roughly five years after this change in dust levels, the levels of heavy hydrogen ensconced in the ice indicate that weather patterns were shifting and driving precipitation over Greenland that had originated in evaporated water from a different area of the ocean than had previously been the source of the island’s rain. And this change happened in as little as a year. "During the glacial period, abrupt warmings show change of the atmospheric circulation from year to year," says glaciologist Dorthe Dahl-Jensen, also of the University of Copenhagen, who participated in the study as well.

Following this abrupt shift, as much as 20 degrees Fahrenheit (10 degrees Celsius) of warming occurred over the subsequent decades—a change that ultimately resulted in at least 33 feet (10 meters) of sea-level rise as the ice melted on Greenland.


Greenland can change quickly, even living up to its name, according to another paper in this week's Science. Sediment cores from the ocean show that forests of spruce and even fern grew on Greenland just 125,000 years ago. That means Greenland’s ice sheet—potentially responsible for as much as 75 feet (23 meters) of sea-level rise if it all melts—has grown and shrunk far more frequently than previously known.


"The question that arises from such findings is: How come the Greenland ice sheet at such a low latitude has remained so stable during the present interglacial [period] until now?" says study co-author and geochemist Claude Hillaire-Marcel of the University of Quebec in Montreal. "In view of the past instability—and sensitivity to temperature—of Greenland ice, serious concerns about its future under global warming stress do emerge."

Understanding that threat may require traveling even farther back in time via ice, to the transition to the last such warm period 130,000 years ago—the Eemian—when it was nine degrees F (five degrees C) warmer across Greenland. An ice core, known as NEEM (for North Greenland Eemian Ice Drilling), that could address that question is being extracted now as part of the ongoing International Polar Year. "The circulation changes in a few years. The temperature change is happening over decades," Rasmussen notes. "The more we force a system, the more likely it is that we will get some kind of response that is violent."

Oil from Bugs

“Ten years ago I could never have imagined I’d be doing this,” says Greg Pal, 33, a former software executive, as he squints into the late afternoon Californian sun. “I mean, this is essentially agriculture, right? But the people I talk to – especially the ones coming out of business school – this is the one hot area everyone wants to get into.”

He means bugs. To be more precise: the genetic alteration of bugs – very, very small ones – so that when they feed on agricultural waste such as woodchips or wheat straw, they do something extraordinary. They excrete crude oil.

Unbelievably, this is not science fiction. Mr Pal holds up a small beaker of bug excretion that could, theoretically, be poured into the tank of the giant Lexus SUV next to us. Not that Mr Pal is willing to risk it just yet. He gives it a month before the first vehicle is filled up on what he calls “renewable petroleum”. After that, he grins, “it’s a brave new world”.

Mr Pal is a senior director of LS9, one of several companies in or near Silicon Valley that have spurned traditional high-tech activities such as software and networking and embarked instead on an extraordinary race to make $140-a-barrel oil (£70) from Saudi Arabia obsolete. “All of us here – everyone in this company and in this industry, are aware of the urgency,” Mr Pal says.

What is most remarkable about what they are doing is that instead of trying to reengineer the global economy – as is required, for example, for the use of hydrogen fuel – they are trying to make a product that is interchangeable with oil. The company claims that this “Oil 2.0” will not only be renewable but also carbon negative – meaning that the carbon it emits will be less than that sucked from the atmosphere by the raw materials from which it is made.

LS9 has already convinced one oil industry veteran of its plan: Bob Walsh, 50, who now serves as the firm’s president after a 26-year career at Shell, most recently running European supply operations in London. “How many times in your life do you get the opportunity to grow a multi-billion-dollar company?” he asks. It is a bold statement from a man who works in a glorified cubicle in a San Francisco industrial estate for a company that describes itself as being “prerevenue”.

Inside LS9’s cluttered laboratory – funded by $20 million of start-up capital from investors including Vinod Khosla, the Indian-American entrepreneur who co-founded Sun Microsystems – Mr Pal explains that LS9’s bugs are single-cell organisms, each a fraction of a billionth the size of an ant. They start out as industrial yeast or nonpathogenic strains of E. coli, but LS9 modifies them by custom-designing their DNA. “Five to seven years ago, that process would have taken months and cost hundreds of thousands of dollars,” he says. “Now it can take weeks and cost maybe $20,000.”

Because crude oil (which can be refined into other products, such as petroleum or jet fuel) is only a few molecular stages removed from the fatty acids normally excreted by yeast or E. coli during fermentation, it does not take much fiddling to get the desired result.

For fermentation to take place you need raw material, or feedstock, as it is known in the biofuels industry. Anything will do as long as it can be broken down into sugars, with the byproduct ideally burnt to produce electricity to run the plant.

The company is not interested in using corn as feedstock, given the much-publicised problems created by using food crops for fuel, such as the tortilla inflation that recently caused food riots in Mexico City. Instead, different types of agricultural waste will be used according to whatever makes sense for the local climate and economy: wheat straw in California, for example, or woodchips in the South.

Using genetically modified bugs for fermentation is essentially the same as using natural bacteria to produce ethanol, although the energy-intensive final process of distillation is virtually eliminated because the bugs excrete a substance that is almost pump-ready.

The closest that LS9 has come to mass production is a 1,000-litre fermenting machine, which looks like a large stainless-steel jar, next to a wardrobe-sized computer connected by a tangle of cables and tubes. It has not yet been plugged in. The machine produces the equivalent of one barrel a week and takes up 40 sq ft of floor space.

However, to substitute America’s weekly oil consumption of 143 million barrels, you would need a facility that covered about 205 square miles, an area roughly the size of Chicago.
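The article's scale-up figure can be checked directly from the numbers it gives: one barrel per week from 40 square feet of floor space, against 143 million barrels of weekly US consumption. A quick check, using only those stated figures:

```python
# Check of the article's scale-up arithmetic: floor space needed if all
# US weekly oil demand came from barrel-a-week fermenters.
barrels_per_week_us = 143_000_000
sqft_per_barrel_week = 40        # one machine: 1 barrel/week on 40 sq ft
SQFT_PER_SQ_MILE = 5280 ** 2     # 27,878,400 sq ft per square mile

total_sqft = barrels_per_week_us * sqft_per_barrel_week
area_sq_miles = total_sqft / SQFT_PER_SQ_MILE
print(f"{area_sq_miles:.0f} square miles")
```

The arithmetic comes out to about 205 square miles, matching the article's comparison to the area of Chicago.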

That is the main problem: although LS9 can produce its bug fuel in laboratory beakers, it has no idea whether it will be able to produce the same results on a nationwide or even global scale.

“Our plan is to have a demonstration-scale plant operational by 2010 and, in parallel, we’ll be working on the design and construction of a commercial-scale facility to open in 2011,” says Mr Pal, adding that if LS9 used Brazilian sugar cane as its feedstock, its fuel would probably cost about $50 a barrel.

Are Americans ready to be putting genetically modified bug excretion in their cars? “It’s not the same as with food,” Mr Pal says. “We’re putting these bacteria in a very isolated container: their entire universe is in that tank. When we’re done with them, they’re destroyed.”

Besides, he says, there is greater good being served. “I have two children, and climate change is something that they are going to face. The energy crisis is something that they are going to face. We have a collective responsibility to do this.”

Oil Price Choke Point

I think this article by columnist Dick Morris, more than all the other hand wringing going on, has got it. The great transition out of the fossil fuel paradigm is underway. The political support is now in place to have the US economy and thereafter the global economy completely retooled away from any reliance on oil in particular and eventually natural gas and coal.

I expected it to take much higher prices to reach this trigger point, but it appears that we are here. The massive financial resources of the modern global economy will now be harnessed to shift us out of the fossil fuel energy trap.

Of course, if the Bussard fusion energy system works out, the transition will be even more abrupt. The transition will appear to be abrupt anyway. The automakers are today cranking out electrics and hybrids. The next generation is next year. It is amazing to me that even manufacturers appear to have gotten on to internet time.

On top of all that, those who have actually read through this blog over the past year will have seen us discover several nifty strategies that are even happening now and many others if successful, that will be a major part of the energy equation within twenty years at most.

To begin with, conversion of the diesel truck fleet over to LNG is a must because this releases a major portion of our oil demand. LNG is not going into anything other than capacity shortage for a very long time. It is still a fossil fuel, but it is a good quick fix for the developing squeeze on global oil supplies. The last oil crisis pushed oil out of the home heating business. This one needs to shift the haulage industry out.

The conversion of the automotive fleet toward less oil dependency is underway. My guess is that it will be able to cut demand there by about half over the next five years.

Several millions of barrels per day of demand can be fairly cut without any special action taken and in a simple response to the current price regime. And that is what we need to give us the five years we need to mobilize the alternatives.

What I have always found delightful is what a generally free liberal economy is able to do when facing a clear cut challenge. It is amazing how swiftly decisions can be made and implemented in the face of sudden death. Now try to imagine any bureaucracy performing like that.

I had a revealing conversation yesterday. A Canadian businessman who has made his life for the past decade in Hong Kong told me that about two years ago the Chinese government veered away from heavy state involvement and is now supporting the Hong Kong style of economy throughout China. It was very much a case of recognizing that China needed more of what Hong Kong has. This also suggests that within a generation we will see Taiwan climb back aboard, as it will not matter anymore. I also now expect electoral systems to start working their way vigorously through the body politic. There is just too much heat to contain and that will blow it off. I also suspect that the political leaders of China are ready to transition to a fully democratic system as the middle class is now emergent and very conservative.

In any case, as I have said before, the oil price regime has shifted from the twenty year $30 mark to the plus $100 mark. This is a three- to fourfold increase in the cost structure and is now sitting at the upper range. We cannot afford this. The rest of the globe cannot afford this. What is more, we now know that we cannot depend solely on the oil industry. During the seventies oil shock, brought on by the initial declines in US production, there was always the Middle East. All those reserves are no longer enough to satisfy projected global energy demand.

A lot of alternative energy strategies become feasible when cheap oil is $100 per barrel. We will soon see that price again because right now, consumers of oil are rapidly adjusting and demand is dropping. So unless we get a serious supply shock of a couple of millions of barrels per day, we will retreat from the $140 range. The long side speculators will get creamed and we will have a serious oversold market very quickly. The range may even open up to $60 to $140 before stabilizing at the $100 mark.

For those who like history, the seventies oil crisis peaked at $40 and crashed to $12. That was still four times the sixties bench mark of less than $3 and the real market shook out around $20 for the next twenty years. As I said, an accident could take us to $200, but it is now more likely we will see $60 first.

McCain Scores Big With Offshore Oil Drilling Proposal

Thursday, June 19, 2008 2:06 PM
By: Dick Morris & Eileen McGann

John McCain has drawn first blood in the political debate following Barack Obama's victory in the primaries. His call yesterday for offshore oil drilling — and Bush's decision to press the issue in Congress — puts the Democrats in the position of advocating the wear-your-sweater policies that made Jimmy Carter unpopular.

With gas prices nearing $5, all of the previous shibboleths need to be discarded. Where once voters in swing states like Florida opposed offshore drilling, the high gas prices are prompting them to reconsider. McCain's argument that even Hurricane Katrina did not cause any oil spills from the offshore rigs in the Gulf of Mexico certainly will go far to allay the fears of the average voter.

For decades, Americans have dragged their feet when it comes to switching their cars, leaving their SUVs at home, and backing alternative energy development and new oil drilling. But the recent shock of a massive surge in oil and gasoline prices has awakened the nation from its complacency. The soaring prices are the equivalent of Pearl Harbor in jolting us out of our trance when it comes to energy.

Suddenly, everything is on the table. Offshore drilling, Alaska drilling, nuclear power, wind, solar, flex-fuel cars, plug-in cars are all increasingly attractive options and John McCain seems alive to the need to go there while Obama is strangely passive. During the Democratic primary, he opposed a gas tax holiday and continues to be against offshore and Alaska drilling and squishy on nuclear power.

That leaves turning down your thermostat and walking to work as the Democratic policies.

McCain has also been ratcheting up his attacks on oil speculators. With the total value of trades in oil futures soaring from $13 billion in 2003 to $260 billion today, it is increasingly clear that it is not the supply and demand for oil which is, alone, driving up the price, but it is the supply and demand for oil futures which is stoking the upward movement.

The Saudis have made a fatal mistake in not forcing down the price of oil.

We could have gone for decades as their hostage, letting their control over our oil supplies choke us while enriching them. But they got greedy and let the price skyrocket. The sudden shock which has sent America reeling is just the stimulus we need for a massive movement away from imported oil and toward new types of cars.

The political will for major change in our energy policy is now here and those, like Obama, who don't get it need to rethink their positions. To quote FDR, “this great nation calls for action and action now” on the energy issue.

What has been a back-burner problem now has moved onto center stage and McCain has put himself in the forefront.

The Democratic ambivalence stems from liberal concerns about climate change. The party basically doesn't believe in carbon-based energy and, therefore, opposes oil exploration.

That's why Obama pushes the windfall profits tax on oil companies — a step that tells them “you drill, you find oil, and we'll take away your profits.” But Americans have their priorities in order: more oil, more drilling and alternative energy sources, flex-fuel cars, plug-in vehicles and nuclear power.

With his willingness to respond to the gas price crisis with bold measures, McCain shows himself to be a pragmatist while Obama comes off as an ideologue who puts climate change ahead of making it possible for the average American to get to work.

Of course, the high price of gas makes it inevitable that the U.S. will lead the world in fighting climate change.

With $5 gas, Americans will switch en masse to cars that burn less gasoline. Already we have cut our oil consumption by 500,000 barrels a day in the past year (about a 3 percent cut).

The move away from oil will be exponential from here on out, dooming radical Islam and reversing climate change at the same time. But while we are getting new cars, we need more oil and McCain has flanked Obama on this issue. Big time.

Bussard Fusion Energy Advancing

This post by Alan Boyle on the recent progress made on the pioneering work commenced by the late Robert Bussard is heartening.

I have always suspected that solutions to the fusion riddle would take the form of complex geometries with simple convergence points. Unfortunately the only useful mathematical modeling tool available is intuition, and Robert had it more than most.

I expect that my own work on higher ordered cyclic functions and their derivative Pythagoreans will go a long way toward formalizing the geometry and giving grad students a headache.

The cold fusion model could only work if the curvature geometry about densely bound atoms permitted appropriate geodesics that allowed for the possibility of a fusion interaction. That it may have shown up empirically is nice. Designing something that actually works will not be done empirically unless we are simply lucky.

Everyone forgets that the tokamak gained favor for exactly one reason: natural symmetry permitted a high level of modeling to be conducted before the advent of real computer power.

Today we actually have the computer power to tentatively model complex geometries, and using higher-ordered cyclic functions it becomes possible to achieve convergence on exact solutions. This type of work has become possible only recently.

Right now they have a machine working smoothly and the prognosis for scaling looks excellent. The proponents could not ask for better news to give the funders. The next stage should also go ahead, and if the funders are out of their depth, replacements are very findable.

It also sounds like a tenfold scaling will make or break this model. If it works, it would also be the industrial prototype. It would be lovely to wake up one day knowing that a fusion power plant coming to your town shortly will supply your electric car for a small charge.

Fusion quest goes forward

Posted: Thursday, June 12, 2008 7:05 PM by Alan Boyle

Emc2 Fusion's Richard Nebel can't say yet whether his team's garage-shop plasma experiment will lead to cheap, abundant fusion power. But he can say that after months of tweaking, the WB-7 device "runs like a top" - and he's hoping to get definitive answers about a technology that has tantalized grass-roots fusion fans for years.

With $1.8 million in backing from the U.S. Navy, Nebel and a handful of other researchers have been following up on studies conducted by the late physicist Robert Bussard before his death last October - studies that Bussard said promised a breakthrough in fusion energy.

Nebel, who is on leave from Los Alamos National Laboratory, picked up Bussard's mantle at Emc2 Fusion Development Corp. in Santa Fe, N.M., and is trying to duplicate the results that were reported from the last machine Bussard built. The WB-6 device supposedly worked by setting up a high-voltage electrical field that was configured in just the right way to get ions slamming into each other, creating a fusion-fueled plasma.

Unfortunately, WB-6 was destroyed during one of its last scheduled test runs in 2005, and Bussard was never able to build another device. Fortunately, Nebel's five-person team has succeeded in building a new, improved device on a shoestring budget.

"We're kind of a combination of high tech and Home Depot, because a lot of this stuff we make ourselves," Nebel told me today. "We're operating out of a glorified garage, but it's appropriate for what we're doing."

The Emc2 team has been ramping up its tests over the past few months, with the aim of using WB-7 to verify Bussard's WB-6 results. Today, Nebel said he's confident that the answers will be forthcoming, one way or the other.

"We're fully operational and we're getting data," Nebel said. "The machine runs like a top. You can just sit there and take data all afternoon."

So was Bussard correct? Will it be worth putting hundreds of millions of dollars into a larger-scale demonstration project, to show that Bussard's Polywell concept could be a viable route to fusion power?

No answers just yet


Nebel said it's way too early to talk about the answers to those questions. For one thing, it's up to the project's funders to assess the data. Toward that end, an independent panel of experts will be coming to Santa Fe this summer to review the WB-7 experiment, Nebel said.

"We're going to show them the whole thing, warts and all," he said.

Because of the complexity, it will take some interpretation to determine exactly how the experiment is turning out. "The answers are going to be kind of nuanced," Nebel said.

The experts' assessment will feed into the decision on whether to move forward with larger-scale tests. Nebel said he won't discuss the data publicly until his funders have made that decision.

For now, Nebel doesn't want to make a big deal out of what he and his colleagues are finding. He still remembers the controversy and the embarrassments that were generated by cold-fusion claims in 1989.

"All of us went through the cold-fusion experiences, and before we say too much about this, we want to have it peer-reviewed," he said.

At the same time, he can't resist talking about how well WB-7 is operating. "I've been very pleased, frankly, with the sorts of things we've been getting out of it," Nebel said.

High hopes for low-cost fusion


Nebel may be low-key about the experiment, but he has high hopes for Bussard's Polywell fusion concept. If it works the way Nebel hopes, the system could open the way for larger-scale, commercially viable fusion reactors and even new types of space propulsion systems.

"We're looking at power generation with this machine," Nebel said. "This machine is so inexpensive going into the 100-megawatt range that there's no compelling reason for not just doing it. We're trying to take bigger steps than you would with a conventional fusion machine."

Over the next decade, billions of dollars are due to be spent on the most conventional approach to nuclear fusion, which is based on a magnetic confinement device known as a tokamak. The $13 billion ITER experimental plasma project is just starting to take shape in France, and there's already talk that bigger budgets and longer timetables will be required.

If the Polywell system's worth is proven, that could provide a cheaper, faster route to the same goal - and that's why there's a groundswell of grass-roots interest in Nebel's progress. What's more, a large-scale Polywell device could use cleaner fusion fuels - for example, lunar helium-3, or hydrogen and boron ions. Nebel eventually hopes to make use of the hydrogen-boron combination, known as pB11 fusion.

"The reason that advanced fuels are so hard for conventional fusion machines is that you have to go to high temperatures," Nebel explained. "High temperatures are difficult on a conventional fusion machine. ... If you look at electrostatics, high temperatures aren't hard. High temperatures are high voltage."

Most researchers would see conventional tokamak machines as the safer route to commercial fusion power. There's a chance that Bussard's Polywell dream will prove illusory, due to scientific or engineering bugaboos yet to be revealed. But even though Nebel can't yet talk about the data, he's proud that he and his colleagues at Emc2 have gotten so far so quickly.

"By God, we built a laboratory and an experiment in nine months," he said, "and we're getting data out of it."

Industrial Charcoal Breakthrough

I have copied this material from the Renewable Resources Research Laboratory. It is very important, even though they are still in the post-prototyping stage and certainly have a lot more fussing around to do.

We have explored traditional methods and so-called conventional methods for producing charcoal and biochar and have learned both their strengths and shortcomings. The challenge faced was to come up with a basic design that could be operated at the farm gate and also be scaled to handle any feedstock volume. We seem to have at least the first clear-cut demonstration of this principle.

The Q&A part brings home that our economy needs to focus on maximizing charcoal production. The market is clearly unlimited when we accept non-woody plant waste char as a soil remediator.

This also means that forest management can now focus on harvesting waste wood as a matter of course. After all, we are sending a ten-ton truck out into the forest every day with a chipper and taking the annual surplus out of the forest. Forest management demands this for best practice. The daily yield is then brought directly to the local flash carbonizer and either kept in inventory to dry or processed.

The carbonizer uses pressure to force a fast burn. This was a bit unexpected, but the ramifications are obvious. It takes perhaps thirty minutes for the heat to fully penetrate the feedstock. The process is not taken to the point of full reduction which leaves a substantial residue of gases and tars. I assume the light gases provide most of the heat. These gases and residues will become continuously available and can be immediately used to fire a boiler for power generation. Any more than that is likely to be swapping dollars.

The point of this is that we have a reliable productivity for the whole spectrum of feedstocks. The principal feedstock can be waste wood fiber. Seasonal sources of agricultural waste can then be folded into the fuel stream. Even manure is a good potential feedstock, although moisture content might be a problem.

What is very important is that a farmer can bring his load in, have it treated, and then take the product back to the farm for soil remediation. I think that we will learn that carbonized cattle manure is an excellent soil additive.

That sewage waste also makes an excellent product is no surprise, and it has the added benefit of eliminating the cultural objection to human waste being added to soils.

My earlier arguments for the development of a distributed two lung incinerator system all apply here.

Here we are using pressure but not trying to go to the high pressure methodology that has romanced so many.

I suspect that this is the enabling technology that will now allow industrial scale production of soil charcoal and also a major supply of charcoal suitable for metallurgy and perhaps occasional replacement of coal.

Can you imagine the boreal forests being properly managed for a sustainable crop of waste wood and also a sustainable crop of cattails? This technology makes those types of ideas actually work.

Renewable Resources Research Laboratory

The Renewable Resources Research Laboratory (R3Lab) is a test-bed for the development of innovative technologies and processes for the conversion of biomass into fuels, high-value chemicals, and other products. A consistent theme of the lab's research throughout its history has been the search for new uses for Hawaii's abundant agricultural crops and by-products.

Currently the R3Lab is perfecting the operation of the catalytic afterburner that cleans the effluent of the Flash Carbonization™ Demonstration Reactor. Also, we are producing Flash Carbonization™ charcoal for use in carbon fuel cell research, terra preta and carbon sequestration studies, and metallurgical process research. Finally, we are fabricating an aqueous carbonate/alkaline biocarbon fuel cell that we hope to begin testing soon.

Earlier in its life the R3Lab engaged in research on conversion of biomass into gaseous fuels (hydrogen), and liquid fuels (ethanol). Reprints of publications are available upon request to Prof. M.J. Antal.

Biocarbons (charcoal)

News Item: Recently we began Flash Carbonization™ studies of raw sewage sludge produced in Honolulu's Ewa sewage sludge treatment plant. We were surprised by the ease with which air-dry sewage sludge can be converted into charcoal. We obtained charcoal yields of about 30% (dry basis) from the sewage sludge. The charcoal contained 45-51% ash and 40% fixed-carbon. Studies of the use of sewage sludge charcoal as a soil amendment (i.e., terra preta application) with the side-benefit of carbon sequestration are now beginning at UH. Results of these studies will be reported at the forthcoming AIChE meeting in Philadelphia (November, 2008).

A recent article appeared in the Honolulu Advertiser newspaper about the Flash Carbonization™ process. The following is a question and answer explanation relative to that article.

Q1: You mention that the University has licensed the Flash Carbonization™ technology to several companies that plan to use it for the commercial production of charcoal. Can you tell us who these companies are?

A1: Our three current licensees are Carbon Diversion Corp., which has operations here in Hawaii; the Kingsford Products Company, which most people recognize as the largest manufacturer of BBQ charcoal in the world; and Pacific Carbon and Graphite. Licenses are also being discussed with other companies on the US mainland, in Canada, and elsewhere in the world.

Q2: Isn't charcoal merely a barbeque fuel?

A2: No! Charcoal is the sustainable fuel replacement for coal. Coal combustion is the most important contributor to climate change. Coal combustion adds about 220 lb of CO2 to the atmosphere for every million BTU of energy that it delivers; whereas crude oil adds 170 lb per million BTU, gasoline adds 161 lb per million BTU, and natural gas adds 130 lb of CO2 to the atmosphere per million BTU of delivered energy. On the other hand, the combustion of charcoal - sustainably produced from renewable biomass - adds no CO2 to the atmosphere! Thus, the replacement of coal by charcoal is among the most important steps we can take to ameliorate climate change.
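Those emission factors make for a quick back-of-the-envelope comparison. A sketch in Python; the 33% plant efficiency and year-round operation are my own assumptions for illustration, not figures from the Q&A:

```python
# CO2 emission factors quoted above (lb CO2 per million BTU delivered).
emissions_lb_per_mmbtu = {
    "coal": 220,
    "crude oil": 170,
    "gasoline": 161,
    "natural gas": 130,
    "charcoal (sustainable)": 0,  # net of the CO2 the biomass absorbed while growing
}

def annual_co2_tons(fuel, mmbtu_per_year):
    """Short tons of CO2 emitted per year when delivering mmbtu_per_year from fuel."""
    return emissions_lb_per_mmbtu[fuel] * mmbtu_per_year / 2000

# Rough fuel burn for a 180 MW coal plant: assume 33% thermal efficiency and
# round-the-clock operation (both my assumptions).
mmbtu = 180_000 / 0.33 * 3412 * 8760 / 1e6   # kW / eff * BTU/kWh * h/yr / 1e6
print(f"{annual_co2_tons('coal', mmbtu):,.0f} tons CO2 per year")
```

On those assumptions the Oahu plant mentioned in the next answer would emit on the order of a couple million tons of CO2 a year, all of which charcoal firing would avoid.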

Q3: Do we burn coal to generate electrical power here in Hawaii?

A3: Yes! In the year 2000 we operated a 180 MW coal fired power plant on Oahu, a 22 MW power plant on the Big Island, and a 12 MW coal and bagasse fired power plant on Maui. The HC&S Puunene power plant on Maui could be the perfect starting point for replacement of coal by charcoal. The highest priority for knowledgeable people who care about the environment is the replacement of coal by cleaner, renewable fuels.

Q4: Why doesn't the combustion of charcoal add to the CO2 burden of the atmosphere and thereby cause climate change?

A4: The combustion of charcoal does not add to the CO2 burden of the atmosphere because charcoal is produced from waste wood, crop residues, and other renewable biomass that would otherwise decompose (i.e. rot) in a landfill or in the ground and become CO2. Thus the combustion of charcoal is a small part of nature's great carbon cycle upon which life depends.

Q5: Does the replacement of coal by charcoal have other benefits?

A5: Yes! Coal is laden with mercury and sulfur. Mercury is a deadly toxin. Mercury from the combustion of coal in China has been found in fish taken from the Great Lakes in the USA. Thus mercury emissions can be windborne and carried across continents and oceans. New regulations concerning the release of mercury to the atmosphere may greatly increase the cost of electric power generation by coal combustion. Similarly, because coal is also laden with sulfur, the combustion of coal leads to the release of sulfur oxides into the atmosphere. Sulfur oxides are a principal cause of acid rain. In contrast, charcoal contains no mercury and virtually no sulfur. In fact, our drug stores sell charcoal tablets to eat as an aid for digestion! Moreover, on a pound per pound basis, charcoal contains much more energy than most coals.

Q6: Are you suggesting that charcoal would be a better choice than ethanol for the sustainable production of electric power here in Hawaii?

A6: Yes! From the standpoints of resource utilization, energy efficiency, and economics, charcoal is preferred over ethanol as a fuel for electric power generation.

Here in Hawaii we have an abundance of macadamia nut shells and husks, green wastes including tree trimmings, wood logs, coconut shells and husks, and increasing amounts of invasive species (e.g. gorse wood, strawberry guava) that should be contained or eradicated. These biomass feedstocks cannot be converted into ethanol in a practical process, but they are all ideal feedstocks for the production of charcoal.

Likewise, as a result of seed corn production, we also have significant amounts of corn stover including cobs. If this stover is converted into charcoal, the charcoal retains 59% of the energy content of the stover. This energy conversion efficiency (i.e. 660 lb charcoal per ton of dry stover) has been proven in the commercial scale Flash Carbonization™ reactor that is in operation on the UH campus. On the other hand, if the corn stover is converted into ethanol, the conversion efficiency is projected to be only 43% (i.e. 85 gal of ethanol per ton of stover). We emphasize that this efficiency is an optimistic projection, since the conversion of corn stover into ethanol is unproven on a commercial scale. Thus a ton of corn stover will deliver 37% more energy if it is converted into charcoal instead of ethanol. That's 37% more energy available for the generation of electric power!
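The stover arithmetic is easy to check. A sketch using the quoted efficiencies, plus a cross-check with heating values I have assumed myself (charcoal roughly 13,000 BTU/lb, dry stover roughly 7,500 BTU/lb; the text does not state these):

```python
# Quoted energy conversion efficiencies for one dry ton of corn stover.
charcoal_fraction = 0.59   # 660 lb charcoal per dry ton (proven at scale)
ethanol_fraction = 0.43    # 85 gal ethanol per dry ton (optimistic projection)

advantage = charcoal_fraction / ethanol_fraction - 1
print(f"charcoal delivers {advantage:.0%} more energy")   # the 37% in the text

# Cross-check of the 59% figure with assumed heating values:
# 660 lb charcoal * 13,000 BTU/lb over 2,000 lb stover * 7,500 BTU/lb.
print(f"{660 * 13_000 / (2_000 * 7_500):.0%}")  # comes out close to the quoted 59%
```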

Furthermore, because of tax incentives ethanol does not appear to be an expensive fuel. But appearances can be deceptive, especially in light of the relatively low energy content of a gallon of ethanol. The current rack price nationwide of a gallon of ethanol is $2.40. Reflecting the low energy content of ethanol this price is $3.66 per gallon of gasoline equivalent (i.e. $31.75 per million BTU). Note that this price includes no taxes. The comparable price of imported charcoal is $279/ton, or $0.79 per gallon of gasoline equivalent (i.e. $10.48 per million BTU). Thus the price of ethanol is 3 times more expensive than charcoal! Also, note that the production of charcoal enjoys no Federal tax credits; nevertheless, on an energy basis charcoal is about 87% the price of crude oil at $70 per bbl ($12 per million BTU). Given the high price of ethanol, Hawaiian consumers of electric power should contemplate the very large increase in their energy cost adjustment that will appear if ethanol begins to be used as a boiler fuel to generate electricity!
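The whole price comparison boils down to converting each fuel's price to dollars per million BTU. A sketch; the heating values below are my assumptions, chosen to reproduce the quoted $31.75 and $10.48 figures rather than taken from the text:

```python
def usd_per_mmbtu(price_usd, btu_per_unit):
    """Convert a fuel price (per gallon, per ton, ...) to dollars per million BTU."""
    return price_usd / btu_per_unit * 1e6

# $2.40/gal ethanol at an assumed ~75,600 BTU/gal; $279/ton charcoal at an
# assumed ~26.6 million BTU/ton.
ethanol = usd_per_mmbtu(2.40, 75_600)    # ~$31.75/MMBTU, as quoted
charcoal = usd_per_mmbtu(279, 26.6e6)    # ~$10.48/MMBTU, as quoted
print(round(ethanol, 2), round(charcoal, 2), round(ethanol / charcoal, 1))
```

The last number reproduces the roughly 3-to-1 price gap the answer describes.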

Q7: What about HECO's plan to use biodiesel produced from imported palm oil to fuel a 110 MW power plant in the Campbell Industrial Park?

A7: Biodiesel fuel (B100) is manufactured from vegetable oils. Anyone who purchases vegetable oils for salads or cooking knows that these oils are expensive. Thus, common sense leads us to expect that B100 - manufactured from soy, canola, rapeseed, or palm oil - will be expensive. The Union of Concerned Scientists predicts that the price of B100 will be double that of diesel fuel. The National Renewable Energy Laboratory estimates the price of B100 from soy oil will exceed $2.40 per gallon, and from canola oil will exceed $3.00 per gallon. In Seattle the recent price of B100 was $3.29 per gallon.

If we make the optimistic assumption that B100 will cost $3.00 per gallon, HECO will be paying about $25 per million BTU (or $2.89 per gallon of gasoline equivalent without taxes) for its boiler fuel. On an energy basis, this is 2.4 times the comparable price of charcoal. As with ethanol, Hawaiian electric power consumers will be shocked by the energy cost adjustment that will be added to their bill when B100 is burned to generate electric power.

Q8: Is charcoal being used for the commercial production of electric power anywhere?

A8: Yes! The largest charcoal producer in Europe, the Carbo Group BV, has sold substantial quantities of wood charcoal to ESSENT for co-firing in their Borselle coal fired station.

Q9: If charcoal is so inexpensive, can a business make a profit producing charcoal?

A9: Yes! The cost of producing a ton of charcoal in the USA is usually much less than $200, depending upon the local cost of biomass and labor. The wholesale price of charcoal imports during 2006 was $279 per ton. Obviously, the production of charcoal is a very profitable enterprise.

Q10: Does charcoal have any uses besides fuel for barbeque and electric power generation?

A10: Yes! Iron, steel, and ferrosilicon alloys are all produced using a carbon reductant. Almost one pound of carbon is consumed to produce a pound of steel. In the USA coal ("coke") is used as the carbon reductant and this use of coal adds substantial amounts of CO2 to the atmosphere and is an important contributor to climate change. Brazil and Norway use charcoal instead of coal to produce iron, steel, and ferrosilicon alloys. As the steel industry moves to reduce its carbon footprint, the demand for charcoal as a substitute for reductant coal will explode.

Also, here in Hawaii charcoal is an important ingredient in potting soils, and is the preferred rooting medium for orchids. Moreover, the addition of charcoal to soil has been shown to greatly enhance the growth of corn and other cash crops. This use of charcoal to enrich the soil is attracting much attention around the world as a practical means for permanently sequestering carbon from the atmosphere.

Q11: Will Kingsford manufacture charcoal here in Hawaii?

A11: No. A local company, Carbon Diversion Corp., has exclusive rights to manufacture charcoal using the UH Flash Carbonization™ process here in Hawaii and in parts of the Pacific basin. Carbon Diversion is headed by Michael Lurvey, a prize-winning MBA graduate of the UH Business School and a Vietnam veteran.

Q12: What does this all mean for Carbon Diversion?

A12: The markets for charcoal as a boiler fuel for the sustainable production of electricity, a reductant for the sustainable production of metals, and a soil amendment for sustainable agriculture and carbon sequestration are enormous. We believe that Carbon Diversion is destined to become one of the Exxons of the 21st century.

Flash Carbonization™ process

Research at the University of Hawaii (UH) has led to the discovery of a new Flash Carbonization™ process that quickly and efficiently produces biocarbon (i.e., charcoal) from biomass. This process involves the ignition of a flash fire at elevated pressure in a packed bed of biomass. Because of the elevated pressure, the fire quickly spreads through the bed, triggering the transformation of biomass to biocarbon. Fixed-carbon yields of up to 100% of the theoretical limit can be achieved in as little as 20 or 30 minutes. (By contrast, conventional charcoal-making technologies typically produce charcoal with carbon yields of much less than 80% of the theoretical limit and take from 8 hours to several days.) Feedstocks have included woods (e.g., leucaena, eucalyptus, and oak), agricultural byproducts (e.g., macadamia nutshells, corncobs, and pineapple chop), wet green wastes (e.g., wood sawdust and Christmas tree chips), various invasive species (e.g., strawberry guava), and synthetic materials (e.g., shredded automobile tires). Results of many of these tests are described in a series of technical, peer-reviewed, archival journal papers that can be obtained by request to Prof. M.J. Antal.
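For readers following the yield claim: fixed-carbon yield is simply the fixed carbon recovered in the charcoal per unit of dry feed, compared against a theoretical limit. A minimal sketch with illustrative numbers of my own (the 75% fixed-carbon content and the ~28% theoretical limit are assumptions for a typical wood, not figures from the lab):

```python
def fixed_carbon_yield(char_mass, char_fixed_carbon_frac, dry_feed_mass):
    """Fixed carbon recovered in the charcoal per unit mass of dry feed."""
    return char_mass * char_fixed_carbon_frac / dry_feed_mass

# Illustrative: 0.30 kg of charcoal at 75% fixed carbon from 1 kg of dry feed,
# measured against an assumed theoretical limit of 28%.
y = fixed_carbon_yield(0.30, 0.75, 1.0)          # 22.5% on a dry-feed basis
print(f"{y / 0.28:.0%} of the theoretical limit")
```

On those illustrative numbers the charcoal captures about four-fifths of the theoretically available fixed carbon, which is the kind of figure the Flash Carbonization™ claims should be read against.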

We are now testing a commercial-scale, stand-alone (off-the-grid) Flash Carbonization™ Demonstration Reactor ("Demo Reactor") on campus (see photos below). The first successful test occurred on 24 November 2006. A canister full of corn cobs was carbonized in less than 30 min. This test proved that the Flash Carbonization™ process can be scaled-up to commercial size.

Recently HNEI received a two-year $215,000 research grant from the Consortium on Plant Biotechnology Research (CPBR) for "Flash Carbonization™ Catalytic Afterburner Development." The CPBR funding will enable us to make progress towards the elimination of smoke and tar from the reactor's effluent.

After we satisfy all emissions regulations, the University will team with its licensee for the State of Hawaii (Carbon Diversion Corporation) and use the equipment to convert green wastes into charcoal. Large, dense, green waste feedstocks (e.g., tree logs, coconut shells) will be marketed as barbeque charcoal. Lighter material (e.g., tree trimmings and macadamia shells) will be marketed as orchid potting soil (see below). Some charcoal may also be marketed as an ultra-clean coal. Synthetic materials (e.g., shredded automobile tires and other shredded synthetics) may also be carbonized. When it is fully operational, the Demo Reactor will have the capacity to produce about 10 tons of charcoal per 24 hr day and provide employment to two or three workers. The capital cost of the Flash Carbonization™ Demonstration Reactor (pictured above), gantry, and two canisters was about $200,000 (including delivery to Honolulu).
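Putting the Demo Reactor's numbers together with the Q9 figures gives a crude payback estimate. A sketch; full 365-day utilization and zero feedstock bottlenecks are my assumptions, not claims from the lab:

```python
# Figures quoted above and in Q9.
capex = 200_000          # reactor, gantry, and two canisters ($)
tons_per_day = 10        # stated capacity when fully operational
price = 279              # 2006 wholesale price of imported charcoal ($/ton)
cost = 200               # upper end of the stated US production cost ($/ton)

daily_margin = tons_per_day * (price - cost)
print(round(capex / daily_margin), "days to recover the capital cost")
```

Even at the conservative end of the cost range, the capital cost is recovered in well under a year of full operation, which is consistent with the Q&A's claim that charcoal production is very profitable.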

The Flash Carbonization™ technology is protected by U.S. Patent No. 6,790,317. The UH has applied for patents on the Flash Carbonization™ process in many other countries, and these patents are pending. The first license for charcoal production was signed in 2003. Since then Carbon Diversion Corp. has acquired an exclusive license for the State of Hawaii and various islands in the Pacific region, and the Kingsford Products Co. has acquired a license. Inquiries from other firms, in the US and around the world, are welcome. The University's licensing approach has been to grant territorial exclusivity with regard to production of Flash Carbonization™ charcoal and non-exclusive rights to sell such charcoal worldwide. All licensing activity is handled by the Office of Technology Transfer and Economic Development (OTTED).

Based on this prior experience, we recommend that a potential licensee take the following steps to determine if the Flash Carbonization™ process represents an attractive technology for adding value to locally available biomass feedstocks.

Contact Professor Michael J. Antal, Jr. and provide information on the proposed region for practice of the technology, the feedstock characteristics, etc. The potential licensee should have the ambition and the ability to produce and market at least 10,000 tons per year of charcoal.

Visit Professor Antal and Richard Cox (OTTED) to discuss license terms. The potential licensee should have significant engineering expertise.

Test the proposed feedstock's carbonization behavior in the Lab Reactor. This test costs $1000 and typically requires about 1 month to complete (including the shipping time of the biomass sample to Hawaii).

If you plan to use the FC process to produce charcoal in a region where the UH holds patents (i.e., the USA, Canada, Japan, Australia, and most of the EU), or if you plan to sell FC charcoal in a region where the UH has patent protection, then you must negotiate a license with OTTED. You should begin with a term sheet that summarizes the key elements of the license. These elements will include the territory for practice of the technology, the charcoal markets, the license fee, the running royalty rate, and milestones. Thereafter, you should negotiate a license with OTTED.

HNEI will assist licensees of its Flash Carbonization™ technology by offering apprenticeship training that includes a parts list, drawings, an operator's manual, and intensive training in the operation of Flash Carbonization™ reactors that are now being operated and tested on campus. The apprenticeship program can accommodate more than one apprentice from a single licensee; it is scheduled at the mutual convenience of the licensee and HNEI, and it has a typical duration of 3 weeks. Note that the apprenticeship program is only offered to licensees of the technology.

Biocarbon Fuel Cells

HNEI researchers have fabricated and tested a moderate-temperature, aqueous-alkaline "direct" carbon fuel cell that "burns" charcoal as its fuel and directly generates electricity. The exciting results of this research are described in a technical paper that has just been published. The abstract of the paper appears below. Reprints of this paper are available upon request to Prof. M. J. Antal. Currently Prof. Antal and his team are fabricating an aqueous carbonate/alkaline biocarbon fuel cell that has been designed to achieve higher voltages and power densities. We hope to begin testing this new carbon fuel cell later this year.

Abstract

Because the carbon fuel cell has the potential to convert the chemical energy of carbon into electric power with an efficiency approaching 100%, there has been a keen interest in its development for over a century. A practical carbon fuel cell requires a carbon feed that conducts electricity and is highly reactive. Biocarbon (carbonized charcoal) satisfies both these criteria, and its combustion does not contribute to climate change. In this paper we describe the performance of an aqueous-alkaline biocarbon fuel cell that generates power at temperatures near 500 K. Thermochemical equilibrium favors the reduction of oxygen on the cathode at temperatures below 500 K; whereas the chemical kinetics of the oxidation of carbon by hydroxyl anions in the electrolyte demands temperatures above 500 K. Nevertheless, an aqueous-alkaline cell operating at 518 K and 35.8 bar was able to realize an open-circuit voltage of 0.57 V, a short-circuit current density of 43.6 mA/cm², and a maximum power of 19 mW by use of a 6 M KOH/1M LiOH mixed electrolyte with a catalytic Ag screen/Pt foil cathode and an anode composed of 0.5 g of compacted corncob charcoal previously carbonized at 950 °C. A comparison of Temperature Programmed Desorption (TPD) data for the oxidized biocarbon anode material with prior work suggests that the temperature of the anode was too low: carbon oxides accumulated on the biocarbon without the steady release of CO2 and active sites needed to sustain combustion; consequently the open-circuit voltage of the cell was less than the expected value (1 V). Carbonate ions, formed in the electrolyte as a product of the reaction of CO2 with hydroxyl ions, can halt the operation of the cell. We show that the carbonate ion is not stable in hydrothermal solutions at 523 K and above; it decomposes by the release of CO2 and the formation of hydroxyl anion. 
Consequently it should be possible to regenerate the electrolyte by use of reaction conditions similar to those employed in the fuel cell. We believe that substantial improvements in performance can be realized from an aqueous-alkaline cell whose cathode is designed to operate at temperatures significantly below 500 K, and whose biocarbon anode operates at temperatures significantly above 500 K.
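As a rough sanity check on the reported cell performance: if one assumes an ideal linear current-voltage curve (an idealization that real cells deviate from), maximum power density is V_oc × j_sc / 4, and the reported 19 mW then implies an effective anode area of a few square centimeters. The linear assumption is mine, not the paper's:

```python
# Figures reported in the abstract.
v_oc = 0.57    # open-circuit voltage, V
j_sc = 43.6    # short-circuit current density, mA/cm^2
p_max = 19.0   # reported maximum power, mW

# For a linear I-V curve the maximum power point sits at V_oc/2 and j_sc/2,
# so peak power density is V_oc * j_sc / 4.
p_density = v_oc * j_sc / 4          # mW/cm^2 under the linear idealization
implied_area = p_max / p_density     # effective anode area, cm^2
print(round(p_density, 1), "mW/cm^2;", round(implied_area, 1), "cm^2")
```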

High-Yield Activated Carbons from Biomass

Activated carbons made from biomass (i.e., coconut shells) charcoal are used to purify water and air. The R3Lab has developed an air oxidation process that produces high-yield activated carbons from biomass charcoal. This work was supported by the National Science Foundation.



Hydrogen Production from Biomass

A conventional method for hydrogen production from fossil fuels involves the reaction of water with methane (steam reforming of methane) at high temperatures in a catalytic reactor. Research sponsored by the U.S. Department of Energy led to the development of a process by the R3Lab for hydrogen production by the catalytic gasification of biomass in supercritical water (water at high temperature and pressure). This "steam reforming" process produces a gas at high pressure (>22 MPa) that is unusually rich in hydrogen. Researchers at the Institut fur Technische Chemie CPV, Forschungszentrum Karlsruhe in Karlsruhe, Germany are commercializing a biomass gasification process that employs the conditions identified by the R3Lab in its pioneering work. However, research on this topic within HNEI halted after a U.S. Department of Energy economic study projected dismal economics for the process.



Biomass Pretreatments for the Production of Ethanol and Cellulose

The R3Lab has been a leader in the development of a pretreatment process that employs hot liquid water to render lignocellulosic biomass susceptible to simultaneous saccharification and fermentation for the production of ethanol. This process can also be used to produce microcrystalline cellulose from biomass. The research was supported by the U.S. Department of Agriculture and the Consortium for Plant Biotechnology Research.

Contact: Michael J. Antal, Jr.