Dr. Gideon Polya on End of Coal Power

Dr Gideon Polya makes, and documents, the very good point that alternative energy strategies are converging in cost and price with established fossil fuel systems. All these systems had the potential to do this or they would simply not have been pursued in the first place. That thirty years of development and research effort is visibly paying off is gratifying.

Dr Gideon Polya is an apologist for a fair bit of left-of-center political thinking, but he has come through here with a tidy bit of homework on the developing trends in alternative energy systems.

The most visible at present are the thousands of windmills now built and being built. They are well on the way to providing ten percent of global grid power. Further technical progress, such as the axial motor, promises to add a couple of points there, and the addition of vanadium redox batteries can expand wind's usage as a sole-source power system for large industrial installations. These ideas are all possible because wind power is simply becoming price competitive.

Cheap solar installations are also soon to become very visible and the same comments apply.

Tidal and wave power are still at the stage that windmill power was at thirty years ago, and we will be waiting those thirty years before they may become important.

All these systems need some form of non-consumable energy storage. That is becoming available with the emergence of vanadium redox batteries.

What is important in this presentation is that all this is now becoming directly competitive with coal-fired power plants. Bluntly put, coal is now on the road to extinction, and it will happen over a generation at most. The coal industry was preserved as a power source solely because of price and easy availability. Windmills are even more available because of both modular design and much less stringent permitting needs. The same will hold true for cheap solar now coming on stream.

I could find no way to link the artwork, but you may go to this link to see it.

http://mwcnews.net/content/view/26137&Itemid=1

“One Day Pathétique” Symphony Painting

Tchaikovski’s final “Pathétique” Symphony is actually Joyous and Hopeful as is the energy cost cross-over point - the best renewable and geothermal power options now cost the same as coal power.

I have painted a huge (1.3 metre x 2.9 metre) painting entitled “One Day Pathétique” that captures Pyotr Ilyich Tchaikovski’s final Symphony Number 6, the so-called “Pathétique”. Tchaikovski was delighted with this symphony but because he died a fortnight after its first performance the symphony has become known as the “Pathétique”. There is, however, a joyous and hopeful interpretation of this great work as opposed to the traditional sad, pathos-laden view. One Christmas morning I listened to Tchaikovski’s Symphony Number 6 as the sun was rising over the beautiful Yarra River Valley and I suddenly realized why Tchaikovski was so pleased with his final symphony – it clearly describes one day in the life of a human being. The Symphony is in 4 Movements and is famous for starting off very gently (as one slowly wakes up) and for concluding so softly that you strain to hear the last notes (as one drifts off to sleep after a full day). In between, the Symphony successively evokes the early morning, morning activities, noon time glory, the march of shadows in the late afternoon, and the glory of the stars. Once you realize the One Day interpretation everything fits so beautifully that you can really understand why Tchaikovski was so delighted with the symphony and regarded it as his finest work.

My “One Day Pathétique” Symphony Painting is also conveniently divided into 4 Sections (evoking the 4 Movements of the Symphony) which are determined by the 4 major overlapping circles of a classical Italian Renaissance Double Golden Rectangle geometry. I use beautiful young maidens as figurative, almost abstract expressionist elements to tell the story of a lovely day. Reading Left to Right, the maidens slowly wake up (as do birds and flower buds); they then go into the woods gathering wood, berries and flowers; midday sees them busily engaged in the meadows (as are rabbits and birds); and in the evening the glorious stars come out, the birds roost, night birds fly out to hunt, and the maidens rest and finally drift off to sleep.

My “One Day Pathétique” Symphony Painting is about joy, beauty, nature and hope – and is ideal for illustrating the Good News in the current Climate Emergency. Just as my painting was inspired by the Sun coming up over the beautiful Yarra Valley, so have I been galvanized by our local Yarra Valley Climate Action Group that is trying to help save the Planet from man-made global warming. Many of my colleagues in the Yarra Valley Climate Action Group (and in its umbrella Climate Emergency Network and Climate Movement national umbrella groups) are deeply despondent about the notorious Australian inaction in the mounting Climate Emergency. Australia is the world’s biggest coal exporter and one of the world’s worst per capita greenhouse gas polluters but its extreme right wing, pro-Coal Government is strong on rhetoric but lacking in action. The $100 billion per annum Australian coal industry has effectively squashed sensible action and determined the resolutely pro-Coal stance of the major political parties (the Federally-governing Labor and the Opposition Liberal-National Party Coalition, these being collectively known as Lib-Labs). I am nevertheless optimistic and tell my colleagues that science, technology, sensible risk management and sensible business practice will eventually triumph over neocon greed, irresponsibility, denial, ignoring, climate scepticism, climate racism, climate terrorism, climate genocide, dishonesty and spin.

Man-made (anthropogenic) greenhouse gas (GHG) pollution from fossil fuel burning, methanogenic livestock production, other agriculture (notably major crop-based biofuel generation) and deforestation has lifted the atmospheric GHG concentration to a dangerous level. Top US climate scientist Dr James Hansen (Director, NASA Goddard Institute for Space Studies) has declared that “If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm” (see: here), a view recently essentially endorsed by the Head of the Nobel Prize-winning UN IPCC, Dr Rajendra Pachauri.

In 2007 I published an assessment of the relative costs of power from a variety of sources (see: here). A key observation was that a Canadian Ontario Government-commissioned study had found that the “true cost” of coal burning-based power (taking the human and environmental impact into account) was 4-5 times the “market price”. Already in 2007 it was apparent that the cost of power from all the lower-cost renewable energy systems (in cents per kilowatt hour) was LOWER than the “true cost” of coal burning-based power.

However, as summarized below, it is now apparent that a crucial CROSS-OVER POINT has been reached at which the cost of power from a range of lower-cost renewable sources is about the SAME as the “market price” of coal burning-based power. In the analysis below the COST OF POWER (e.g. as measured in units such as US cents per kilowatt hour or US$ per megawatt hour) is given for a variety of non-carbon energy sources, together with an estimate of the MAGNITUDE of the various renewable and/or non-carbon energy resources. For detailed documentation of the information given below the reader is directed to a summary provided (with numerous links) by the Yarra Valley Climate Action Group entitled “CROSS-OVER POINT: Best Renewable and Geothermal Power NOW for SAME COST as Fossil fuel-based Power”.

1. Wind Power

According to NOVA Science in the News (published by the prestigious Australian Academy of Science, 2008): “advances in wind power science and technology are reducing the cost of wind power to a point at which it is becoming competitive with many other energy sources (at about 8 Australian cents per kilowatt hour)” [i.e. 5.6 US cents per kilowatt hour (kWh) or US$56 per megawatt hour (MWh)]. According to the British Wind Energy Association (BWEA) the average cost of onshore wind power in the UK (2005) was 3.2 p/kWh [i.e. 5.2 US cents per kilowatt hour or US$52 per megawatt hour (MWh)]. According to the US Energy Information Administration the cost per unit of energy produced from wind was estimated in 2006 to be comparable to the cost of new generating capacity in the United States for coal and natural gas: wind cost was estimated at US$55.80 per MWh [megawatt hour], coal at US$53.10 per MWh and natural gas at US$52.50 per MWh.
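Since several currencies and units are in play here, a quick sanity check on the arithmetic may help. This is a minimal sketch only; the exchange rates (roughly 0.70 USD per AUD and 1.63 USD per GBP, circa 2008) are the rates implied by the article's own figures and are assumptions here.

    def to_usd_per_mwh(local_cents_per_kwh, usd_per_local_unit):
        # 1 MWh = 1000 kWh and 100 cents = 1 dollar, so cents/kWh x 10 = $/MWh
        usd_cents_per_kwh = local_cents_per_kwh * usd_per_local_unit
        return usd_cents_per_kwh * 10.0

    print(to_usd_per_mwh(8.0, 0.70))   # NOVA wind figure: 8 AU c/kWh -> ~US$56 per MWh
    print(to_usd_per_mwh(3.2, 1.63))   # BWEA onshore wind: 3.2 p/kWh -> ~US$52 per MWh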

According to a Stanford University study “global wind power generated at locations with mean annual wind speeds ≥ 6.9 m/s at 80 m is found to be ~72 TW (~54,000 Mtoe [million tons of oil equivalent]) for the year 2000. Even if only ~20% of this power could be captured, it could satisfy 100% of the world’s energy demand for all purposes (6995-10177 Mtoe) and over seven times the world’s electricity needs (1.6-1.8 TW)”.
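The quoted resource estimate can be checked with one line of arithmetic. A minimal sketch, using only the figures given in the quotation above:

    wind_potential_tw = 72.0                  # global wind resource at >= 6.9 m/s, 80 m height
    capturable_tw = 0.20 * wind_potential_tw  # ~14.4 TW if only ~20% could be captured
    for demand_tw in (1.6, 1.8):              # the study's range for world electricity needs
        print(capturable_tw / demand_tw)      # ~9.0 and ~8.0, i.e. "over seven times"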

There is huge potential for off-shore wind power. According to Research and Markets (May 2008; summarizing the Global Wind Power Report 2008): “Wind is the world’s fastest-growing energy source with an average annual growth rate of 29% over the last ten years. In 2007, the global wind power generating capacity crossed 94 gigawatts (GW). This represents a twelve-fold increase from a decade ago, when world wind-generating capacity stood at just over 7.6 gigawatts (GW). Being an emerging fuel source a decade ago, wind energy has grown rapidly into a mature and booming global industry. Further, the power generation costs of wind energy have fallen by 50%, moving closer to the cost of conventional energy sources. The future prospects of the global wind industry are very encouraging and it is estimated to grow by more than 70% over the next five years to reach 160 gigawatts (GW) by year 2012”.

2. Concentrated Solar Power with Energy Storage

The US solar energy company Ausra uses a form of Concentrated Solar Thermal (CST) technology called Compact Linear Fresnel Reflector (CLFR) technology. In short, solar energy is collected and concentrated in a sophisticated way and used to generate steam to drive turbines and hence generate electricity. A key feature is that solar energy is stored, enabling Ausra CLFR plants to generate electricity 24 hours per day. An Ausra factory producing 700 megawatts (MW) of solar collectors annually opened in Nevada in 2008. Ausra is involved in joint construction of a 177 megawatt (MW) CLFR plant for California. According to Ausra (2008): “Ausra's innovations in collector design dramatically reduce the cost of solar thermal generation equipment and bring solar power to prices directly competitive with fossil fuel power. Using Ausra's current solar technologies, all U.S. electric power, day and night, can be generated using a land area smaller than 92 by 92 miles”.

Solar energy hitting the Earth is roughly 10,000 times greater than the energy we consume globally. Global electricity production (2005) was 17,400 TWh. Exciting new research developments on hydrogen fuel cells (at Monash University, Australia) and efficient electrolysis (at the Massachusetts Institute of Technology) presage an efficient, solar energy-based, hydrogen fuel cell-run transportation system within a decade.

3. Wave power

The cost of wave power by the CETO system (a sea bed-fixed pump linked to a buoyant actuator) is about that of wind power. There are further big cost efficiencies if wave power is used for cogeneration of potable water. A Carnegie Corporation submission to an Australian Parliamentary Committee (2007) states that “CETO can offer zero-emission base-load electricity generation capacity at a cost comparable to existing wind power [i.e. about US$50 per MWh] and the capacity to provide potable water to major population centres using 100% clean energy”.

Further, this Submission states: “The World Energy Council has estimated that approximately 2 Terawatts (TW), about double current world electricity production, could be produced from oceans via wave power … It is estimated that 1 million gigawatt hours (GWh) of wave energy hits Australian shores annually, or more than four times Australia’s total annual electricity consumption of 210,000 gigawatt hours (2004 figures)”.

4. Hydro power

According to the New Zealand Ministry of Economic Development (2002) various New Zealand hydroelectric power systems provided power for 4-10 NZ cents per kilowatt hour [2.4-5.9 US cents per kilowatt hour i.e. US$24-59 per megawatt hour].

According to BNET (2007): “Hydropower currently accounts for approximately 20% of the world's electricity production, with about 650,000 MW (650 GW) installed and approximately 135,000 MW (135 GW) under construction or in the final planning stages … It is estimated that only about a quarter of the economically exploitable water resources has been developed to date”.

5. Geothermal power

According to Professor John Veevers (Department of Earth and Planetary Sciences, Macquarie University, Sydney, Australia): “The [Australian hot rocks] geothermal resource extends over 1000 square kilometres … Modelled costs are 4 Australian cents per kilowatt hour, plus half to 1 cent for transmission to grid [4.5 Australian cents per kWh = 3.2 US cents per kWh or US$32 per MWh]. This compares with 3.5 cents for black coal, 4 cents for brown coal, 4.2 cents for gas, but all with uncosted emissions. Clean coal, the futuristic technology of coal gasification combined with CO2 sequestration or burial, yet to be demonstrated, comes in at 6.5 cents, and solar and wind power at 8 cents” (see “The Innamincka hot fractured rock project” in “Lies, Deep Fries & Statistics”, editor Robyn Williams, ABC Books, Sydney, 2007; also see energy cost-related chapters in this book by Dr Gideon Polya, “Australian complicity in Iraq mass mortality”, Dr Mark Diesendorf, “A sustainable energy future for Australia”, and Martin Mahy, “Hydrogen Minibuses”).

According to the Report of an interdisciplinary panel of Massachusetts Institute of Technology (MIT) experts entitled “The Future of Geothermal Energy” (2006): “EGS [Enhanced Geothermal Systems] is one of the few renewable energy resources that can provide continuous base-load power with minimal visual and other environmental impacts … The accessible geothermal resource, based on existing extractive technology, is large and contained in a continuum of grades ranging from today’s hydrothermal, convective systems through high- and mid-grade EGS resources (located primarily in the western United States) to the very large, conduction-dominated contributions in the deep basement and sedimentary rock formations throughout the country. By evaluating an extensive database of bottom-hole temperature and regional geologic data (rock types, stress level, surface temperature etc), we have estimated that the total EGS resource has to be more than 13 million exajoules (EJ) [13 million EJ x 277.8 TWh per EJ = 3611.4 million TWh]. Using reasonable assumptions regarding how heat would be used from stimulated EGS reservoirs, we also estimated the extractable portion to exceed 0.2 million EJ [0.2 million EJ x 277.8 TWh per EJ = 55.56 million TWh] ... With technological improvements, the economically extractable amount of useful energy could increase by a factor of 10 or more, thus making EGS sustainable for centuries” (see Chapter 1, pp. 1-4).
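The exajoule-to-terawatt-hour conversions embedded in the quotation are easy to verify. A minimal sketch of that arithmetic:

    EJ_TO_TWH = 277.8                    # 1 exajoule = 277.8 terawatt hours
    total_egs_twh = 13e6 * EJ_TO_TWH     # total EGS resource, in TWh
    extractable_twh = 0.2e6 * EJ_TO_TWH  # extractable portion, in TWh
    print(total_egs_twh / 1e6)           # ~3611.4 (million TWh)
    print(extractable_twh / 1e6)         # ~55.56 (million TWh)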

Nuclear power can be dismissed as a serious future option in this analysis because the overall nuclear power cycle (from mining to waste disposal) currently has a major CO2-polluting component (equivalent to that of a modern gas-fired power plant); the cost of nuclear power via the UK's newest Sizewell B plant is 15 Australian cents per kilowatt hour [10.5 US cents per kilowatt hour or US$105 per MWh]; required high-grade uranium ore is a very limited resource; and long-term safe storage of waste and security issues are unresolved. Biofuel from land-based crops (notably canola, palm oil, sugar and corn) is highly CO2 polluting through mechanisms such as deforestation and loss of soil carbon. Indeed the biofuel perversion that is legislatively mandated in the US, the UK and the EU is making a huge contribution to global food price rises that in turn are threatening the lives of “billions” of people according to UK Chief Scientific Advisor Professor John Beddington FRS (see “Global Food Crisis. US Biofuel & CO2 threaten billions”:
http://mwcnews.net/content/view/21277/42/ ).

In Summary, while the World has arguably already reached “peak oil”, and uranium, gas and coal resources are limited, the solar energy hitting the Earth is roughly 10,000 times greater than the energy that Man consumes globally. Geothermal resources are immense. Already developed and implemented geothermal power technologies and low-cost renewable energy technologies directly dependent on solar energy (concentrated solar thermal power) or indirectly dependent on solar energy (hydro, wind and wave power) have reached a CROSS-OVER POINT at which the cost of power in cents per kilowatt hour is COMMENSURATE with the current “market cost” of fossil fuel burning-based power. Further, the “true cost” of coal burning-based power (i.e. taking the environmental and human impact into account) is 4-5 times the “market cost”. Exciting new research developments at Monash University, Australia, and at the Massachusetts Institute of Technology, USA, presage the possibility of an efficient, solar energy-based, hydrogen fuel cell-run transportation system within a decade.

Former US Vice President and Nobel Laureate Al Gore has recently (mid-2008) called for 100% renewable electric power within ten years: “Today I challenge our nation to commit to producing 100 percent of our electricity from renewable energy and truly clean carbon-free sources within 10 years” (see: here). Carbon-free power is now technically and economically feasible at a “market cost” commensurate with the “market cost” of fossil fuel burning-based power.

The science, technology and economics thus indicate that the urgent need (enunciated by NASA’s Dr James Hansen and his colleagues) to reduce atmospheric CO2 concentration from the current 387 ppm to no more than 350 ppm can be realized NOW with low-cost renewable energy and geothermal energy implementation coupled with cessation of fossil fuel burning and de-forestation, minimization of agricultural methanogenesis, massive re-afforestation and return of carbon as biochar to the world’s soils.

Just as there is an Optimistic Interpretation of Tchaikovski’s “Pathétique” Symphony, so we see that with honesty and goodwill the World can successfully address the acutely serious Climate Emergency in a vigorous and timely fashion.

Dr Gideon Polya, MWC News Chief political editor, has published some 130 works in a 4-decade scientific career, most recently a huge pharmacological reference text "Biochemical Targets of Plant Bioactive Compounds" (CRC Press/Taylor & Francis, New York & London, 2003), and is currently writing a book on global mortality.

October Sitrep

This month has been a period of intense activity on both the political and economic fronts and has made it impossible to stand by and be quiet. Let us hope that with the election this coming week and the general stabilization of the banking system we have a lot less to talk about. It has been quite enough.

Whoever wins the White House is going to face revenue shortfalls and a winding down of the wars in the Middle East. Pakistan is now confronting its own demons, and the transfer of combat resources from Iraq will shred the Taliban. They will get to do a lot of dying, and a collapse of will among their supporters is likely. The heat will be simply too much.

The internal task will be working to restart the stalled economy. Once it is clearly turning over properly and all the indicators are going in the proper direction, it will then be possible to entertain new programs. By then we will actually know the real costs of this financial collapse and what resources are available.

We are seeing massive volatility in the securities markets at present. It is not unprecedented; it is caused by the market shaking off a lot of even normal credit. Hedge funds are being massively deleveraged, thus widening spreads. Supposedly these funds had matched maturities but lousy liquidity. Most likely they borrowed short and lent long, and the music just stopped.

At the same time, the trillions lent out in foreign loans are also without offsetting fund support, and we are teetering on a global collapse of sovereign credit not unlike what happened to South America thirty years ago. That earlier collapse was financed by the eurodollar market. Again, a fast contraction is possible, with devastating effects on the global economy. The subprime disaster has given us a whiff of gunpowder from which we are emerging bruised but perhaps more or less intact. The same will not be true if we cannot figure out how to shore up the global situation. What is worse, we actually have little control or influence over it and have to hope that the Europeans have done enough. They certainly moved fast enough.

Credit contraction is underway around the globe and unfortunately must continue for some time, at least until all standing lenders are convinced that they are still in business and sufficient transparency exists for one lender to lend to another.

So while everyone is dreaming of a fast rebound, the reality is that there is simply not enough information out there yet to support such confidence. I suspect that we have a grinding two years ahead in which every country works to stabilize its internal situation.

Walden Pond and Malaria Threat

This is one of those stories that somewhat annoys me, just because global warming is used as a deus ex machina. A little more effort is surely warranted. Every researcher is pitching his favorite scheme as a crusade on global warming, and we are getting these breathless pronouncements.

In this case a quick trip to Google Maps quickly shows that this very modest kettle lake has been encroached on from all sides, at a modest distance, by roads and freehold woodland.

Backyard biomes compete with local flora, and the local flora that require rare conditions are obviously under pressure. A young boy walking through the woods can stomp a rare species out of existence when there is no adjacent reservoir of the plants in question. A glance at the map shows a sharply reduced ecology surrounding the pond, and thus we understand the change in biodiversity. This has nothing to do with global warming.

The next story is about the threat of an expansion of biodiversity. Yes, the SE USA is a prime prospect for malaria reinfestation. If it happens, it will be because we did not spray and did not move to eradicate the outbreak.

DDT is not a toxin of choice for dealing with a chronic situation, but a DDT saturation of an affected area will eliminate the problem quickly, at which point the environment can recover without that mosquito.

And surely we must by now have an oil-based insecticide that can do the job DDT did. DDT could be sprayed on water, and the mosquito larvae would rise to the surface to breathe and be killed.

Global warming changing Walden Pond

Published: Oct. 29, 2008 at 2:13 AM

CAMBRIDGE, Mass., Oct. 29 (UPI) -- Two-thirds of the plants writer Henry David Thoreau chronicled at Walden Pond in Massachusetts have disappeared due to global warming, a U.S. study contends.

The Harvard University report said some of the hardest-hit plants include lilies, orchids, violets, roses and dogwoods. Plants that have thrived in the warmer temperatures include mustards, knotweeds and various non-native species.

"Some plants around Walden Pond have been quite resilient in the face of climate change, while others have fared far worse. Closely related species that are not able to adjust their flowering times in the face of rising temperatures are decreasing in abundance," Charles C. Davis, assistant professor of organismic and evolutionary biology in Harvard's Faculty of Arts and Sciences, said Monday in a news release.

The report said about 27 percent of all species Thoreau recorded in the 1850s around Walden Pond in Concord, Mass., are now locally extinct and another 36 percent are so sparse extinction may be imminent.

"The species harmed by climate change are among the most charismatic found in the New England landscape," Davis said.

King: Malaria may return with global warming

Malaria is the most fatal of infectious diseases. Its daily death toll tops the number of casualties on Sept. 11th, 2001 and dwarfs the combined mortalities from AIDS and Tuberculosis.

Americans are increasingly aware of the magnitude of this scourge. More and more elementary schools, church groups and bar mitzvahs are donating their collections to purchase malaria nets. The United Nations created its Global Fund to Fight AIDS, Tuberculosis, and Malaria in 2001. The Gates Foundation invested $168.7 million in the Malaria Vaccine Initiative.

But Americans are not yet fully cognizant of malaria’s relationship to climate change: A warming climate might facilitate the return of malaria to the world’s temperate regions.

History might help our nation once again appreciate the relationship between health and the environment. Malaria fell off the radar of many industrialized, wealthier nations as soon as they successfully eliminated the disease within their own borders. This neglect allowed malaria to not only resurge within endemic countries but also threaten nations long considered malaria-free.

For instance, malaria was successfully eradicated from the Southeastern United States in 1951. But since malaria remains endemic in most of the Southern Hemisphere, including places as close to the U.S. as southern Mexico, the Centers for Disease Control and Prevention acknowledges that reintroduction of malaria into the United States remains a constant risk — one of the principal reasons that the CDC is located in Atlanta, Ga., not in Washington, D.C.

Now, warming global temperatures — which allow malaria-infected mosquitoes to survive in new areas and for longer periods — could endanger hundreds of millions in the coming decade.

American public opinion surveys reflect our inadequate comprehension of the effect of climate on health. The nation’s public health system has allowed for an unacceptable disconnect between climate change and human health. Information and recognition are the first steps to cure, and those first steps have already been taken by many countries. For instance, the Intergovernmental Panel on Climate Change, which co-won the 2007 Nobel Peace Prize, warned the public in its acceptance speech of an altered distribution of infectious diseases due to rising temperatures.

In contrast to the international community’s efforts on this front, the United States has performed relatively few health-climate studies. The paucity of federally-funded research on the links between climate change and human health reflects the ignorance of the American public.

Americans must now strive to avoid becoming victims of our own success. Sound scientific study will give individuals and businesses the necessary behavior modification tools to stem the resurgence of a malaria pandemic. When it comes to developing a better understanding of the relationship between malaria and climate change, ignorance is no longer an option.

Leslie P. King, M.D., M.P.H., is the founding director of Flying Physicians International. She currently is completing a one-year mid-career masters degree at the Yale School of Forestry & Environmental Studies, focusing on communications of the impacts of climate change on human health.

Rust Belt Solar Revival

The demise of the rust belt was always an exaggeration, just as was the fear that German industry could not compete. All industrial production is about intelligent design: taking a few cents of raw material, pounding it a few times, and selling it for dollars.

The industrial heartland that also includes Ontario is one of the globe’s great industrial engines.

It has supported high wages and high taxation. This put it at a disadvantage when competing with low-tax, low-wage economies. It still managed to become richer and more efficient, as did German industry.

The next generation is seeing the wages of China and India rising and their economies beginning to properly internalize, as happened in Europe. There is still plenty of cheap labor around, but none so concentrated and organized as that of China and India. In fact, China and India are now beginning to compete for those workers.

What this article makes clear is that solar energy and wind energy represent a massive manufacturing effort. A lot of this had not really registered. The modern windmill showed that it is possible to build hardware robust enough to stand up against all that Mother Nature can throw at it. We are now going to build thousands of them and install them everywhere that makes any sort of sense. After all, free fuel works for everyone.

The solar industry is an even bigger manufacturing opportunity. The Nanosolar thin film is actually a minor part of the manufacturing bill, and its cost is now dropping sharply, exponentially increasing manufacturing demand.

We will be installing hundreds of square miles of solar panels over the next twenty years. The hardware for all this must be manufactured and will surely generate a manufacturing boom for the rust belt.

I personally think that while the rest of the globe still has energy options to postpone conversion to alternative energy, North America and Europe do not. The Europeans recognized this twenty years ago and made it public policy to stay ahead of the curve.

In North America, we ignored the inevitable day of reckoning until it arrived and sucked the cash liquidity out of the economy. Now we need to catch up in a hurry. That means a massive build-out of windmills and conversion of our trucking industry to LNG fuel over the next five years. This will slash our dependence on oil and commence the reduction of our reliance on coal-fired and natural gas-fired power plants.

This all translates into a major manufacturing boom in the rust belt, even before we factor in the rapid build up of solar energy.

Our thermal plants will end up being standby power sources that pick up the slack on bad-weather days.

Logistics Resurrects the Rust Belt

by Lara L. Sowinski

October 28, 2008

As in recent presidential elections, much has been said by politicians about the tens of thousands of manufacturing job losses that have occurred in the region of the U.S. known as the Rust Belt—those states that have a long history in industrial manufacturing, particularly steel and auto production, as a driver for their economies. But politicians usually don’t tell the whole story, and such is the case with how the jobs were lost and, more importantly, what’s been occurring in recent years to turn the situation around.

The “secret,” according to Bill LaFayette, Ph.D., and vice president, economic analysis, for the Columbus (Ohio) Chamber (www.columbus.org), is in large part due to logistics, and he and many others in the business, government, and academic sectors are pulling together to get that secret out in the open.

For starters, “There’s an initiative at the state level, the Ohio Skills Bank, to align public colleges and universities as well as secondary education providers with economic development priorities. The goal is to make sure the state’s workforce has the skills that employers really need now and in the future. And, transportation and logistics is one of the main focus sectors of this initiative, especially in the Columbus region,” says LaFayette. “Part of the plan includes making sure university classes and credits are on par around the state, so the workforce can be more mobile,” he adds.

State lawmakers also recognize the role of logistics in the larger economic development picture. Ohio is in the midst of a major tax reform, which along with numerous other advantages, is helping the state become more competitive against others that it often goes head-to-head with, including Illinois, Indiana, Wisconsin, Michigan, and Minnesota, explains Matt McCollister, vice president, economic development, Columbus Chamber.

In a recent article in the Wall Street Journal, Ohio Governor Ted Strickland asserted that the tax reform would yield significant results for the state.

“By 2010, Ohio will be one of only two states without a general tax on corporation profits or a property tax on business machinery, equipment, and inventories. This year is the last for Ohio’s business property tax; next year is the last for the corporation profits tax. And, Ohio’s personal income tax rates are falling by 21 percent across the board.”

“Between 2005 and 2007, Ohio’s per capita state tax burden has already fallen to 38th in the nation, from 27th, according to the Federation of Tax Administrators. When the new tax cuts are phased in, Ohio’s business taxes will be the lowest in the Midwest.”

The Governor also pointed out that exports are up sharply. Last year, the state’s exports totaled more than $42 billion — an 11.1 percent increase over 2006 — making it the only state in which exports have grown each year since 1998.

Moving from concept to creation

It’s great to have a vision, but it’s even better to put it into action, and the major logistics players in the Columbus region and throughout the state are beginning to see the results of this combined effort from the various interests in a number of ways.

Undoubtedly, one of the most important projects has been the Norfolk Southern railroad’s Heartland Corridor, a three-year railway improvement project scheduled for completion in 2010 that will significantly increase the speed of containerized freight moving in double-stack trains between the East Coast and Midwest. Currently, double-stack trains are routed through Harrisburg, Pennsylvania or Knoxville, Tennessee. However, once it’s completed, the Heartland Corridor will move double-stack trains from Norfolk, Virginia’s seaports to Chicago, via West Virginia and Ohio.

The centerpiece of the Heartland Corridor is the Rickenbacker Intermodal Terminal located just outside of Columbus, which opened in March.

“The construction of the Rickenbacker terminal punctuates Norfolk Southern’s commitment to serve the growing intermodal demands of central Ohio and Midwest shippers,” said Wick Moorman, Norfolk Southern’s chief executive officer, earlier this year. “Rickenbacker, one of five Norfolk Southern intermodal terminals in Ohio, will anchor our Heartland Corridor when that project is completed.”

Elaine Roberts, A.A.E., president and CEO of the Columbus Regional Airport Authority, added that, “We have already witnessed the start of the intermodal terminal’s economic impact with new industrial development in the Rickenbacker area. We expect 20,000 new jobs over the next 30 years as a direct result of the new intermodal facility.”

The initial footprint of the Rickenbacker Intermodal Terminal will comprise approximately 175 acres with a handling capacity of more than 250,000 containers and trailers annually. However, it was designed to accommodate expansion as traffic volumes grow.

At the same time, officials at the Columbus Regional Airport Authority are optimistic that the Rickenbacker International Airport (www.rickenbacker.org), a former military airport that boasts some of the longest runways in the country, will figure more prominently for air cargo shippers. One big draw is the relatively short taxi times and very low landing fees, along with easy entry to the cargo apron with direct plane-to-truck access so cargo can be off-loaded and ready for transport within an hour of arrival.

The airport is currently under-utilized, say officials. Although the airport can handle up to 1 million metric tons of cargo, only about 100,000 metric tons are coming in now. In addition, there’s no regularly scheduled air cargo service at the moment, just charters. Nonetheless, airport officials say they’ll continue to aggressively pursue more business, which may come about sooner than expected should DHL’s hub in Wilmington, Ohio close down.

Another rail project that will bring more capacity to central Ohio is CSX’s National Gateway. Similar to the Heartland Corridor, the rail project will also link Mid-Atlantic ports to the Midwest with double-stack routes, which will transit through Maryland, Virginia, North Carolina, Pennsylvania, Ohio, and West Virginia.

The railroad plans to expand an existing intermodal terminal near Columbus and build a new terminal at Marion, Ohio. The total cost of the public-private partnership is estimated at $700 million.

In the meantime, the formation of the Columbus Region Logistics Council is a further example of how members of the business community, government, and academia are taking action to develop logistics throughout the region.

Battelle, the huge consulting, research and development organization, was tapped to put together a long-range strategic plan, a ‘logistics roadmap,’ that was delivered in 2007, explains Ben Ritchey, vice president, transportation market sector, Battelle. “One of the results was the Columbus Region Logistics Council,” he says. It’s a volunteer organization comprised of shippers, freight forwarders, developers, and transportation companies. Some of the members include ODW Logistics, Honda of America, Exel, Limited Brands, Ohio State University, CSX, and Norfolk Southern.

Part of the logistics roadmap calls for: fostering a logistics-friendly business environment; continuing to develop and enhance advanced logistics infrastructure; infusing world-class logistics technology into regional industry; and building a high-skill workforce for competitive advantage.

“The benefit of this four-pronged strategy lies in focusing appropriate investments and activities that will most readily achieve job and business growth, build infrastructure, develop a talented workforce, and enable technology adoption that sets our regional logistics industry apart from competing markets,” notes Ritchey.

However, lack of capital is still a concern, he acknowledges. Yet, the promise of public-private partnerships, especially for “last mile” projects, is a reason to stay enthusiastic, says Ritchey.

Transitioning to emerging industries

While the logistics industry is a key part of the broader economic development activity in Ohio, several other emerging industries are taking root there and in surrounding Rust Belt states, namely solar.

The U.S. is poised to become the manufacturing mecca for the $18 billion solar industry and nearly all of the current solar manufacturing capacity is in the Midwest. In fact, with the exception of Nanosolar’s thin film facility in San Jose, California and Ausra’s plant in Las Vegas, all solar panels manufactured in the U.S. are made in the Rust Belt.

First Solar, a $22 billion solar panel manufacturer based in Perrysburg, Ohio, announced in August that it plans to expand its manufacturing operations and development facilities near Toledo.

The investment will add approximately 500,000 square feet of manufacturing, research and development, and office space, and will add at least 134 new jobs to the company’s current workforce of 700 at its Perrysburg facility. First Solar is collaborating with state and local leaders on a comprehensive incentive package for these two projects. These incentives are central to First Solar’s expansion plans in Ohio and are subject to approval by state and local authorities.

The expansion is expected to be completed in the second quarter of 2010 and will increase the annual capacity at the Perrysburg facility to approximately 192 megawatts. In addition, First Solar will construct a separate facility to support increased development activities associated with its advanced thin film solar module manufacturing technology.

“Scaling our manufacturing capacity while taking advantage of existing infrastructure will incrementally lower the manufacturing cost per watt at a rate comparable to our lowest cost facility in Malaysia,” said Bruce Sohn, president of First Solar. “The expansion of our operations in Ohio is a direct result of the outstanding achievements of our associates and a strong, ongoing partnership with state and local leaders.”

“The state of Ohio is proud to support industry leaders like First Solar who are using renewable energy to power the future,” said Ohio Governor Ted Strickland. “In making this significant investment and expansion in Toledo, First Solar is helping us to send a message to the world that Ohio is reinventing itself as the leader in the advanced energy industry.”

State regulations require that at least 25 percent of the electricity sold in Ohio be generated from new and advanced technologies by 2025.

According to Gov. Strickland, “Already, Ohio has more alternative energy-related projects under way than any other state. The state’s extensive manufacturing supply chain provides thousands of products to the alternative energy industry. And, Ohio is home to the largest fuel cell supply chain in the country. Our welders, machinists, electricians, and iron and steel workers are retooling and transferring their skills to retrofitting buildings, building mass transit, installing wind and solar power, and manufacturing energy-efficient cars and trucks.”

“Ohio now leads the Midwest in the growth of venture capital investments in the biosciences; we rank first nationally in per capita clinical trials and operate the largest center for stem cell and regenerative medicine between the coasts. In the U.S. News & World Report rankings, Ohio leads the nation with four of the country’s top 15 children’s hospitals. The Cleveland Clinic, meanwhile, has spun off two dozen start-up companies in the past decade, and averages 200 inventions each year.”

Roy Spencer on the Pacific Decadal Oscillation

This recently released paper is very compelling in that it is able to assign the observed temperature data to the Pacific Decadal Oscillation (PDO). It should always have been a priority to establish the importance of this variable, simply because the Pacific represents a uniform surface covering half the globe. Anything happening in the atmosphere there must naturally be the most significant atmospheric effect, and we have already had El Niño and La Niña to integrate into our models.

Suddenly, this work reveals that the Northern Hemispheric warming that we have experienced over the past two decades is linked directly to shifts in the Pacific that make convincing sense and could even be anticipated. This is the natural driver of normal climate variation which we certainly have not broken out of at all.

This paper is critical of the IPCC modeling and argues that it is rubbish through a well-developed critique, including a challenge to show him wrong.

We are going from an unconvincing argument for CO2 forcing to a sound demonstration that its effect is minor. It is time for the enthusiasts to beat a retreat before Mother Nature shows up at the party and ruins it all. Of course we will then hear extensively from the sunspot crowd.

The evidence supports a decadal shifting of heat back and forth across the equator in one way or another, and that makes a great deal of sense because it gives the Earth a mechanism with which to keep the heat content of the globe in balance. Otherwise we would need to rely on a centuries-long cycle that readjusts the ocean currents, which is simply too slow.

I now feel that we are getting a real handle on the climatic levers and switches. Now if I could figure out which North Pacific volcano caused the Little Ice Age we would be almost home free.



Global Warming as a Natural Response to Cloud Changes Associated with the Pacific Decadal Oscillation (PDO)

by Roy W. Spencer, PhD

Principal Research Scientist The University of Alabama in Huntsville (posted October 19, 2008; updated 10/20/08, 6:35 a.m.)

(what follows is a simplified version of a paper I am preparing for submission to Geophysical Research Letters)

Abstract

A simple climate model forced by satellite-observed changes in the Earth's radiative budget associated with the Pacific Decadal Oscillation is shown to mimic the major features of global average temperature change during the 20th Century - including two-thirds of the warming trend. A mostly-natural source of global warming is also consistent with mounting observational evidence that the climate system is much less sensitive to carbon dioxide emissions than the IPCC's climate models simulate.

1. Introduction

For those who have followed my writings and publications in the last 18 months (e.g. Spencer et al., 2007), you know that we are finding satellite evidence that the climate system could be much less sensitive to greenhouse gas emissions than the U.N.'s Intergovernmental Panel on Climate Change (IPCC, 2007) climate models suggest.

To show that we are not the only researchers who have documented evidence contradicting the IPCC models, I made the following figure to contrast the IPCC-projected warming from a doubling of atmospheric carbon dioxide with the warming that would result if the climate sensitivity is as low as implied by various kinds of observational evidence.

Fig. 1. Projected warming (assumed here to occur by 2100) from a doubling of atmospheric CO2 from the IPCC climate models versus from various observational indicators.

http://www.weatherquestions.com/PDO-and-20th-Century-warming-Fig01.jpg

The dashed line in Fig. 1 comes from our recent apples-to-apples comparison between satellite-based feedback estimates and IPCC model-diagnosed feedbacks, all computed from 5-year periods (Spencer and Braswell, 2008a). In that comparison, there were NO five year periods from ANY of the IPCC model simulations which produced a feedback parameter with as low a climate sensitivity as that found in the satellite data.

The discrepancy between the models and observations seen in Fig. 1 is stark. If the sensitivity of the climate system is as low as some of these observational results suggest, then the IPCC models are grossly in error, and we have little to fear from manmade global warming. [I am told that the 1.1 deg. C sensitivity of Schwartz (2007) has more recently been revised upward to 1.9 deg. C.]

But an insensitive climate system would ALSO mean that the warming we have seen in the last 100 years can not be explained by increasing CO2 alone. This is because the radiative forcing from the extra CO2 would simply be too weak to cause the ~0.7 deg. C warming between 1900 and 2000... there must be some natural warming process going on as well.

Here I present new evidence that most of the warming could actually be the result of a natural cycle in cloud cover forced by a well-known mode of natural climate variability: the Pacific Decadal Oscillation.

2. A Simple Model of Natural Global Warming

As Joe D'Aleo and others have pointed out for years, the Pacific Decadal Oscillation (PDO) has experienced phase shifts that coincided with the major periods of warming and cooling in the 20th Century. As can be seen in the following figure, the pre-1940 warming coincided with the positive phase of the PDO; then, a slight cooling until the late 1970s coincided with a negative phase of the PDO; and finally, the warming since the 1970s has once again coincided with the positive phase of the PDO.


Fig. 2. Five-year running averages in (a) global-average surface temperature, and (b) the Pacific Decadal Oscillation (PDO) index during 1900-2000.

http://www.weatherquestions.com/PDO-and-20th-Century-warming-Fig02.jpg

Others have noted that the warming in the 1920s and 1930s led to media reports of decreasing sea ice cover, Arctic and Greenland temperatures just as warm as today, and the opening up of the Northwest Passage in 1939 and 1940.

Since this timing between the phase of the PDO and periods of warming and associated climate change seems like more than mere coincidence, I asked the rather obvious question: What if this known mode of natural climate variability (the PDO) caused a small fluctuation in global-average cloud cover?

Such a cloud change would cause the climate system to go through natural fluctuations in average temperature for extended periods of time. The IPCC simply assumes that this kind of natural cloud variability does not exist, and that the Earth stays in a perpetual state of radiative balance that has only been recently disrupted by mankind's greenhouse gas emissions. This is an assumption that many of us meteorologists find simplistic and dubious, at best.

I used a very simple energy balance climate model, previously suggested to us by Isaac Held and Piers Forster, to investigate this possibility. In this model I ran many thousands of combinations of assumed: (1) ocean depth (through which heat is mixed on multi-decadal to centennial time scales), (2) climate sensitivity, and (3) cloud cover variations directly proportional to the PDO index values.

In effect, I asked the model to show me what combinations of those model parameters yielded a temperature history approximately like that seen during 1900-2000. And here's an average of all of the simulations that came close to the observed temperature record:

Fig. 3. A simple energy balance model driven by cloud changes associated with the PDO can explain most of the major features of global-average temperature fluctuations during the 20th Century. The best model fits had assumed ocean mixing depths around 800 meters, and feedback parameters of around 3 Watts per square meter per degree C.

http://www.weatherquestions.com/PDO-and-20th-Century-warming-Fig03.jpg

The "PDO-only" (dashed) curve indeed mimics the main features of the behavior of global mean temperatures during the 20th Century -- including two-thirds of the warming trend. If I include transient CO2 forcing with the PDO-forced cloud changes (solid line labeled PDO+CO2), then the fit to observed temperatures is even closer. It is important to point out that, in this exercise, the PDO itself is not an index of temperature; it is an index of radiative forcing which drives the time rate of change of temperature. Now, the average PDO forcing that was required by the model for the two curves in Fig. 3 ranged from 1.7 to 2.0 Watts per square meter per PDO index value. In other words, for each unit of the PDO index, 1.7 to 2.0 Watts per square meter of extra heating was required during the positive phase of the PDO, and that much cooling during the negative phase of the PDO.

But what evidence do we have that any such cloud-induced changes in the Earth's radiative budget are actually associated with the PDO? I address that question in the next section.

3. Satellite Evidence for Radiative Budget Changes Forced by the Pacific Decadal Oscillation

To see whether there is any observational evidence that the PDO has associated changes in global-average cloudiness, I used NASA Terra satellite measurements of reflected solar (shortwave, SW) and emitted infrared (longwave, LW) radiative fluxes over the global oceans from the CERES instrument during 2000-2005, and compared them to recent variations in the PDO index. The results can be seen in the following figure:

Fig. 4. Three-month running averages of (a) the PDO index during 2000-2005, and (b) corresponding CERES-measured anomalies in the global ocean average radiative budget, with and without the feedback component removed (see Fig. 5). The smooth curves are 2nd order polynomial fits to the data.

http://www.weatherquestions.com/PDO-and-20th-Century-warming-Fig04.jpg

But before a comparison to the PDO can be made, one must recognize that the total radiative flux measured by CERES is a combination of forcing AND feedback (e.g. Gregory et al., 2002; Forster and Gregory, 2006). So, we first must estimate and remove the feedback component to better isolate any radiative forcing potentially associated with the PDO.

As Spencer and Braswell (2008b) have shown with a simple model, the radiative feedback signature in globally-averaged radiative flux-versus-temperature data is always highly correlated, while the time-varying radiative forcing signature of internal climate fluctuations is uncorrelated because the forcing and temperature response are always 90 degrees out of phase. This allows some measure of identification and separation of the two signals.
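The 90-degrees-out-of-phase point can be illustrated with a toy integration; this is only a sketch of the argument, with entirely synthetic numbers. Temperature is the time integral of forcing, so a sinusoidal internal forcing produces a response lagging it by a quarter cycle that is nearly uncorrelated with it, while a feedback term proportional to temperature is perfectly correlated with temperature by construction.

    import numpy as np

    t = np.linspace(0.0, 4.0 * np.pi, 1000)   # two full cycles of internal forcing
    F = np.sin(t)                             # time-varying radiative forcing
    T = np.cumsum(F) * (t[1] - t[0])          # response ~ 1 - cos(t): lags F by 90 deg
    print(np.corrcoef(F, T)[0, 1])            # near zero: forcing decorrelated from T
    print(np.corrcoef(T, 3.0 * T)[0, 1])      # exactly 1: feedback tracks temperature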

The following figure shows what I call "feedback stripes" associated with intraseasonal fluctuations in the climate system. The corresponding feedback estimate (line slope) of 8.3 Watts per square meter per degree C was then used, together with three-month anomalies in tropospheric temperature from AMSU channel 5, to remove the estimated feedback signal from the radiative flux data to get the "forcing-only" curve in Fig. 4b.

Fig. 5. Three-month running averages of global oceanic CERES radiative flux changes versus tropospheric temperature changes (from AMSU channel 5, see Christy et al., 2003) for the time period in Fig. 4. The average feedback estimate (see sloped lines) was then used together with the AMSU5 data to estimate and remove the feedback component from the CERES radiative fluxes, leaving the radiative forcing shown in Fig. 4b.

[NOTE: This feedback estimate is not claimed to necessarily represent long-term climate sensitivity (which in this case would be very low, 0.44 deg. C); it is instead the feedback occurring on intraseasonal and interannual time scales which is merely being removed to isolate the forcing signal. It should be remembered that, according to our current paradigm of climate variability, there can only be two sources of radiative variability: forcing and feedback. As shown by Spencer and Braswell (2008a, 2008b), the usual practice of fitting a regression line to all of the data in Fig. 5 would result in a regression slope biased toward zero (high sensitivity) because the presence of any time-varying internal radiative forcing -- which is always occurring -- would be mistakenly included as part of the feedback signal.]

When the feedback is removed, we see a good match in Fig. 4 between the low-frequency behavior of the PDO and the radiative forcing (which is presumably due to cloud fluctuations associated with the PDO). Second-order polynomials were fit to the time series in Fig. 4 and compared to each other to arrive at the PDO-scaling factor of 1.9 Watts per square meter per PDO index value.
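As a sketch of the feedback-removal step just described (the arrays here are synthetic placeholders; the real inputs would be CERES flux anomalies and AMSU channel 5 temperature anomalies, and all names are illustrative assumptions):

    import numpy as np

    n = 72                                      # about six years of monthly anomalies
    rng = np.random.default_rng(0)
    temp = rng.normal(0.0, 0.2, n)              # tropospheric temperature anomaly (deg C)
    internal_forcing = rng.normal(0.0, 0.5, n)  # cloud-induced radiative forcing (W/m^2)
    lam_est = 8.3                               # feedback estimate from the "stripes" slope
    flux = lam_est * temp + internal_forcing    # measured flux = feedback + forcing

    forcing_only = flux - lam_est * temp        # subtract the feedback component
    # forcing_only now recovers internal_forcing; it can then be smoothed (e.g. with
    # 2nd-order polynomial fits) and compared against the PDO index, as in Fig. 4.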

It is significant that the observed scale factor (1.9) that converts the PDO index into units of heating or cooling is just what the model required (1.7 to 2.0) to best explain the temperature behavior during the 20th Century. Thus, these recent satellite measurements - even though they span less than 6 years -- support the Pacific Decadal Oscillation as a potential major player in global warming and climate change.

4. Discussion & Conclusions

The evidence continues to mount that the IPCC models are too sensitive, and therefore produce too much global warming. If climate sensitivity is indeed considerably less than the IPCC claims it to be, then increasing CO2 alone can not explain recent global warming. The evidence presented here suggests that most of that warming might well have been caused by cloud changes associated with a natural mode of climate variability: the Pacific Decadal Oscillation.

The IPCC has simply assumed that mechanisms of climate change like that addressed here do not exist. But that assumption is quite arbitrary and, as shown here, very likely wrong. My use of only PDO-forced variations in the Earth's radiative energy budget to explain two-thirds of the global warming trend is no less biased than the IPCC's use of carbon dioxide to explain global warming without accounting for natural climate variability. If any IPCC scientists would like to dispute that claim, please e-mail me at roy.spencer (at) nsstc.uah.edu.

If the PDO has recently entered into a new, negative phase, then we can expect that global average temperatures, which haven't risen for at least seven years now, could actually start to fall in the coming years. The recovery of Arctic sea ice now underway might be an early sign that this is indeed happening.

I am posting this information in advance of publication because of its potential importance to pending EPA regulations or congressional legislation which assume that carbon dioxide is a major driver of climate change. Since the mainstream news media now refuse to report on peer-reviewed scientific articles which contradict the views of the IPCC, Al Gore, and James Hansen, I am forced to bypass them entirely.

We need to consider the very real possibility that carbon dioxide - which is necessary for life on Earth and of which there is precious little in the atmosphere - might well be like the innocent bystander who has been unjustly accused of a crime based upon little more than circumstantial evidence.

REFERENCES

Christy, J. R., R. W. Spencer, W. B. Norris, W. D. Braswell, and D. E. Parker (2003), Error estimates of version 5.0 of MSU/AMSU bulk atmospheric temperatures, J. Atmos. Oceanic Technol., 20, 613-629.

Douglass, D. H., and R. S. Knox (2005), Climate forcing by volcanic eruption of Mount Pinatubo, Geophys. Res. Lett., 32, doi:10.1029/2004GL022119.

Forster, P. M., and J. M. Gregory (2006), The climate sensitivity and its components diagnosed from Earth Radiation Budget data, J. Climate, 19, 39-52.

Gregory, J.M., R.J. Stouffer, S.C.B. Raper, P.A. Stott, and N.A. Rayner (2002), An observationally based estimate of the climate sensitivity, J. Climate, 15, 3117-3121.

Intergovernmental Panel on Climate Change (2007), Climate Change 2007: The Physical Science Basis, report, 996 pp., Cambridge University Press, New York City.

Schwartz, S. E. (2007), Heat capacity, time constant, and sensitivity of the Earth's climate system, J. Geophys. Res., 112, doi:10.1029/2007JD008746.

Spencer, R.W., W. D. Braswell, J. R. Christy, and J. Hnilo (2007), Cloud and radiation budget changes associated with tropical intraseasonal oscillations, Geophys. Res. Lett., 34, L15707, doi:10.1029/2007GL029698.

Spencer, R.W., and W.D. Braswell (2008a), Satellite measurements reveal a climate system less sensitive than in models, Geophys. Res. Lett., submitted.

Spencer, R.W., and W.D. Braswell (2008b), Potential biases in cloud feedback diagnosis: A simple model demonstration, J. Climate, November 1.

Obama and Amending the Constitution

One of the interesting side stories of the candidacy of Barack Obama is the persistent story that he was born in Kenya and that his mother immediately returned with the newborn to her Hawaiian home to claim and register a birth in the USA. The problem, of course, is that this story rings true. This has been a consistent practice among mothers who had the option, and we can be sure that she planned to come to the USA before the due date and certainly did so at some point. That the child may not have been born in the USA thus becomes a real possibility that is not eliminated by the judicious stonewalling and obfuscation being practiced by the campaign on this and other Obama records. In fact it provides a clear motive not to release any documentation except under duress, and it clarifies his peculiar behavior over this matter.

This particular feature of the constitution should have been eliminated long ago; it serves no practical purpose whatsoever and merely demonstrates a lingering fear that the American electorate will not get it right. If we end up with a duly elected president who is legitimately disqualified from taking office, then we have a real constitutional crisis. What makes it so choice is that I suspect that everything could be legal up to the point at which he is sworn into office.

Obama was clearly brought up as an American citizen and has lived his adult life as one. It would be an injustice to deny his right to stand for office on what is a technicality. This is a good example of the law, and in this case the constitution, being an ass.

On the other hand, this is an excellent time and place in which to push through the appropriate constitutional change on this matter. In fact, we likely have no better time. The Democrats will have a majority in both houses and the presidency itself and are in a position to push through such a constitutional amendment. I expect that the Republicans will find ample reason to support it also. And that will end the static over his mother’s choices.

Since it appears that we are about to be graced with an Obama presidency, which in view of his tactical reticence is shaping up so far to be a mystery, a few thoughts are perhaps in order.

The man is a lawyer by training and a polished speaker who should become a gifted legislator. Because of his age, he may find that an attractive option. Bill Clinton should consider the same option. I have grown up in a world where the top post is held by a parliamentarian and find it quite appropriate.

He was delivered into office as a Chicago machine politician with all the vote rigging and baggage that goes with that. If you wish to believe otherwise, read this article:

Somehow that machine has kept their candidate inside a media cocoon that has made no issue out of his many politically incorrect lapses from the past. This will likely come back to haunt him and certainly gives the machine a lot of undue influence.

He must now try to rise above all that in the same way that Kennedy was able to. Kennedy’s daddy was a sweetheart and it probably helped that he was struck down before his influence might have been felt.

He has been associated with a lot of the classic far left political clichés. This is perfectly harmless so long as he actually does not think that they are real policy. The wacko left and the wacko right both have curious positions that politicians need to at least give a nod to. Implementing those weird positions is an excellent method of mobilizing ninety percent of the population against you.

His foreign policy credentials are on a par with those of George Bush, Bill Clinton and Jimmy Carter on entering the White House. Until he has had time on the job, he will not be deft, and the challenge will initially be to avoid any obvious blunders. Expect more sympathy for Africa in particular and the third world in general. Just do not expect more money.

My real fear is that he does not understand the Laffer curve and will be unable to rein in the very stupid people in his own camp who think that they now have a mandate to jack up taxes. If we can stay the course on taxation policy and restore the regulatory governors put in place seventy years ago and removed a mere ten years ago, we will be fine. If we can then implement a number of other useful programs, such as universal health care run properly by the states as was done in Canada, the country will emerge much stronger from the disaster.

Race issues will be on the agenda whether by choice or otherwise. They are best left alone because the problem is actually resolving itself through simple generational change. What must not be ignored is poverty. Once again, read my item on minimum wage and home ownership.

Poverty and medical care are both resolved the same way: you establish a service floor for one hundred percent of the population. Otherwise, your system will naturally be gamed in such a way that a third of the population is disadvantaged while another third pays far too much for the privilege of service. By establishing a floor you are simply moving the game to a higher, more naturally stable level while lowering the cost.

Should he win, it is likely that he will face a savage drop in governmental revenues and the need to accommodate a deepening and unavoidable recession over the length of his term. It will be interesting to see how his reputation fares and how long the media wolves will then hold off.

Arctic is Melting even in Winter

This item says a lot about what is happening with the sea ice and informs us that there is more to the story.

The sea ice is actually collapsing in the same way that spring breakup hits a river and suddenly cleans it out.

We had forgotten that an accelerating melt will tend to create a layer of warmer fresh water on the surface of the denser salt water. This layer appears to be a lot more influential than we supposed and likely carries over the warming impact of previous seasons in some way.

I do not think that mixing is encouraged with the ice protecting most of the fresh water layer from the winds.

What this does suggest is that the actual sea ice removal that we are experiencing may have to go to completion in order for the fresh water layer to be completely remixed. From there on we would see the winter sea ice being rebuilt easily, because there would be little fresh water left to insulate the forming ice from the normal salt water.

These may be useful thoughts worth modeling to see where they take us.

We certainly are watching the sea ice go, while possibly ignoring supposedly colder weather.

Arctic is melting even in winter

The polar icecap is retreating and thinning at a record rate

Jonathan Leake

The Arctic icecap is now shrinking at record rates in the winter as well as summer, adding to evidence of disastrous melting near the North Pole, according to research by British scientists.

They have found that the widely reported summer shrinkage, which this year resulted in the opening of the Northwest Passage, is continuing in the winter months with the thickness of sea ice decreasing by a record 19% last winter.

Usually the Arctic icecap recedes in summer and then grows back in winter. These findings suggest the period in which the ice renews itself has become much shorter.

Dr Katharine Giles, who led the study and is based at the Centre for Polar Observation and Modelling at University College London (UCL), said the thickness of Arctic sea ice had shown a slow downward trend during the previous five winters but then accelerated.

She said: “After the summer 2007 record melting, the thickness of the winter ice also nose-dived. What is concerning is that sea ice is not just receding but it is also thinning.”

The cause of the thinning is, however, potentially even more alarming. Giles found that the winter air temperatures in 2007 were cold enough that they could not have been the cause.

This suggests some other, longer-term change, such as a rise in water temperature or a change in ocean circulation that has brought warmer water under the ice.

If confirmed, this could mean that the Arctic is likely to melt much faster than had been thought. Some researchers say that the summer icecap could vanish within a decade.

The research, reported in Geophysical Research Letters, showed that last winter the average thickness of sea ice over the whole Arctic was 26cm (10%) less than the average thickness of the previous five winters.

However, sea ice in the western Arctic lost about 49cm of thickness. This region saw the Northwest Passage become ice-free and open to shipping for the first time in 30 years during the summer of 2007.

The UCL researchers used satellites to measure sea-ice thickness from 2002 to 2008. Winter sea ice in the Arctic is about 8ft thick on average.

The team is the first to measure ice thickness throughout the winter, from October to March, over more than half of the Arctic, using the European Space Agency’s Envisat satellite.
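
As background on how a satellite altimeter gets at thickness: it measures freeboard, the height of the floating ice surface above the water line, and thickness then follows from hydrostatic balance. A minimal sketch, with density and snow figures that are typical assumptions of mine rather than values from the Giles study:

    # Freeboard-to-thickness conversion for floating ice via Archimedes:
    # rho_water * draft = rho_ice * thickness + rho_snow * snow_depth.
    RHO_WATER = 1024.0  # kg/m^3, sea water (assumed typical value)
    RHO_ICE   = 915.0   # kg/m^3, sea ice (assumed)
    RHO_SNOW  = 320.0   # kg/m^3, snow load on the ice (assumed)

    def ice_thickness(freeboard_m: float, snow_depth_m: float) -> float:
        """Total ice thickness from measured freeboard and estimated snow depth."""
        return (RHO_WATER * freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - RHO_ICE)

    # A 25 cm freeboard under 20 cm of snow implies roughly 2.9 m of ice, so a
    # few centimetres of freeboard error scale into tens of centimetres of
    # thickness -- which is why winter-long averages matter.
    print(round(ice_thickness(0.25, 0.20), 2))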

Giles’s findings confirm the more detailed work of Peter Wadhams, professor of ocean physics at Cambridge University, who has undertaken six voyages under the icecap in Royal Navy nuclear submarines since 1976 and has gathered data from six more voyages.

The vessels use an upward-looking echo-sounder to measure the thickness of sea ice above the vessel. The data gathered can then be compared with previous years to find changes in thickness.

Wadhams published his first paper in 1990, showing that the Arctic ice had grown 15% thinner between 1976 and 1987.

In March 2007 he went under the Arctic again in HMS Tireless and found that the winter ice had been thinning even more quickly; it was now 50% of the 1976 thickness.

“This enormous ice retreat in the last two summers is the culmination of a thinning process that has been going on for decades, and now the ice is just collapsing,” Wadhams said.

The scale of the ice loss has also been shown by other satellite-based observations that are used to measure the area of the Arctic icecap as it grows and shrinks with the seasons.

In winter it normally reaches about 5.8m square miles before receding to about 2.7m square miles in summer.

In 2007, however, the sun shone for many more days than normal, raising water temperatures to 4.3C above the average. By September the Arctic icecap had lost an extra 1.1m square miles, equivalent to more than 12 times the area of Britain.

That reduced the area of summer ice to 1.6m square miles, 43% smaller than it was in 1979, when satellite observations began.

At the heart of the melting in the Arctic is a simple piece of science. Ice is white, so most of the sunlight hitting it is reflected back into space. When it melts, however, it leaves open ocean, which, being darker, absorbs light and so gets warmer. This helps to melt more ice. It also makes it harder for ice to form again in winter. The process accelerates until there is no more ice to melt.
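
That runaway loop is easy to caricature in a few lines of Python; the albedo and melt numbers below are illustrative placeholders, not measurements:

    # Toy ice-albedo feedback: open water absorbs more sunlight than ice,
    # so each year's melt exposes water that accelerates the next year's melt.
    ALBEDO_ICE, ALBEDO_OCEAN = 0.6, 0.1   # fraction of sunlight reflected (assumed)
    SOLAR = 1.0                            # incoming sunlight, arbitrary units
    MELT_PER_UNIT = 0.3                    # ice fraction lost per unit of extra heat

    ice = 0.9                              # initial ice-covered fraction
    baseline = SOLAR * (1 - ALBEDO_ICE)    # absorption under full ice cover
    for year in range(1, 11):
        albedo = ice * ALBEDO_ICE + (1 - ice) * ALBEDO_OCEAN
        absorbed = SOLAR * (1 - albedo)
        ice = max(0.0, ice - MELT_PER_UNIT * (absorbed - baseline))
        print(year, round(ice, 3))         # the annual loss grows as cover shrinks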

Wadhams said: “This is one of the most serious problems the world has ever faced.”

Home Ownership and Minimum Wage

Lending practices represent hundreds of years of bitter experience. Shifting them for political gain is now providing another bitter lesson and giving us the second great global financial panic.

The solution to the lack of growth in the housing market could never come from changing the lending rules. It has to come from a proper reorganization of the relationship between government and the labor market itself.

Let me spell this out. The labor market is currently structured with a deliberate built-in inefficiency. A minimum wage is established by fiat; it is arbitrary, it makes a lot of useful activity around the edges impossible, and it promotes the existence of a large cadre of idle workers who are in no position to support debt.

Grant that individuals who are limited by health or other issues cannot work and must become wards of the state. Everyone else is available to do either light or heavy work at minimum wage, with a premium provided for heavy work.

This work is provided by the state automatically the moment that an individual is laid off from other work. This is not workfare. It is temporary or even permanent work provided by the state at minimum wage to pursue goals understood to serve the common good. A classic example is tree planting, and historically waste cleanup got a lot of attention. Once the government knew it was in the business of deploying temporary human capital, it could get very good at it.

Most importantly, the minimum wage is tied directly to the monthly cost of supporting the standard mortgage on, say, five hundred square feet of living space per earner, with the appropriate multipliers.

This makes every working person a qualified buyer of a home or condo, and the linkage between this wage and the measured cost of built space will make it hard for the system to become distorted. If housing prices drop then wages drop, attracting more buyers and more demand for workers; the opposite is also true.

By linking this wage directly to the capital cost of space, and thus directly to the real cost of producing that space, we have created a firm floor for both the housing industry and the labor market. That should result in home ownership being maximized and a compensatory increase in tax revenue, since everyone is either working or living off his savings.
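
A back-of-envelope version of that linkage, in Python, with every number a hypothetical placeholder; the point is only that the wage floor moves mechanically with the cost of built space:

    # Hypothetical wage floor indexed to the mortgage on 500 sq ft of space.
    COST_PER_SQFT   = 150.0    # $ per built square foot (placeholder)
    SQFT_PER_EARNER = 500.0
    RATE  = 0.06 / 12          # monthly rate on a 6% mortgage (placeholder)
    TERM  = 30 * 12            # 30-year amortization, in months
    HOURS = 160                # working hours per month

    principal  = COST_PER_SQFT * SQFT_PER_EARNER
    monthly    = principal * RATE / (1 - (1 + RATE) ** -TERM)  # amortization formula
    wage_floor = monthly / HOURS   # $/hour needed to carry the space, before
                                   # the "appropriate multipliers" mentioned above

    print(round(monthly, 2), round(wage_floor, 2))
    # If building costs fall, the wage floor falls with them, pulling in more
    # buyers -- the self-correcting behavior described in the text.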

Quite simply, we make the customers good so that the banks can do what they do best. This may not sound like a scheme that is revenue neutral, but I suspect that it can become so fairly quickly once we learn how to deploy the temporary labor pool to proper economic effect.

It could turn out that many folks enjoy working on farm projects, and that would then permit farm operators to entertain value-added enterprises that require the availability of labor.

We will be surprised at what we want to do once labor is readily available for normally labor-intensive tasks. And you can be quite sure that a lot of high-end employers will grab available talent just to take on projects that have lacked priority in the normal course of business; once a system like this is set up, hiring will not be life or death to either party.

By establishing a real floor for the labor market, mobility will naturally improve, involuntary unemployment will disappear and an efficient job-brokering system will also emerge.

And since everyone qualifies fairly quickly for a mortgage, home ownership should approach ninety percent, which is the objective.

CIGS Solar Cells Efficiency Rises

This is an informative bit of commentary that confirms my surmise that the labs are rushing into production on the basis of producing ten percent efficiency. That is the bare minimum for the beginning of application work.

Nanosolar is not planning limited quantities, and they just announced that they have achieved operational breakeven on their present sales. Normally I am skeptical of early-days pronouncements about sales and earnings, but this company is clearly running hard to become the front runner, and I actually expect that their advisors would insist that they under-promise and over-deliver.

It is also clear that their press coverage is nicely expanding and I have seen no challenges to date. But then their backers are intimidating. Obviously an early IPO is in the works for this company.

The more important point is that 20 percent efficiency is a done deal, which makes it directly competitive with the best silicon, and the theoretical thirty percent target is becoming very real.

Other lab work is also opening up exploitation of the full visible spectrum and also the infrared spectrum, allowing much more energy to be converted.

Nanosolar has clearly broken the cost barrier of $1.00 per watt or less. Now we need to maximize efficiencies to maximize the ease of application.
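
For a rough sense of what breaking $1.00 per watt means at the meter, here is a capital-only estimate; the capacity factor and lifetime are assumptions of mine, and real installed costs add inverters, mounting and financing:

    # Capital-only cost per kWh implied by $1/W cells (assumptions marked).
    COST_PER_WATT   = 1.00   # $ per peak watt, from the article
    CAPACITY_FACTOR = 0.18   # average output as a share of peak (assumed)
    LIFETIME_YEARS  = 25     # assumed panel lifetime

    kwh_per_watt = CAPACITY_FACTOR * 8760 * LIFETIME_YEARS / 1000.0
    print(round(kwh_per_watt, 1), round(COST_PER_WATT / kwh_per_watt, 3))
    # Roughly 39 kWh per peak watt over the panel's life, or about 2.5 cents
    # per kWh for the cells alone -- the sense in which coal parity is near.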

Amazingly, this is still all happening offstage from the mainstream media. Yet in 1970, I explained to my cousin that we would have a machine on a desk inside of a decade and that communication between machines would come shortly after. I then suggested that all human knowledge would become machine based and accessible in the nineties. And this would generate an explosion of knowledge and innovation. I merely failed to imagine that anyone would care besides us academic types.

We are about to be swamped with local cheap energy and the nice problem of finding ways to use it. We can now proceed by actually terraforming the Earth and turning most of it into a well managed garden.

Michael Kanellos

Rumor: New Record in CIGS Efficiency October 23, 2008 at 9:34 AM

The National Renewable Energy Labs (NREL) has upped the bar in copper indium gallium selenide (CIGS) solar cells once again, sources tell me. NREL scientists have developed a CIGS cell with 20.2 percent efficiency, inching past the 19.9 percent cell the lab announced in March. I’ve called NREL but haven’t heard word back on confirmation.

That number helps explain why VCs and investment banks continue to pour money into CIGS. Potentially, CIGS cells have the ability to convert more sunlight into power than other thin-film technologies like cadmium telluride and amorphous silicon. Cadmium telluride solar cells have a theoretical maximum of around 19.6 percent and commercial cad tel cells have an efficiency of around 10 percent. (We said nine earlier.) CIGS can also be printed, say advocates, on cheap, flexible substrates that can be integrated into building products. Cad tel solar cells, to date, need a glass substrate, which limits cad tel to rooftop applications.

CIGS cells are being produced in limited quantities around the 10 percent efficiency mark and theoretically CIGS cells could get into the mid-20 percentile or even low-30 percentile range someday.

Some of the leading CIGS companies include: Nanosolar, Solyndra, SoloPower, HelioVolt and Miasole. There are newcomers too: Telio Solar and NuvoSun.

Still, the stakes are high. Nearly a billion dollars have been invested in CIGS startups in the past couple of years and most of them have yet to start commercial production. Many companies have had to delay their CIGS solar cells due to manufacturing issues. (Chemically, the elements don’t play well together either — it’s the solar equivalent of trying to pull off a Guns N’ Roses reunion.) Even the large companies doing CIGS and CIS cells like Honda haven’t exactly been cranking these things out of the factory.

Terry Corcoran on Greenspan Policy Failure

The bottom line is that the unprecedented credit collapse that we have experienced is completely a failure of governmental policy. The governors put into the system by Franklin Roosevelt with the assistance of some pretty knowledgeable folks were simply taken off toward the end of the Clinton administration, and once again it was all allowed to run its course over the objections of plenty of wiser heads.

When I saw them being removed, I wondered what they were thinking. Now we learn that the motive was to put a group of financially weak voters into houses, almost regardless of the cost. Now the bill has come due, and all these folks are out on their ass. How could this have had any other outcome?

When the history of these days is written, this failure will carry all the blame. An economy that had maintained four percent growth for the preceding twenty years as a result of Reagan's tax reform was diverted into the business of destroying capital.

The tragedy is that once the horse was out of the barn, it was clearly impossible to get it back under control until the credit system itself collapsed under the weight of bad assets. I really do not think that congress or the president or the fed had the tools to actually stop the train wreck. In fact the fed facilitated the wreck by providing cheap money, although that only served to speed up the train by a couple of years.

The nation's major banks have all lost their capital, and are now going through the nation's reserves and possibly the capital of the globe's banks as well. They could not stop it either.

Quantum of Failures

Forget the markets: massive government failure is behind world financial chaos

Terence Corcoran, Financial Post Published: Saturday, October 25, 2008

In Quantum of Solace, the new James Bond movie due next month, our hero takes on the latest threat to mankind and the survival of the free world: a businessman. Of course!

So it has been in film for decades. Now, apparently, it is true in life. Businessman as scourge is the dominant ideology of our time. We all know that most people see market-driven bankers, brokers and corporate bosses as the root of evil, corrupted and greedy--even though we all ravenously consume the products produced by market-driven business.

The concept of corporate evil is embedded in the bones of our culture; it is now taken for granted that markets and capitalism are flawed in practice and must be regulated and controlled, if not destroyed. Even in 1960, in the deep freeze of the Cold War with communism and the Soviet Union, Ian Fleming's Dr No has James Bond taken captive on a Caribbean island. No statist Fidelistas on this island. As Bond enters the underground lair of Doctor No, he stops dead in his tracks: "It was the sort of reception room the largest American corporations have on the President's floor in their New York skyscrapers."

Economics is like the movies. It's filled with papers, textbooks and arcane treatises referencing market failure, the general idea being that unfettered capitalism inevitably generates unwanted outcomes and, in some cases, total disaster. It's a core Marxist theme, a staple of Keynesian economics, that permeates the work and thought of just about every economic school across the ideological spectrum.

From major league economists to top rank demagogues, the message from the current economic crisis is clear. "It is the end of capitalism," said Iran's President Mahmoud Ahmadinejad, as good an authority as any these days. To steal a famous phrase, it seems like "We're all Muslim extremists now."

This week, Alan Greenspan, of all people, joined the market-failure parade. He has his reasons, of course. Better the market should take the hit for creating the mortgage and financial crisis than anything he did to monetary policy as chairman of the U. S. Federal Reserve during the great U. S. housing boom. More about Mr. Greenspan later.

Bashing markets is fun, convenient and politically popular. But it also takes a certain willful disregard of events and policy to reach the conclusion that capitalist failure brought the world economy to its knees. Only conscious effort and stubborn, even malicious, obtuseness could lead anyone to conclude that, on the basis of recent events, the sources of the financial meltdown are corporate, that big business made us do it.

The role of bankers and other market players in causing the current crisis is undeniable, but it is limited compared with the massive role played by governments. From U. S. housing policies to financial regulators to central bank actions all over the world, the evidence is overwhelming that the causes of today's turmoil can be traced to massive government failure.

This failure of policy, moreover, is now being compounded. From the G7 to the G20 and beyond, governments are scrambling to push through measures and policies that cannot possibly have been thought through or properly evaluated. Bad interventions are inevitably being poured on to extinguish the fires created by previous bad interventions.

How did we get to this perilous juncture? One starting point is as good as another. But if the proximate trigger for the financial meltdown was the U. S. subprime housing fiasco, the roots of that crisis offer pure examples of government failure and regulatory policy run amok.

The front page of The New York Times last Sunday told the story of Henry Cisneros, the top housing official in the Bill Clinton administration during the 1990s. As The Times described it, Mr. Cisneros "loosened mortgage restrictions so first-time home buyers could qualify for loans they could never get before." He then went on to join the boards of a major U. S. home builder and of Countrywide Financial, the mortgage lender at the epicentre of the U. S. subprime mortgage collapse. (For a list of some of the references used in the writing of this article, see www.financialpost.com/fpcomment.)

The Clinton administration set out to boost home ownership in the United States, then languishing at 64%. It began earnest work on the project under Mr. Cisneros, the first Hispanic to head the Department of Housing and Urban Development (HUD). As a direct result of HUD policy, families no longer had to prove five years of income. All they needed was three. Lenders no longer had to interview borrowers face to face.
Reacting to these and other more significant policy changes, lenders such as Countrywide Financial in California set up units to service these so-called subprime borrowers. "We were trying to be creative," Mr. Cisneros told the Times.

Mr. Cisneros sat on Countrywide's board for years. He left last year, shortly before the mortgage giant, on the brink of bankruptcy, was taken over by Bank of America. It had $170-billion in mortgage assets, most of them subprime. Countrywide CEO Angelo R. Mozilo is now the target of regulators, politicians and the media.

During Mr. Cisneros' time on the board he sat on Countrywide's regulatory committee, overseeing compliance with law and regulation, but he says he does not now recall seeing reports that one in eight of Countrywide's loans was "severely unsatisfactory" due to shoddy underwriting.

But Mr. Cisneros, the Clinton administration, Countrywide, and eventually the Bush administration, were merely vehicles for radical mortgage lending ideas promoted by housing activist groups and political operators. The full horror story of the regulation-induced breakdown in U. S. mortgage markets is to be told in a forthcoming new book, Housing America: Building Out of a Crisis, from the Independent Institute in Oakland, California.

The idea that the U. S. subprime mortgage collapse is the product of unscrupulous agents and lenders roaming the country in a deregulated market searching for ignorant buyers is overwhelmed by contrary facts. In "Anatomy of a Train Wreck: Causes of the Mortgage Meltdown" (a chapter in Housing America), University of Texas economics professor Stan J. Liebowitz documents the deliberate government policies that were brought in to destroy U. S. mortgage lending standards, pillage banks and socialize risk.

It begins with the idea, long held among activists, that U. S. mortgage lending practices prevented home ownership among low income and certain racial groups. Banks and other lenders, following normal and prudent lending standards, were accused of discrimination. The solution: Relax lending standards. One step along that road was the 1977 U. S. Community Reinvestment Act, forcing banks to lend equally to all geographic areas, regardless of risk.

A later pivotal event in the decline in lending standards, according to Prof. Liebowitz, came after 1992 when the Federal Reserve Bank of Boston conducted a study that purported to show that racial and income discrimination in mortgage lending continued to exist. It accused lenders of applying "arbitrary or unreasonable measures of creditworthiness."

Prof. Liebowitz says the study was based on "horribly mangled data" and was riddled with errors. But it was politically correct, and it rose to become the basis for new national mortgage lending standards that ignored essential principles of mortgage lending. Lack of credit history should not be a negative factor. Traditional income-to-loan ratios of 28-to-36% should not apply to low-income individuals. Maybe 50% would do. Lenders should not discriminate against certain sources of income, such as short-term unemployment insurance benefits.
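
To see what moving that ratio does to loan sizes, a small sketch; the income, rate and term are placeholders, and the 50% figure simply stands in for the relaxed standard described above:

    # How loosening the income-to-payment ratio inflates the qualifying loan.
    def max_loan(annual_income: float, ratio: float,
                 rate: float = 0.065, years: int = 30) -> float:
        """Largest principal whose monthly payment stays under ratio * income."""
        r, n = rate / 12, years * 12
        payment_cap = annual_income / 12 * ratio
        return payment_cap * (1 - (1 + r) ** -n) / r  # inverted amortization formula

    income = 40_000  # placeholder household income
    print(round(max_loan(income, 0.28)))  # traditional front-end ratio: ~$148,000
    print(round(max_loan(income, 0.50)))  # relaxed standard: ~$264,000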

The new risk paradigm "comports completely with common sense," said Boston Fed President Richard Syron, a former head of Freddie Mac, the U.S. government mortgage backer.

The impact of these new "equal opportunity" lending guidelines -- described as "flexible underwriting standards" -- was dramatic, says Prof. Liebowitz. "As you might guess, when government regulators bark, banks jump. Banks began to loosen lending standards. And loosen and loosen and loosen, to the cheers of politicians, regulators and GSEs."

GSEs are Government Sponsored Entities, U. S. federal agencies charged with lending and insuring mortgages. Fannie Mae and Freddie Mac -- long charged with facilitating home ownership -- became out-of-control participants in the sub-standard lending explosion that followed. With the Boston Fed and other risk studies apparently showing that low-income mortgages issued under new lax standards were just as sound over time as prime mortgages, by 2001 and 2002 the U. S. housing market had become a subprime rampage.

Politicians in Congress fueled the explosion. The incestuous relationship between Congress and the GSEs is now well known, but still conveniently underplayed. For years, Fannie Mae and Freddie Mac -- right up to their multi-trillion-dollar bankruptcy and seizure by the U. S. government earlier this year -- roared out of control. They funded politicians' interests, kept mortgage interest rates low and became government-backed agencies for trillions of dollars in risky mortgage lending.

Between 1995 and 2007, the combined balance sheet of Fannie Mae and Freddie Mac, including Mortgage Backed Securities (MBS), rose from $1.4-trillion to $4.9-trillion, an annual increase of almost 15%. At $4.9-trillion, the value of these risks in the two GSEs is slightly less than the total public debt of the U. S. government. About $1-trillion of Fannie/Freddie activity involved exposure to subprime and lower-grade mortgages. For an overview of the rise and fall of Fannie Mae and Freddie Mac, including their corrupt links to Congress, see The Last Trillion-Dollar Commitment: The Destruction of Fannie Mae and Freddie Mac, by Peter Wallison and Charles Calomiris, for the American Enterprise Institute.

In 2004, then Fed Chairman Alan Greenspan warned of the looming risk in the government-backed mortgage lenders. Fannie and Freddie, he said, were taking on trillions in mortgage assets without properly accounting for, or charging for, the risk embedded in the new lax lending standards. In 2005, Mr. Greenspan explicitly warned of "systemic risk" if the two operations failed to change their ways.

Some tightening of Fannie and Freddie activity did occur, in part as a reaction to accounting scandals within the agencies, but growth exploded again in 2007. At the end of 2007, the two agencies accounted for 75% of new residential mortgage lending. James Lockhart, head of the Office of Federal Housing Enterprise Oversight, which regulates the two organizations, estimated they would soon be financing or guaranteeing 90% of new mortgages, a near monopoly.

On the surface, the mortgage push succeeded. Home ownership expanded from 65% to 69%, although most observers now believe the whole exercise was undesirable, an impossible extension of the American Dream. There inevitably must be essential financial and economic limits to home ownership. The apparent success was based on fundamentally unsound regulatory policy and massive amounts of government-backed funding.

The rapid and dramatic rise in ownership, fuelled by mortgage credit, produced big increases in home prices. As prices rose, the risk flaws were overshadowed. This gave rise to even greater use of credit, as speculators and buyers piled onto a credit and ownership machine that seemed to offer no risk and guaranteed gains. No-down-payment mortgages or even 110% mortgages were easily obtained. Rising prices, fuelled by easy credit, spilled the housing bubble into the prime U. S. housing market.

Not all funding came via government and regulatory overreach. Hundreds of billions were raised through MBS and other risk-distribution vehicles by private players. But here too heavy doses of regulatory input and support played a significant role. The mortgages issued under new sloppy and risky lending standards were assembled and packaged and then sold as AAA-rated securities.

Rating agencies were given their power over investment and lending decisions by government. Government required financial institutions, such as insurance and investment funds, to invest in securities rated by Nationally Recognized Statistical Rating Organizations approved by the SEC. Only three met with SEC approval: S&P, Moody's and Fitch. Lack of genuine market competition in the ratings business, and its dependence on regulation for authority, have long been seen as hazards no regulator would take on.

The ratings firms, assured of business, fell into the lending assessment trap laid out by the Boston Fed and others. They became part of the social program. The new mortgage myth claimed that subprime mortgages issued under lax standards to low income Americans were no more risky than prime mortgages, especially when they were packaged into large agglomerations. Says Prof. Liebowitz: "Given that government-approved rating agencies were protected from competition, it might be expected that these agencies would not want to create political waves by rocking the mortgage boat, endangering a potential loss of their protected profits." Ratings inflation ensued, with AAA and other high ratings accorded to all manner of high-risk mortgage products.

Investment houses became part of the regulatory imperative. A sales pitch from Bear Stearns, circa 1998, tells investors that the old mortgage lending rules have been replaced and that mortgages granted under Community Reinvestment Act provisions are as safe as prime mortgages. "Do we automatically exclude or severely discount...loans [with poor credit scores]? Absolutely not," said Bear Stearns.

This giant circle of risk, constructed around regulation and government policy, worked all too well. It worked, mostly, because of rising home prices that covered over the risk.

Crucial to the story is the role of another U. S. government agency, the Federal Reserve under Alan Greenspan. Mr. Greenspan's low-interest-rate policy -- which brought the Fed funds rate to 1% through much of 2003 -- helped push home values up even higher. As values rose, the lax lending standards started to look even better. Look, Ma, no loan failures!

The new paradigm seemed to be working even better than expected, making rating agencies, Fannie, Freddie, investment dealers, bankers and MBS packagers and buyers even more aggressive and more confident that the new flexible lending standards--built on the premise of ever-rising home prices--were bulletproof. Countrywide Financial, whose board former HUD director Henry Cisneros sat on, was by 2007 the largest provider of loans purchased by the government-controlled Fannie Mae, accounting for 29% of its business.

Greenspan now downplays his role in propelling the mortgage boom, the economy and the risk profile of ever-higher debt. He blames a mysterious "tsunami of risk" and, in testimony before a Congressional committee Thursday, market failure. He claimed "shocked disbelief" that bankers and institutions could have failed to properly assess risks in mortgage securities.

Blaming bankers gets Mr. Greenspan (author of last year's best-selling memoir, The Age of Turbulence) off the hook for events for which others clearly feel he bears much responsibility. That certainly is the view of Anna Schwartz, the 92-year-old coauthor with Milton Friedman of the 1963 classic A Monetary History of the United States.

In an interview with The Wall Street Journal last Saturday, Ms. Schwartz noted that the house-price boom began with the very low interest rates in the early years of this decade under Mr. Greenspan. "Now," she said, "Alan Greenspan has issued an epilogue to his memoir." In it, Ms. Schwartz says, Mr. Greenspan concedes "it's true that monetary policy was expansive. But there was nothing that a central bank could do in those circumstances. The market would have been very much displeased, if the Fed had tightened and crushed the boom. They would have felt that it wasn't just the boom in the assets that was being terminated."

In other words, says Ms. Schwartz, Mr. Greenspan "absolves himself. There was no way you could really terminate the boom because you'd be doing collateral damage to areas of the economy that you don't really want to damage." Ms. Schwartz adds, "I don't think that that's an adequate kind of response to those who argue that absent accommodative monetary policy, you would not have had this asset-price boom."

The mortgage and financial crisis now sweeping the world is the product of a colossal build-up of unintended consequences brought on by government policy and regulation. Other regulatory rules accelerated the meltdown, including post-Enron mark-to-market accounting rules and international bank capital standards that were brought in without adequate thought or preparation.

What we are witnessing today, as governments pile on massive new rounds of intervention, is a growing pyramid of government failures. It will take more than James Bond to break the pyramid.