Archive for January 17th, 2010

High Tech Makeover in Store for Nation’s Power Transmission Lines

The nation's power transmission lines are out of date and lose a significant amount of energy. Scientists at the U.S. Department of Energy's Brookhaven National Laboratory are abuzz over new evidence that could help the U.S. save a good chunk of the energy currently lost in transmission. The DOE estimates that, as of 2001, the nation's antiquated transmission and distribution systems were together losing about 9.5% of the electricity they carried, and things haven't gotten any better since then.

The Brookhaven breakthrough is evidence that electronic liquid-crystal states can exist within a high-temperature superconductor. In practical terms, that means it may one day be possible to build power lines that lose no power at all. There's a long way to go before the rubber meets the road on this one, though. The next step is to see whether the material maintains its capabilities under real-world conditions that can be applied to the Smart Grid of the future.



Visit the original post at: Energy News

Masdar and Boeing’s Qatari catch-up

Not to be upstaged by a similar announcement last week, a Masdar-led consortium today reiterated its own project to make jet fuel from saltwater plants, this time with an airline partner.


Visit the original post at: Energy News

Indonesian Kinetic Energy Harvesting Device Claims 150 W Output



A team of students led by Harus Laksana Guntur, an engineering lecturer at the Sepuluh Nopember Institute of Technology (ITS) in Surabaya, the capital of East Java, has developed a mobile kinetic-energy charger that charges your cellphone every time you move. Unlike a battery or solar charger, this new charger uses the user's own motion to recharge mobile phones.


Visit the original post at: Energy News

Three Cool Concepts For Urban Biking


We've seen several bike-related concepts in the past few weeks. If alternative transportation is indeed on the rise, bikes will become a larger part of our transportation mix. These three concepts may help make bikes a more effective and widely accepted option.

Copenhagen Wheel

A group of MIT researchers developed the Copenhagen Wheel, a versatile electric bicycle wheel that made its debut last month in Copenhagen during the COP-15 summit. The wheel combines a regenerative brake, a battery, an electric motor, a variety of sensors, and a Bluetooth connection. Pairing regenerative braking with electric-assist acceleration makes it easier for bike commuters to deal with starts and stops. With the Copenhagen Wheel, the bike can also track speed and distance traveled, monitor local smog conditions, and track the proximity of friends. The wheel also acts as a smart lock to prevent unauthorized use of the bike.

YikeBike Mini-farthing

Weighing in at 10 kg (22 lbs), the YikeBike is a small folding electric scooter with a large-wheel/small-wheel combination like a “penny-farthing.” It folds into a space of 150 x 600 x 600 mm (approximately 6 x 24 x 24 inches) and can be unfolded and ready to ride in about 15 seconds. The YikeBike has a range of 9-10 km (5.5-6 miles). It is expected to be commercially available soon at a cost of around 3,500 Euros (roughly US$5,000).

Underground Bike Storage

In an area where bike commuting is already extensive, storage for hundreds of bike riders becomes a problem. In Japan, the Eco Cycle is an underground storage facility with a capacity of 144 bikes (18 bikes on each of 8 levels). The automated system can retrieve any bike within 10 seconds, making it quick and convenient. A one-month pass for the Eco Cycle garage costs about US$30 (2,600 yen).

Thanks again, John B!


Visit the original post at: EcoGeek.org

Is linoleum green?

I was surprised to discover linoleum isn't a petrochemical-based plastic but is made mostly from plant materials. Learn more about what goes into linoleum and its environmental street cred.


Visit the original post at: Green living tips


UOP, Masdar Institute, Boeing and Etihad Airways Establish Sustainable Aviation Biofuels Project Using Integrated Saltwater Agricultural Systems; Support from Global Seawater Inc.

An example of an ISAS operation: Seawater Farms Eritrea. Source: Dr. Carl Hodges.

UOP LLC, a Honeywell company, the Masdar Institute of Science and Technology in Abu Dhabi, Boeing and Etihad Airways are establishing a research institute in Abu Dhabi—the Sustainable Bioenergy Research Project (SBRP)—that will use integrated saltwater agricultural systems (ISAS) to support the development and commercialization of biofuel sources for aviation and co-products.

Boeing and UOP last year commissioned a study on the sustainability of a leading family of saltwater-tolerant plants (halophytes) as candidates for renewable jet fuel. The Masdar Institute of Science and Technology led the study, which examined the overall potential for sustainable, large-scale production of biofuels made from Salicornia bigelovii and saltwater mangroves. (Earlier post.)

As part of the initial agreement signed by the partners on 17 January at the World Future Energy Summit in Abu Dhabi, the SBRP will undertake research projects that combine the arid and salt-rich environment of Abu Dhabi with innovative and promising saltwater farming practices. The Masdar Institute will host the SBRP and provide laboratory and demonstration facilities both within and outside of Masdar City.

The SBRP team will focus on an ISAS approach, which is a highly efficient system for producing liquid and solid biofuels, capturing and holding carbon from the atmosphere, enlarging habitats to increase biodiversity, and simultaneously releasing fresh water for higher value uses such as drinking water. ISAS also has the potential to reduce the impacts of sea level rise on coastal communities.

The integrated approach uses saltwater to create an aquaculture-based farming system in parallel with the growth of mangrove forests and Salicornia, a plant that thrives in salty water. These biomass sources can be sustainably harvested and used to generate clean energy, aviation biofuels and other products. The closed-loop system converts aquaculture effluent into an affordable, nutrient-rich fertilizer for both plant species. The approach has been pioneered by Dr. Carl Hodges of Global Seawater Inc., who has been engaged as special advisor to the project.

The development of low-cost, non-petroleum fertilizers is one of the keys to achieving genuine carbon emissions reductions from any biofuel source. The seawater farming concept has been successfully implemented in Mexico and Northern Africa by Global Seawater, which will provide advice and insight to support the SBRP in Abu Dhabi.



Visit the original post at: Transportation News

PRTM: Operational Gains Can Help Drive Li-ion Cost Reduction Exceeding 50% by 2020, with Plug-in Vehicle Adoption of 10%

A series of recent reports—one from the National Research Council (NRC) (earlier post) and another from the Boston Consulting Group (earlier post)—concluded that an expected continuing high cost of lithium-ion batteries will dampen mass market adoption of plug-in vehicles.

However, Oliver Hazimeh, Director and head of the global e-Mobility Practice at PRTM, a global management consulting firm, suggests that total lithium-ion battery cost reductions exceeding 50% by 2020 are feasible without technology breakthroughs, primarily through operational gains, assuming EV adoption of approximately 10% of new vehicles sold by 2020 to support volume manufacturing. (PRTM contributed to the Electrification Roadmap released last November by the Electrification Coalition. Earlier post.)

Total cost of ownership parity, infrastructure availability and additional government environmental policies will drive plug-in vehicle adoption of more than 10% by 2020, Hazimeh says.

The majority of these battery cost reductions can be achieved through optimizing design and operations across areas including production/manufacturing, supply chain and product development:

  • Production/Manufacturing Process Optimization: As battery production volumes increase, process optimization will drive improvements in production yield rates. Coupled with scale efficiencies that producers will achieve as they supply packs in volumes exceeding 500,000 units per year and cells in volumes exceeding 200 million units per year, PRTM expects that production process optimization alone will yield a 20-25% reduction in battery costs by 2020, Hazimeh says.

  • Supply Chain Design: Another example of operational enhancements expected to yield significant improvement in battery costs is in the extended supply chain. Cell manufacturers, for example, will likely see 10-15% cost reductions through pooling material spends and optimizing the design of their supply chains in phases as volumes increase, according to PRTM.

  • Product Complexity and Design: As more OEMs release an increasing number of electric vehicles into the market, optimization of the design platform will reduce product complexity, which will further reduce battery costs. During this early ramp-up phase, a number of different cell and pack design platforms will emerge due to unique OEM and vehicle requirements. However, Hazimeh says, as de facto standards emerge, battery producers will optimize their design platforms, which will drive additional production and supply chain benefits with the potential of additional cost reductions of up to 10%.

    On the product design front, ongoing innovation and improvements in material advancement and battery cell and pack design to increase battery performance and reduce functional cost will yield an additional 10% in cost reduction.
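
Do those levers really combine to more than 50%? A quick back-of-the-envelope check is sketched below in Python (our own arithmetic, not PRTM's model; the article does not say whether the individual reductions are additive against today's baseline or compound sequentially, so both readings are shown):

    # Rough sanity check on the headline ">50% by 2020" figure.
    # Midpoints of the ranges quoted above; treating them as exact
    # values is our assumption, not PRTM's.
    levers = {
        "production/manufacturing": 0.225,  # 20-25% range
        "supply chain design":      0.125,  # 10-15% range
        "design platform":          0.10,   # "up to 10%"
        "product design/materials": 0.10,   # "an additional 10%"
    }

    # Reading 1: each percentage is taken against today's baseline cost.
    additive = sum(levers.values())

    # Reading 2: each cut applies to the cost remaining after the last one.
    remaining = 1.0
    for cut in levers.values():
        remaining *= 1.0 - cut
    compounded = 1.0 - remaining

    print(f"additive total:   {additive:.1%}")    # 55.0%
    print(f"compounded total: {compounded:.1%}")  # 45.1%

At the midpoints, only the additive reading clears 50%, which suggests the quoted percentages are meant against today's baseline cost rather than applied sequentially.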

PRTM’s analysis does not factor in additional reductions that could be created by local government manufacturing incentives.

10% of US Drivers Willing to Consider Plug-in Purchase

Separately, a recent survey by Ernst & Young's Global Automotive Center found that more than 10% of US drivers would consider purchasing a plug-in hybrid or electric vehicle. The report canvassed the views of a thousand licensed American drivers to gauge consumer awareness of, and interest in, plug-in hybrid and electric vehicles in the market.

Based on this sample, the figure would equate to approximately 20 million American drivers who are favorably disposed toward purchasing plug-in hybrid and electric vehicles. The survey is part of Ernst & Young's advanced vehicle powertrain initiative, which focuses on the business opportunities and issues companies face in the development of alternative transportation solutions.

Mike Hanley, Global Automotive Leader at Ernst & Young LLP, commented that although only 10% of the drivers responded positively to purchasing plug-in hybrid or electric vehicles, that is a significant number for a powertrain technology that is not yet widely available, and one that should not be ignored.

As the survey suggests, electric vehicles have an opportunity to make a significant entrance into the US public consciousness over the next few years. Even if only a small portion of the 10% of survey respondents who said they would consider a plug-in hybrid or electric vehicle when introduced are serious, there would still be more than enough demand to sell out the 2010 and 2011 production runs of the major and new manufacturers, while buying crucial time to build out infrastructure and increase public awareness.

—Mike Hanley

Some of the biggest challenges highlighted in the survey for advancing the popularity of new powertrain technologies from the niche into the mainstream are access to charging stations, battery driving range and vehicle costs.

While there are clear barriers to consumers fully embracing these technologies, 34% of survey participants said they would support subsidies for local charging stations, further illustrating that a significant number of drivers recognize the future benefits of plug-in hybrid and electric vehicles. Other key findings of the survey included:

  • Public awareness of emerging powertrain technologies remains weak across the US.

  • Few consumers are willing to embrace the new technology before it is well established in the market.

  • No other plug-in hybrid and electric vehicle incentive or benefit is considered nearly as important as saving money on fuel.

  • Among several considerations, access to charging stations, battery driving range and vehicle cost are by far the three most significant consumer concerns.


Visit the original post at: Transportation News

Toyota Launching G Sports Conversion Series; Includes Hybrids

Toyota plans to begin a gradual rollout in Japan in mid-2010 of its newly developed “G Sports” (G) sports conversion car series. The G series name is based on the G associated with classic Toyota sports models like the Toyota 2000GT and the Celica GT-Four, and the series may include G versions of hybrids such as the Prius, or the GRMN sports hybrid concept shown at the Tokyo Auto Salon, one of the world's largest customized-car shows.

The GRMN Sports Hybrid concept.

The series, to be sold through Toyota-brand dealers, is meant to offer a wider range of people the “automotive seasoning” that TMC has pursued through its GAZOO Racing activities.

The G series is designed for customers who strongly desire to own a unique vehicle, offering them a personalized interior and exterior design along with sports-driving performance. TMC has directly designed this sports conversion series, instead of the conventional approach of outsourcing development to a customization firm.

Tokyo Auto Salon 2010 Prius G Sports Concept.

The base vehicle of the G series is equipped with internal and external components that include sports suspension, aerodynamic parts and racing-style seats. In addition to the FT-86 Concept rear-wheel drive compact sports car and the Lexus LFA shown at the 2009 Tokyo Motor Show, plans call for TMC’s sports model lineup to expand through the introduction of conversion models and sports-trim models of mass-produced vehicle series, such as the GRMN and G series.

Toyota says that it intends to help foster a motorsports culture by also actively participating in grassroots motorsports events, as well as Formula Nippon and SUPER GT in Japan and NASCAR in the United States.


Visit the original post at: Transportation News

BMW Enhances the 2011 3 Series; Direct Injection Twin-scroll Turbo I-6 for the 335i

BMW N55 inline six-cylinder gasoline engine with TwinPower Turbo, High Precision Injection and Valvetronic.

BMW has announced substantial enhancements to the 2011 3 Series Coupe and Convertible, including style updates for 328i and 335i models and an all-new direct gasoline injection turbocharged engine with Valvetronic for 335i models. The 2011 3 Series Coupe and Convertible models will go on sale in Spring 2010, and pricing will be announced closer to the on-sale date.

BMW 3 Series Coupe and Convertible customers have a choice of two inline-6 engines. The 328i features 230 hp (172 kW) while the 335i has 300 hp (224 kW) and is the first inline-6 equipped with a single twin-scroll turbocharger, BMW’s Valvetronic throttle-less intake technology, High Precision direct fuel injection, and all-aluminum construction. (The engine is also being applied in the 2011 135i Coupe and Convertible.)

Cutaway view of the direct-injection inline-6.

The 335i's new inline-6 engine (designated N55) displaces 3.0 liters and develops its maximum output of 300 hp at 5,800 rpm—70 hp more than the 2011 328i models—with peak torque of 300 lb-ft (407 N·m) available all the way from 1,200–5,000 rpm. Redline is 7,000 rpm. This is the same level of performance as the previous 3.0-liter twin-turbo inline-6, but with the application of twin-scroll technology and the integration of Valvetronic, the new engine is more fuel efficient.

The 335i Coupe and Convertible can accelerate from 0 – 60 mph in 5.3 (5.5) seconds and 5.5 (5.7) seconds, respectively, when equipped with the manual (automatic) transmission.

The twin-scroll charger.

The new N55 is the first BMW inline-6 to combine turbocharging, High Precision direct fuel injection, and Valvetronic variable intake technology. It features a single, mid-sized turbocharger with a “twin-scroll” housing to boost performance and minimize response lag. Thanks to a housing design that maintains proper separation between the streams of exhaust gases, the turbocharger builds up pressure much faster than previous-generation turbochargers, all but eliminating any tendency to lag.

The N55 turbocharged inline-6 weighs approximately 150 lbs. less than an equally powerful eight-cylinder engine displacing 4.0 liters. This lower weight means a significant advantage not only in fuel economy, but also in balancing the car’s weight distribution.

Using Valvetronic for the first time on a turbocharged inline-6 allows intake air to be metered at the valves with virtually no delay and with reduced pumping losses. As a result, the engine makes power more quickly than ever before—shown by the N55's ability to reach peak torque at 1,200 rpm, 200 rpm earlier than its predecessor.

Turbocharging typically includes intercooling of the engine’s induction air, that is, cooling the compressed air that emerges, very much heated up by the compression process, from the turbocharger(s). Sometimes it’s done with coolant; in the case of the N55 engine, it’s accomplished with outside air.

Intercooling is necessary to reduce the temperature of the incoming air to preclude knocking that can reduce power or, in the extreme, damage the engine. The N55, like all other current BMW engines, is equipped with knock control as part of the Digital Motor Electronics (DME) engine management system. On the N55 engine, the DME is now mounted directly to the top of the engine for better packaging and weight savings.

The significant loads and cylinder pressures of the 300 hp N55 required the use of an aluminum engine structure with cast-iron cylinder sleeves. Altogether, the N55 weighs about 427 lbs (194 kg). The 335i Coupe and Convertible's dual exhaust system runs along both sides of the vehicle. At low loads, a flap channels most of the gas through one side to reduce low-frequency exhaust rumble. The 335i Coupe and Convertible feature an air-to-oil external oil cooler mounted in one wheel well, as opposed to the N52's coolant-to-oil unit.

The new N55 engine is also able to achieve a more favorable emissions signature than its predecessor. The single turbocharger has only one exhaust path and feeds a single catalytic converter in place of the previous engine’s two. This means the exhaust gases are concentrated at the catalytic converter for better cold-start emissions performance.

The N52. The 328i Convertible is powered by BMW’s 230 hp, 200 lb-ft, 3.0-liter inline-6 engine, known internally as the N52. Its magnesium/aluminum construction and Valvetronic variable valve lift are features found only on BMW engines. The N52 achieves impressive progress on all performance and technology fronts, especially in its remarkably light weight of 357 lbs. An aluminum/magnesium engine block, hollow camshafts, plastic camshaft cover, improved combustion chambers, a further evolved Double VANOS (VAriable NOckenwellen Steuerung = variable camshaft control, or variable valve timing), higher fuel injection pressure, sophisticated engine electronics, an electric coolant pump, a variable-volume oil pump and an oil/coolant heat exchanger are the other weight-saving features and improvements of this engine over previous generations.


Visit the original post at: Transportation News

Newsweek: Dianne Feinstein vs. solar power
Protect the environment or create renewable energy? A new bill shows they’re far from the same thing.

Visit the original post at: MSNBC.com: Environment

2009 temperatures by Jim Hansen


This is Hansen et al.'s end-of-year summary for 2009 (with a couple of minor edits).

If It’s That Warm, How Come It’s So Damned Cold? 

 
by James Hansen, Reto Ruedy, Makiko Sato, and Ken Lo
 
The past year, 2009, tied as the second warmest year in the 130 years of global instrumental temperature records, in the surface temperature analysis of the NASA Goddard Institute for Space Studies (GISS). The Southern Hemisphere set a record as the warmest year for that half of the world. Global mean temperature, as shown in Figure 1a, was 0.57°C (1.0°F) warmer than climatology (the 1951-1980 base period). Southern Hemisphere mean temperature, as shown in Figure 1b, was 0.49°C (0.88°F) warmer than in the period of climatology.


Figure 1. (a) GISS analysis of global surface temperature change. Green vertical bar is estimated 95 percent confidence range (two standard deviations) for annual temperature change. (b) Hemispheric temperature change in GISS analysis. (Base period is 1951-1980. This base period is fixed consistently in GISS temperature analysis papers – see References. Base period 1961-1990 is used for comparison with published HadCRUT analyses in Figures 3 and 4.)

The global record warm year, in the period of near-global instrumental measurements (since the late 1800s), was 2005. Sometimes it is asserted that 1998 was the warmest year. The origin of this confusion is discussed below. There is a high degree of interannual (year-to-year) and decadal variability in both global and hemispheric temperatures. Underlying this variability, however, is a long-term warming trend that has become strong and persistent over the past three decades. The long-term trends are more apparent when temperature is averaged over several years. The 60-month (5-year) and 132-month (11-year) running mean temperatures are shown in Figure 2 for the globe and the hemispheres. The 5-year mean is sufficient to reduce the effect of the El Niño – La Niña cycles of tropical climate. The 11-year mean minimizes the effect of solar variability – the brightness of the sun varies by a measurable amount over the sunspot cycle, which is typically of 10-12 year duration.


Figure 2. 60-month (5-year) and 132-month (11-year) running mean temperatures in the GISS analysis of (a) global and (b) hemispheric surface temperature change. (Base period is 1951-1980.)
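
For readers who want to reproduce that smoothing, a running mean of this kind is a one-liner. Here is a minimal Python sketch (ours, not GISS code); the anomaly array is a placeholder, not real data:

    import numpy as np

    def running_mean(x, window):
        """Centered moving average; output is shorter by window - 1 points."""
        return np.convolve(x, np.ones(window) / window, mode="valid")

    # Placeholder for 130 years of monthly global-mean anomalies (deg C).
    anomalies = np.random.normal(loc=0.2, scale=0.2, size=130 * 12)

    smooth_5yr  = running_mean(anomalies, 60)   # 60-month (5-year) mean
    smooth_11yr = running_mean(anomalies, 132)  # 132-month (11-year) mean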

There is a contradiction between the observed continued warming trend and popular perceptions about climate trends. Frequent statements include: “There has been global cooling over the past decade.” “Global warming stopped in 1998.” “1998 is the warmest year in the record.” Such statements have been repeated so often that most of the public seems to accept them as being true. However, based on our data, such statements are not correct. The origin of this contradiction probably lies in part in differences between the GISS and HadCRUT temperature analyses (HadCRUT is the joint Hadley Centre/University of East Anglia Climatic Research Unit temperature analysis). Indeed, HadCRUT finds 1998 to be the warmest year in their record. In addition, popular belief that the world is cooling is reinforced by cold weather anomalies in the United States in the summer of 2009 and cold anomalies in much of the Northern Hemisphere in December 2009. Here we first show the main reason for the difference between the GISS and HadCRUT analyses. Then we examine the 2009 regional temperature anomalies in the context of global temperatures.


Figure 3. Temperature anomalies in 1998 (left column) and 2005 (right column). Top row is GISS analysis, middle row is HadCRUT analysis, and bottom row is the GISS analysis masked to the same area and resolution as the HadCRUT analysis. [Base period is 1961-1990.]

Figure 3 shows maps of GISS and HadCRUT 1998 and 2005 temperature anomalies relative to base period 1961-1990 (the base period used by HadCRUT). The temperature anomalies are at a 5-degree-by-5-degree resolution for the GISS data to match that in the HadCRUT analysis. In the lower two maps we display the GISS data masked to the same area and resolution as the HadCRUT analysis. The “masked” GISS data let us quantify the extent to which the difference between the GISS and HadCRUT analyses is due to the data interpolation and extrapolation that occurs in the GISS analysis. The GISS analysis assigns a temperature anomaly to many gridboxes that do not contain measurement data, specifically all gridboxes located within 1200 km of one or more stations that do have defined temperature anomalies.

The rationale for this aspect of the GISS analysis is based on the fact that temperature anomaly patterns tend to be large scale. For example, if it is an unusually cold winter in New York, it is probably unusually cold in Philadelphia too. This fact suggests that it may be better to assign a temperature anomaly based on the nearest stations for a gridbox that contains no observing stations, rather than excluding that gridbox from the global analysis. Tests of this assumption are described in our papers referenced below.
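
To make the rule concrete, here is a toy Python version of that gridbox-filling step (a sketch of the idea, not the actual GISTEMP code). It mimics the linearly tapering distance weight described in Hansen and Lebedeff (1987); the function names and station format are ours:

    import numpy as np

    EARTH_RADIUS_KM = 6371.0

    def great_circle_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        p1, p2 = np.radians(lat1), np.radians(lat2)
        dlon = np.radians(lon2 - lon1)
        cos_angle = (np.sin(p1) * np.sin(p2)
                     + np.cos(p1) * np.cos(p2) * np.cos(dlon))
        return EARTH_RADIUS_KM * np.arccos(np.clip(cos_angle, -1.0, 1.0))

    def gridbox_anomaly(box_lat, box_lon, stations, radius_km=1200.0):
        """Distance-weighted mean anomaly of all stations within radius_km.

        stations is a list of (lat, lon, anomaly) tuples. Returns None when
        no station is close enough, i.e. the gridbox stays undefined, as in
        the GISS analysis. Weights taper linearly to zero at 1200 km.
        """
        weights, values = [], []
        for lat, lon, anom in stations:
            d = great_circle_km(box_lat, box_lon, lat, lon)
            if d < radius_km:
                weights.append(1.0 - d / radius_km)
                values.append(anom)
        if not weights:
            return None
        return float(np.average(values, weights=weights))

    # Example: a Philadelphia gridbox filled from a New York station.
    print(gridbox_anomaly(40.0, -75.2, [(40.7, -74.0, -2.1)]))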


Figure 4. Global surface temperature anomalies relative to 1961-1990 base period for three cases: HadCRUT, GISS, and GISS anomalies limited to the HadCRUT area. [To obtain consistent time series for the HadCRUT and GISS global means, monthly results were averaged over regions with defined temperature anomalies within four latitude zones (90N-25N, 25N-Equator, Equator-25S, 25S-90S); the global average then weights these zones by the true area of the full zones, and the annual means are based on those monthly global means.]

Figure 4 shows time series of global temperature for the GISS and HadCRUT analyses, as well as for the GISS analysis masked to the HadCRUT data region. This figure reveals that the differences that have developed between the GISS and HadCRUT global temperatures during the past few decades are due primarily to the extension of the GISS analysis into regions that are excluded from the HadCRUT analysis. The GISS and HadCRUT results are similar during this period, when the analyses are limited to exactly the same area. The GISS analysis also finds 1998 as the warmest year, if analysis is limited to the masked area.

The question then becomes: how valid are the extrapolations and interpolation in the GISS analysis? If the temperature anomaly scale is adjusted such that the global mean anomaly is zero, the patterns of warm and cool regions have realistic-looking meteorological patterns, providing qualitative support for the data extensions. However, we would like a quantitative measure of the uncertainty in our estimate of the global temperature anomaly caused by the fact that the spatial distribution of measurements is incomplete. One way to estimate that uncertainty, or possible error, is to use the complete time series of global surface temperature data generated by a global climate model that has been demonstrated to have realistic spatial and temporal variability of surface temperature. We can sample this data set at only the locations where measurement stations exist, use this sub-sample of data to estimate global temperature change with the GISS analysis method, and compare the result with the “perfect” knowledge of global temperature provided by the data at all gridpoints.

                         1880-1900   1900-1950   1960-2008
Meteorological Stations    0.2         0.15        0.08
Land-Ocean Index           0.08        0.05        0.05

Table 1. Two-sigma error estimate versus period for meteorological stations and land-ocean index.

Table 1 shows the derived error due to incomplete coverage of stations. As expected, the error was larger at early dates when station coverage was poorer. Also the error is much larger when data are available only from meteorological stations, without ship or satellite measurements for ocean areas. In recent decades the 2-sigma uncertainty (95 percent confidence of being within that range, ~2-3 percent chance of being outside that range in a specific direction) has been about 0.05°C. The incomplete coverage of stations is the primary cause of uncertainty in comparing nearby years, for which the effect of more systematic errors such as urban warming is small.

Additional sources of error become important when comparing temperature anomalies separated by longer periods. The most well-known source of long-term error is “urban warming”, human-made local warming caused by energy use and alterations of the natural environment. Various other errors affecting the estimates of long-term temperature change are described comprehensively in a large number of papers by Tom Karl and his associates at the NOAA National Climatic Data Center. The GISS temperature analysis corrects for urban effects by adjusting the long-term trends of urban stations to be consistent with the trends at nearby rural stations, with urban locations identified either by population or satellite-observed night lights. In a paper in preparation we demonstrate that the population and night light approaches yield similar results on global average. The additional error caused by factors other than incomplete spatial coverage is estimated to be of the order of 0.1°C on time scales of several decades to a century, this estimate necessarily being partly subjective. The estimated total uncertainty in global mean temperature anomaly with land and ocean data included thus is similar to the error estimate in the first line of Table 1, i.e., the error due to limited spatial coverage when only meteorological stations are included.

Now let's consider whether we can specify a rank among the recent global annual temperatures, i.e., which year is warmest, second warmest, etc. Figure 1a shows 2009 as the second warmest year, but it is so close to 1998, 2002, 2003, 2006, and 2007 that we must declare these years as being in a virtual tie as the second warmest year. The maximum difference among these in the GISS analysis is ~0.03°C (2009 being the warmest among those years and 2006 the coolest). This range is approximately equal to our 1-sigma uncertainty of ~0.025°C, which is the reason for stating that these five years are tied for second warmest.

The year 2005 is 0.061°C warmer than 1998 in our analysis. So how certain are we that 2005 was warmer than 1998? Given the standard deviation of ~0.025°C for the estimated error, we can estimate the probability that 1998 was warmer than 2005 as follows. The chance that 1998 is 0.025°C warmer than our estimated value is about (1 – 0.68)/2 = 0.16. The chance that 2005 is 0.025°C cooler than our estimate is also 0.16. The probability of both of these is ~0.03 (3 percent). Integrating over the tail of the distribution and accounting for the 2005-1998 temperature difference being 0.061°C alters the estimate in opposite directions. For the moment let us just say that the chance that 1998 is warmer than 2005, given our temperature analysis, is no more than about 10 percent. Therefore, we can say with a reasonable degree of confidence that 2005 is the warmest year in the period of instrumental data.
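
That back-of-the-envelope estimate is easy to check directly. The short Python sketch below (our arithmetic, not GISS code) assumes independent Gaussian errors of 0.025°C on each annual value and computes the tail probability that 1998 actually exceeded 2005:

    from math import erf, sqrt

    sigma = 0.025   # 1-sigma error on each annual anomaly (deg C)
    diff  = 0.061   # 2005 minus 1998 in the GISS analysis (deg C)

    # Error on the difference of two independent estimates.
    sigma_diff = sqrt(2) * sigma          # ~0.035 deg C

    # P(1998 warmer than 2005) = P(true difference < 0),
    # i.e. the Gaussian upper-tail probability beyond z.
    z = diff / sigma_diff
    p = 0.5 * (1.0 - erf(z / sqrt(2)))

    print(f"{p:.1%}")   # about 4%

The direct calculation gives roughly 4 percent, comfortably inside the “no more than about 10 percent” bound quoted above.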


Figure 5. (a) Global map of December 2009 anomaly; (b) global map of Jun-Jul-Aug 2009 anomaly. #4 and #2 indicate that December 2009 and JJA 2009 are the 4th and 2nd warmest globally for those periods.

What about the claim that the Earth's surface has been cooling over the past decade? That issue can be addressed with a far higher degree of confidence, because the error due to incomplete spatial coverage of measurements becomes much smaller when averaged over several years. The 2-sigma error in the 5-year running-mean temperature anomaly shown in Figure 2 is about a factor of two smaller than the annual mean uncertainty, thus 0.02-0.03°C. Given that the change of 5-year-mean global temperature anomaly is about 0.2°C over the past decade, we can conclude that the world has become warmer over the past decade, not cooler.

Why are some people so readily convinced of a false conclusion, that the world is really experiencing a cooling trend? That gullibility probably has a lot to do with regional short-term temperature fluctuations, which are an order of magnitude larger than global average annual anomalies. Yet many lay people do understand the distinction between regional short-term anomalies and global trends. For example, here is a comment posted by “frogbandit” at 8:38 p.m. on 1/6/2010 on the City Bright blog:

“I wonder about the people who use cold weather to say that the globe is cooling. It forgets that global warming has a global component and that its a trend, not an everyday thing. I hear people down in the lower 48 say its really cold this winter. That ain’t true so far up here in Alaska. Bethel, Alaska, had a brown Christmas. Here in Anchorage, the temperature today is 31[ºF]. I can’t say based on the fact Anchorage and Bethel are warm so far this winter that we have global warming. That would be a really dumb argument to think my weather pattern is being experienced even in the rest of the United States, much less globally.”

What frogbandit is saying is illustrated by the global map of temperature anomalies in December 2009 (Figure 5a). There were strong negative temperature anomalies at middle latitudes in the Northern Hemisphere, as great as −8°C in Siberia, averaged over the month. But the temperature anomaly in the Arctic was as great as +7°C. The cold December perhaps reaffirmed an impression gained by Americans from the unusually cool 2009 summer. There was a large region in the United States and Canada in June-July-August with a negative temperature anomaly greater than 1°C, the largest negative anomaly on the planet.


Figure 6. Arctic Oscillation (AO) Index. Positive values of the AO index indicate high pressure in the polar region and thus a tendency for strong zonal winds that minimize cold air outbreaks to middle latitudes. Blue dots are monthly means and the red curve is the 60-month (5-year) running mean.

How do these large regional temperature anomalies stack up against an expectation of, and the reality of, global warming? How unusual are these regional negative fluctuations? Do they have any relationship to global warming? Do they contradict global warming?

It is obvious that in December 2009 there was an unusual exchange of polar and mid-latitude air in the Northern Hemisphere. Arctic air rushed into both North America and Eurasia, and, of course, it was replaced in the polar region by air from middle latitudes. The degree to which Arctic air penetrates into middle latitudes is related to the Arctic Oscillation (AO) index, which is defined by surface atmospheric pressure patterns and is plotted in Figure 6. When the AO index is positive, surface pressure is high in the polar region. This helps the middle latitude jet stream to blow strongly and consistently from west to east, thus keeping cold Arctic air locked in the polar region. When the AO index is negative, there tends to be low pressure in the polar region, weaker zonal winds, and greater movement of frigid polar air into middle latitudes.

Figure 6 shows that December 2009 was the most extreme negative Arctic Oscillation since the 1970s. Although there were ten cases between the early 1960s and mid 1980s with an AO index more extreme than −2.5, there were no such extreme cases from then until last month. It is no wonder that the public has become accustomed to the absence of extreme blasts of cold air.


Figure 7. Temperature anomaly from GISS analysis and AO index from NOAA National Weather Service Climate Prediction Center. United States mean refers to the 48 contiguous states.

Figure 7 shows the AO index with greater temporal resolution for two 5-year periods. It is obvious that there is a high degree of correlation of the AO index with temperature in the United States, with any possible lag between index and temperature anomaly less than the monthly temporal resolution. Large negative anomalies, when they occur, are usually in a winter month. Note that the January 1977 temperature anomaly, mainly located in the Eastern United States, was considerably stronger than the December 2009 anomaly. [There is nothing magic about a 31-day window that coincides with a calendar month, and it could be misleading. It may be more informative to look at a 30-day running mean and at the Dec-Jan-Feb means for the AO index and temperature anomalies.]

The AO index is not so much an explanation for climate anomaly patterns as it is a simple statement of the situation. However, John (Mike) Wallace and colleagues have been able to use the AO description to aid consideration of how the patterns may change as greenhouse gases increase. A number of papers, by Wallace, David Thompson, and others, as well as by Drew Shindell and others at GISS, have pointed out that increasing carbon dioxide causes the stratosphere to cool, in turn causing on average a stronger jet stream and thus a tendency for a more positive Arctic Oscillation. Overall, Figure 6 shows a tendency in the expected sense. The AO is not the only factor that might alter the frequency of Arctic cold air outbreaks. For example, what is the effect of reduced Arctic sea ice on weather patterns? There is not enough empirical evidence since the rapid ice melt of 2007. We conclude only that December 2009 was a highly anomalous month and that its unusual AO can be described as the “cause” of the extreme December weather.

We do not find a basis for expecting frequent repeat occurrences. On the contrary, Figure 6 shows that month-to-month fluctuations of the AO are much larger than its long-term trend. But temperature change can be caused by greenhouse gases and global warming independent of Arctic Oscillation dynamical effects.


Figure 8. Global maps of 4-season temperature anomalies for ~2009. (Note that Dec is December 2008. Base period is 1951-1980.)


Figure 9. Global maps of 4-season temperature anomaly trends for the period 1950-2009.

So let's look at recent regional temperature anomalies and temperature trends. Figure 8 shows seasonal temperature anomalies for the past year and Figure 9 shows seasonal temperature change since 1950 based on local linear trends. The temperature scales are identical in Figures 8 and 9. The outstanding characteristic in comparing these two figures is that the magnitude of the 60-year change is similar to the magnitude of seasonal anomalies. What this is telling us is that the climate dice are already strongly loaded. The perceptive person who has been around since the 1950s should be able to notice that seasonal mean temperatures are usually greater than they were in the 1950s, although there are still occasional cold seasons.

The magnitude of monthly temperature anomalies is typically 1.5 to 2 times greater than the magnitude of seasonal anomalies. So it is not yet quite so easy to see global warming if one’s figure of merit is monthly mean temperature. And, of course, daily weather fluctuations are much larger than the impact of the global warming trend. The bottom line is this: there is no global cooling trend. For the time being, until humanity brings its greenhouse gas emissions under control, we can expect each decade to be warmer than the preceding one. Weather fluctuations certainly exceed local temperature changes over the past half century. But the perceptive person should be able to see that climate is warming on decadal time scales.

This information needs to be combined with the conclusion that global warming of 1-2°C has enormous implications for humanity. But that discussion is beyond the scope of this note.

References:
Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372.
Hansen, J., R. Ruedy, J. Glascoe, and Mki. Sato, 1999: GISS analysis of surface temperature change. J. Geophys. Res., 104, 30997-31022.
Hansen, J.E., R. Ruedy, Mki. Sato, M. Imhoff, W. Lawrence, D. Easterling, T. Peterson, and T. Karl, 2001: A closer look at United States and global surface temperature change. J. Geophys. Res., 106, 23947-23963.
Hansen, J., Mki. Sato, R. Ruedy, K. Lo, D.W. Lea, and M. Medina-Elizade, 2006: Global temperature change. Proc. Natl. Acad. Sci., 103, 14288-14293.


Visit the original post at: Environment News

Climate Crock video on cold snap vs. global warming

Another excellent video by Peter Sinclair, the guy who proved former TV weatherman Anthony Watts knows as much about copyright laws as about climate science.


Visit the original post at: Environment News
