Archive for September, 2011

Cloud chamber demonstration

September 27, 2011

This is just too cool to pass up.

Despite it being common knowledge in the scientific community, many people don’t know that there is radioactive decay going on all around them – and it’s perfectly safe. Background radiation comes from many natural sources, including the radioactive potassium-40 in the bananas you eat.

Radioactivity comes in three common “flavors”, though there are a number of other decay processes. In alpha decay, an atom ejects a helium nucleus (He²⁺) and decays to a lighter element; beta decay releases an electron; and gamma radiation is the release of a highly energetic photon. Alpha particles can be blocked by a piece of paper, beta particles by a thin sheet of aluminum, and gamma radiation is only attenuated by a slab of lead. Luckily, background radiation tends to be mostly of the alpha variety.
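If you want a feel for the shielding numbers, here’s a minimal sketch of the standard exponential attenuation law for gamma rays, I = I₀·exp(−μx). The attenuation coefficient below is an illustrative placeholder rather than a measured value, and note that alpha particles, which stop dead after a short fixed range, don’t follow this law at all.

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Fraction of gamma radiation passing through a shield,
    using the exponential attenuation law I = I0 * exp(-mu * x)."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative placeholder coefficient for lead (NOT a measured value);
# real values depend strongly on the photon energy.
MU_LEAD = 1.2  # 1/cm

for thickness in (0.5, 1.0, 2.0, 5.0):
    frac = transmitted_fraction(MU_LEAD, thickness)
    print(f"{thickness:4.1f} cm of lead -> {frac:6.2%} transmitted")
```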

In the video below, a cloud chamber is used to detect background radiation, as well as to illustrate the radioactivity of Americium and radon gas.

A cloud chamber is a sealed container, usually containing supersaturated alcohol vapor. Supersaturation means that the relative humidity of the vapor is greater than 100%. While this seems impossible, it’s in fact a common occurrence in clouds (you can get supersaturations in excess of a few percent in the vigorous updrafts of a thunderstorm). Water cannot spontaneously condense without the aid of a condensation nucleus – a particle like sodium chloride, for example – because of the energy required to overcome surface tension, among other things. If you supersaturate a chamber of water vapor and then inject condensation nuclei, a cloud will form instantly.
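To put numbers on “greater than 100%”, here’s a minimal sketch using the Magnus approximation for the saturation vapor pressure of water: cool a moist parcel without letting anything condense, and the relative humidity climbs past saturation.

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Saturation vapor pressure over liquid water (hPa),
    via the Magnus approximation; good for everyday temperatures."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity_pct(vapor_pressure_hpa, temp_c):
    """Relative humidity in percent; above 100% means supersaturated."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure_hpa(temp_c)

# A parcel holding 24 hPa of water vapor is subsaturated at 25 C,
# but supersaturated if cooled to 15 C without condensing.
for temp_c in (25.0, 20.0, 15.0):
    print(f"{temp_c:4.1f} C -> RH = {relative_humidity_pct(24.0, temp_c):5.1f}%")
```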

A cloud chamber operates on a similar principle. Without condensation nuclei in the isolated chamber, you can reach high supersaturations without producing condensation droplets. As an atom undergoes radioactive decay, the radiation ionizes the supersaturated vapor. These ions act as condensation nuclei and essentially trace the path of the emitted particle through the chamber with little cloud streaks.

Very cool.

Defending Science, Part 1

September 5, 2011

As scientists, it’s our job to pose important questions, investigate them thoroughly, and analyze them honestly. But I don’t believe it’s enough to let the fruits of our discovery lie fallow in a journal, hoping to be picked up by the media – which they won’t be, unless they can be twisted into headlines that run counter to the evidence-based narrative, facts be damned.

This latest scandal involving a paper coauthored by Dr. Roy Spencer, of UAH satellite infamy*, is a textbook example of how dysfunctional our media have become and of the state of the “controversy” in climate science. For those who are out of the loop, courtesy of Prof. Michael Ashley via Prof. Scott Mandia’s blog:

Have Spencer & Braswell found a significant difference between observations and the IPCC models?

No. Their article contains a number of errors that have since been identified by climate scientists. These errors range from the trivial (using the wrong units for the radiative flux anomaly), to the serious (treating clouds as the cause of climate change, rather than resulting from day-to-day weather; comparing a 10 year observational period with a 100 year model period and not allowing for the spread in model outputs).

Within three days of the publication of Spencer & Braswell 2011, two climate scientists (Kevin Trenberth & John Fasullo) repeated the analysis and showed that the IPCC models are in agreement with the observations, thus refuting Spencer & Braswell’s claims. An independent analysis by Andrew Dessler also confirms the Trenberth & Fasullo result.

Furthermore, Trenberth and Fasullo showed that the better-performing IPCC models were distinguished by their ability to track the El Niño-Southern Oscillation, not by their climate sensitivity as claimed by Spencer & Braswell.

In other words, there is no evidence from the 10 years of satellite data that forecasts of global warming are too high. There are additional problems with the article, but these new analyses are sufficient to invalidate the conclusions made by Spencer & Braswell.

This paper was published in Remote Sensing, a journal primarily for geographers that does not deal in climate or atmospheric science. You may ask yourself – why would someone with a climate science paper choose a non-climate science journal? Because, as Kevin Trenberth points out at RealClimate, it would probably not even have survived peer review at one. The solution: choose a journal with little or no expertise in the subject and hope it gets published.

It does, and the right-wing media feeding frenzy commences.

Christian Post: Scientist Says His Study May Disprove Global Warming

Fox Nation: New NASA Data Blow Gaping Hole in Global Warming Alarmism

Investor's Business Daily: Junk Science Unravels

FoxNews.com: Does NASA Data Show Global Warming Lost in Space?

Newsmax: NASA Study: Global Warming Alarmists Wrong

Hot Air: Sky-high hole blown in AGW theory?

Daily Mail: Climate change far less serious than 'alarmists' predict says NASA scientist

Let me note that Roy Spencer is not a NASA scientist. None of these articles called his credibility into question, either.

*Roy Spencer is most famous for the UAH satellite controversy, which for years called into question the reliability of our surface temperature records. Qiang Fu and others at the University of Washington discovered serious flaws in Spencer’s analysis, reported in Fu et al., 2004. The analysis was updated accordingly and is now consistent with the other temperature records. Roy also recently stated that he views his job “a little like a legislator, supported by the taxpayer, to protect the interests of the taxpayer and to minimize the role of government.” If I were a journalist I would consider these noteworthy details, but then again, none of these news outlets seem to consider themselves journalists.

But those are small details; the rest of the coverage is like mistaking the pile of pebbles in your backyard for the Rockies.

There are so many things wrong with these stories, but the most egregious is the media’s complete misunderstanding of climate change, and of science in general. A single study in a single journal (especially an obscure one unrelated to the study’s subject matter) does not unravel what is now a mountain of evidence. Over 100 years ago, Arrhenius had already discovered why the earth is warmer than it should be, given the amount of radiation it receives from the sun: greenhouse gases. They are the reason we are a temperate planet with a mild diurnal temperature range; why the moon, devoid of this protective layer, swings between hellishly hot and insufferably cold; and why Venus is a sweltering inferno.

By the late 1980s, computing technology and our understanding of the climate system had progressed to the point that we could model it, albeit crudely, and see our future unfolding before us. A long-debunked talking point says that climate models do not replicate reality – we know, in fact, that this is complete and utter B.S. Even the simplistic model that James Hansen used in his 1988 study shows decadal warming similar to what has since been observed.

[Figure: climate model simulations over the instrumental record, from the IPCC Third Assessment Report (2001).]

That is just one of many lines of inquiry into climate change attribution, but the key principle is basic radiation physics – gases absorb and emit radiation at specific wavelengths. Greenhouse gases absorb and emit radiation at the peak emission wavelengths of the earth. They absorb this radiation and re-emit it both back to the surface and out to space, raising the surface temperature of the earth to an equilibrium higher than it would be in their absence. You can, with as simple or as complex a model as you like, see what happens when you increase the concentration of these gases. Hint: things get warmer, and they have not stopped getting warmer according to all five major surface temperature and satellite records. In fact, I will show in a subsequent post that models are underestimating major changes in the climate system.
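As a taste of the simple end of that spectrum, here’s a minimal zero-dimensional energy-balance sketch: the planet absorbs sunlight and emits like a grey body, and a lower effective emissivity serves as a crude stand-in for greenhouse gases trapping more outgoing longwave radiation. The emissivity values are illustrative, not fitted to anything.

```python
# Zero-dimensional energy balance:
#   absorbed solar = emitted longwave
#   S0 * (1 - albedo) / 4 = emissivity * sigma * T**4

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.30     # planetary albedo

def equilibrium_temperature(emissivity):
    """Equilibrium surface temperature (K) for an effective emissivity."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

# emissivity = 1.0 is a bare blackbody planet (~255 K); lowering it,
# i.e. adding greenhouse gases, warms the surface toward today's ~288 K.
for eps in (1.0, 0.80, 0.61):
    print(f"effective emissivity {eps:.2f} -> {equilibrium_temperature(eps):5.1f} K")
```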

The idea that this one study, with its simplistic model tuned to give an answer, could topple decades and millions of hours of research is absurd – but it sells a hell of a lot more headlines, and it placates everyone’s latent hope that global warming isn’t happening and we really don’t have anything to worry about.

97.5% of climate scientists agree that human activities are increasing the planet’s temperatures.

Naomi Oreskes’ groundbreaking 2004 survey of all published, peer-reviewed studies of climate change between 1993 and 2003 found that not one…single…paper…rejected the fact that humans are causing global warming.

What has happened here?

The level of abuse that climate science has suffered at the hands of the media and a certain political party is nothing compared to the abuse that this planet has taken and is going to suffer in the coming century. Even now, as top presidential candidates call scientists frauds, deride global warming “alarmists”, and pray for rain that never comes, we are greeted with ever more graphic images and disturbing details of the state of our climate system.

What has happened, and what can we do to fix it?

In this coming series of posts, I will dissect the political landscape and the psychology of denial, and examine the current state of our climate and where it is headed. In doing so, I hope to gain insight into ways that scientists and science advocates can engage and inform the populace and turn the tide against misinformation.

Brace yourself, because it’s going to be a hard landing.

Turtles all the way down

September 5, 2011

In light of recent and ongoing events – the Spencer and Braswell 2011 debacle in Remote Sensing and everyone’s continued misunderstanding of climate models – I’d like to kick off this new blog with some thoughts about modeling in science.

Whether one is a chemist or a climate scientist, a soil engineer or an astrophysicist, we all use models to understand the world around us. The ideal gas law, which is sufficiently accurate for gases close to standard temperature and pressure, assumes gas molecules are point masses with no volume and treats all collisions as elastic – what my thesis advisor best describes as “volume-less tennis balls flying around”. It’s an elegantly derived model that is remarkably accurate for the atmosphere, but even though it’s a law, it’s still a model.
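To see how far those tennis balls take you, here’s a minimal sketch using the meteorological form of the ideal gas law, p = ρR_dT, which recovers the textbook sea-level density of air.

```python
R_DRY_AIR = 287.05  # specific gas constant for dry air, J kg^-1 K^-1

def air_density(pressure_pa, temperature_k):
    """Dry-air density (kg/m^3) from the ideal gas law, p = rho * R_d * T."""
    return pressure_pa / (R_DRY_AIR * temperature_k)

# Standard sea-level pressure and temperature give the familiar ~1.225 kg/m^3.
print(f"{air_density(101325.0, 288.15):.3f} kg/m^3")
```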

As we try to understand more complex systems, we must build more complex models. The current state-of-the-art climate models are some of the most complex and computationally demanding simulations produced by humanity, and they integrate countless hours of scientific research and understanding – from the aerosol processes used to model cloud physics to the radiation subroutines handling absorption, scattering, and emission across so many wavelengths that they consume half of the computing time. These models, like our simplest models, are derived from basic physics – the laws of thermodynamics, conservation of momentum and mass, etc. – and from empirical measurements.

But all models, whether a simple one-dimensional climate model or a state-of-the-art simulation, serve some utility. The big, complex models are hard for scientists to analyze: the more processes you include, the higher the resolution, and the fewer simplifying assumptions you make, the harder it is to figure out what is going on and what matters. Not impossible, just very hard.

But the simpler the model, the less likely it is to capture the details. A state-of-the-art simulation can represent internal variability and produce ENSO signals, while a one-dimensional model cannot. However, that does not mean that the one-dimensional model is “wrong”. Indeed, both models will tell you that as you increase the concentration of greenhouse gases in the atmosphere, you will raise the average surface temperature of the earth.

There is nothing wrong with simple models. As Einstein said, “Make everything as simple as possible, but not simpler”. The simpler your model is, the easier it is to understand, as fundamental relationships will become more obvious. Simplify too much, though, and the model loses all utility.

Consider three models of the earth: the earth is flat; the earth is a sphere; and the earth is an oblate spheroid, wider across the equatorial plane than along its axis of rotation.

The “earth is flat” model is wrong. It is too simple a model, built on sparse observations and too many faulty assumptions. It seems like common sense to our eyes, so long as we don’t question our logic too deeply. This model does still state that we exist on the surface of something – perhaps its only redeeming aspect – but it will prevent us from understanding physics, especially gravity and astronomy. A model that is too simplistic is actually a detriment to our understanding. See: geocentrism.

What about “earth is a sphere”? Well, it isn’t really spherical. It’s actually a little wider on the equatorial plane than on the plane of its axis of rotation. Is this model of the earth “wrong”?

I don’t think so. As Isaac Asimov described in “The Relativity of Wrong”, there is a spectrum of “right” and “wrong”, and as humans, our models are going to fall somewhere on this spectrum.

Wrong |--(earth is flat)----------------(earth is spherical)--(earth is an oblate spheroid)--| Right

The “earth is spherical” model is mostly right, and very close to “earth is an oblate spheroid”. The latter captures reality much better: the distance between lines of latitude is constant on a sphere (about 111 km for a spherical earth), but on the real earth it varies with latitude because of the equatorial bulge. There are implications for gravity as well. Depending on your application, “earth is a sphere” may be a perfectly sufficient model, since it simplifies calculations.
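To put numbers on that, here’s a minimal sketch comparing the length of one degree of latitude on a sphere with the same quantity on the WGS84 ellipsoid (via the meridional radius of curvature); the two differ by roughly a kilometer between equator and pole.

```python
import math

# WGS84 ellipsoid parameters
A = 6378137.0            # equatorial radius, m
F = 1.0 / 298.257223563  # flattening
E2 = F * (2.0 - F)       # first eccentricity squared

# On a sphere of radius 6371 km, a degree of latitude is constant:
SPHERE_KM_PER_DEG = 2.0 * math.pi * 6371.0 / 360.0  # ~111.2 km

def ellipsoid_km_per_degree(lat_deg):
    """Length of one degree of latitude (km) on the WGS84 ellipsoid,
    using the meridional radius of curvature."""
    s = math.sin(math.radians(lat_deg))
    m = A * (1.0 - E2) / (1.0 - E2 * s * s) ** 1.5
    return m * math.pi / 180.0 / 1000.0

for lat in (0, 30, 60, 90):
    print(f"lat {lat:2d}: {ellipsoid_km_per_degree(lat):7.3f} km/deg on the"
          f" ellipsoid vs {SPHERE_KM_PER_DEG:7.3f} km/deg on the sphere")
```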

For my research, the error introduced by assuming a constant distance between latitudes is negligible compared to the scale of the processes I want to understand. For an engineering team managing remote sensing satellites – such as GRACE, which maps gravitational differences through tiny changes in the satellites’ motion – the sphere assumption is too simple. It introduces enough error to compromise the data from the mission and hinder our understanding.

Spencer and Braswell 2011 is an example of using too simple a model with dubious assumptions based on poor evidence.

Give me a model of the earth with more than a few free parameters, and I will demonstrate that it’s turtles all the way down.