Open Mind

Not Computer Models

August 17, 2009 · 162 Comments

Denialists love to denigrate computer models of earth’s climate. In my opinion they only do this because they’re in denial of the result, not because of any valid evidence. They also love to make the false claim that without computer models there’s no reason to believe that global warming is real, and is going to get worse.


The term “computer model” refers to an actual simulation of earth’s climate, often in remarkable detail. Such models are (of course!) not able to predict, or even post-dict, the chaotic aspects of the system (the weather), but they do an outstanding job of post-dicting the global statistical characterization of the system (the climate).

But such models and their results are not the topic of this post; I’d like to take a look at some simple models which are not computer models. They’re simple mathematical models of changes in global temperature, and although I’ve used a computer to do the arithmetic, I could have done so without a computer and they are most decidedly not computer models.

Global temperature responds to changes in the energy flow of earth’s climate system. When more energy flows through the system the planet heats up; with less energy flow the planet cools down. Changes in the energy flow constitute climate forcings. We know of many, including greenhouse gases, solar changes, ozone, snow albedo, land use, aerosols (both from volcanoes and from industrial processes), etc. We even have estimates of their magnitude for more than a century. For my mathematical models I’ll use estimates of climate forcing from NASA GISS; the data cover the time span from 1880 through 2003. Here’s the forcing data:

[Figure: Forcings (GISS estimates of individual climate forcings, 1880 through 2003)]

The only positive (warming) forcing which has reached a level greater than 1 W/m^2 (watt per square meter) above 1880 levels is greenhouse-gas forcing. The net forcing is strongly positive, but if we leave out greenhouse-gas forcing it’s not:

[Figure: NetForce (net climate forcing, with and without greenhouse gases)]
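For readers who want to reproduce the net-forcing curves, here is a minimal sketch. The file name and column labels below are placeholders, not the actual layout of the GISS table.

    import numpy as np

    # Hypothetical layout: a "year" column followed by one column per
    # forcing, in W/m^2.  Column names here are assumptions, not the
    # actual GISS headers.
    data = np.genfromtxt("giss_forcings_1880_2003.txt", names=True)

    forcings = [name for name in data.dtype.names if name != "year"]
    net_all = sum(data[name] for name in forcings)

    # Net forcing excluding the well-mixed greenhouse gases.
    net_no_ghg = sum(data[name] for name in forcings if name != "WM_GHG")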

Many investigations of the relationship between climate forcing and global temperature use regression models. For such models, the response to climate forcing is treated as being instantaneous; there’s no (or at least negligible) lag between forcing and response. But that’s not a realistic model; there are many different components to the climate system, each with a different time scale. Hence in addition to simple regression models, I’ll also use a two-box model which allows for both a “prompt” response to climate forcing and a long-term response. This can be thought of as a rough mimicry of an atmosphere-ocean model, where the atmosphere responds quickly while the ocean takes much longer. I’ll allow the atmosphere to respond very quickly (in a single year) while for the oceans I’ll use a timescale of 30 years.
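For reference, one common way to write such a two-box energy-balance system is the pair of coupled equations below; the notation is an assumption on my part, not necessarily the exact formulation behind the fits in this post.

    C_1 \frac{dT_1}{dt} = F(t) - \lambda T_1 - \beta (T_1 - T_2)

    C_2 \frac{dT_2}{dt} = \beta (T_1 - T_2)

Here T_1 and T_2 are the fast-box (atmosphere) and slow-box (ocean) temperatures, C_1 and C_2 their heat capacities, lambda a radiative feedback parameter, and beta a heat-exchange coefficient; the solution for T_1 is a linear combination of a fast and a slow exponential response to the forcing.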

First let’s look at simple regression models, regressing global temperature against net climate forcing:

[Figure: 1boxAll (regression of global temperature on net climate forcing, all forcings)]

This model does show an overall increase in global temperature due to an overall increase in climate forcing. But two things are apparent: 1) the long-term trend is underestimated; and 2) the short-term dips (due to volcanic eruptions) are overestimated. It also indicates a climate sensitivity of only 0.25 deg.C/(W/m^2) (this is sensitivity to a forcing increase of 1 W/m^2, not sensitivity to doubling of CO2 concentration). That’s too low a value; the Stefan-Boltzmann equation alone implies a sensitivity of 0.3 deg.C/(W/m^2). Physics tells us that feedbacks in the climate system will make actual climate sensitivity greater than the Stefan-Boltzmann value, and paleoclimate evidence (ice age cycles) requires it.
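As a rough illustration, a single-regressor fit like this one can be done in a few lines; the slope is the implied sensitivity in deg.C per W/m^2. The variable names are placeholders for annual temperature and net-forcing series.

    import numpy as np

    def simple_regression(temp, net_forcing):
        """Regress temperature anomaly on net forcing.

        temp:        annual global temperature anomalies (deg C)
        net_forcing: net climate forcing (W/m^2), same years
        Returns the implied sensitivity (slope) and the intercept.
        """
        X = np.column_stack([np.ones_like(net_forcing), net_forcing])
        (intercept, sensitivity), *_ = np.linalg.lstsq(X, temp, rcond=None)
        return sensitivity, intercept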

The reason for the misfit is clear. Volcanic eruptions cause a huge negative (cooling) forcing, but the volcanic aerosols responsible for it settle out of the atmosphere within a year or two, so the effect doesn’t last long; the response to a volcanic eruption is notable but not huge. With only one response “coefficient” (this is a regression model with a single regressor, net forcing), matching the volcanic dips requires keeping that coefficient small enough not to produce immense cooling after every eruption. But a coefficient that small would show almost no long-term trend at all, so the observed long-term warming pushes the coefficient to be big enough to produce at least some trend.

The net result is that the regression coefficient is somewhere between the small coefficient of short-term response which matches volcanic changes, and the much larger coefficient of long-term response which matches global warming. As a result, the impact of volcanic eruptions is overestimated and the long-term trend is underestimated.

If we do a similar model using just greenhouse-gas forcing we get a decent (but not impressive) match to the long-term trend but the volcanic dips are now absent:

[Figure: 1boxGHG (regression of global temperature on greenhouse-gas forcing only)]

The match to warming during the early 20th century isn’t very good. Nonetheless it does show some early-20th-century warming, which indicates that claims that greenhouse gases had no impact until just recently are mistaken. Greenhouse-gas forcing is probably not the main driver of early-20th-century warming, but it’s not an insignificant factor either. This model also shows even smaller climate sensitivity than the all-forcings regression model, a mere 0.24 deg.C/(W/m^2). However, the match to the long-term trend since 1975 is not bad, indicating that greenhouse-gas forcing may well be the main driver of global warming over the last 35 years or so.

If we use all climate forcings except greenhouse gases we get this:

[Figure: 1boxNoGHG (regression of global temperature on all forcings except greenhouse gases)]

Now the fit is dreadful! Also, the coefficient is of the wrong sign: the climate sensitivity using this model is -0.13 deg.C/(W/m^2), indicating that an increase in climate forcing causes cooling! What has happened is that without greenhouse-gas forcing, there’s simply no explanation for modern warming, so the model has fit the only variation it can find (modern volcanic eruptions) to recent warming, but had to reverse the sign to get any kind of match at all. Of course this model is nonsense.

If we try a simple regression model using just solar forcing, we can get a decent fit to early-20th-century warming but not to recent warming:

[Figure: 1boxSol (regression of global temperature on solar forcing only)]

While the fit to the early-20th-century trend is not bad, the fluctuations of the solar (11-year) cycle just don’t match. This indicates that the coefficient from this model is probably too big — solar-cycle changes don’t cause that much warming and cooling. This is confirmed by examination of the coefficient from this model, indicating climate sensitivity of a whopping 1.73 deg.C/(W/m^2).

The fact is that while solar forcing is responsible for some of the early-20th-century warming, it didn’t cause all of it. The regression-on-solar-only model therefore must inflate the sensitivity to get a match. Even with inflated sensitivity it still doesn’t reproduce recent warming; solar output hasn’t increased.

The upshot of all the simple regression model results is that simple regression models (corresponding to prompt response to climate forcing) just don’t do the trick. Too little prompt response fails to show the trend we’ve observed, while too much prompt response exaggerates the effect of volcanic eruptions. Clearly we need to combine prompt response with long-term response in order to get a realistic picture of temperature change; that’s the purpose of the two-box model.
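Here is a minimal sketch of how the two regressors can be constructed, assuming annual forcing values and a 30-year time constant for the slow box; the variable names are placeholders.

    import numpy as np

    def two_box_regressors(net_forcing, tau_slow=30.0):
        """Build (prompt, slow) regressors from annual net forcing.

        prompt: the forcing itself (response within a single year).
        slow:   forcing passed through an exponential relaxation with
                time constant tau_slow (years), mimicking a box that
                equilibrates over roughly 30 years.
        """
        slow = np.zeros(len(net_forcing))
        for t in range(1, len(net_forcing)):
            slow[t] = slow[t - 1] + (net_forcing[t] - slow[t - 1]) / tau_slow
        return net_forcing, slow

    def fit_two_box(temp, net_forcing, tau_slow=30.0):
        """Multiple regression of temperature on prompt and slow terms."""
        prompt, slow = two_box_regressors(net_forcing, tau_slow)
        X = np.column_stack([np.ones_like(prompt), prompt, slow])
        coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
        return coeffs  # [intercept, prompt coefficient, slow coefficient]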

In fact the two-box model using all climate forcings does a good job fitting the observed data:

[Figure: 2boxAll (two-box model, all climate forcings)]

Both the early-20th-century and recent warmings are well modeled; the greatest mismatch is during the 1937-1945 period. It’s possible that some (if not most) of this mismatch is in the temperature data rather than the model, a result of the “bucket problem” with ocean temperature data-collection before and during World War II.

Another encouraging result of this model is that it yields a climate sensitivity of 0.73 deg.C/(W/m^2) (a sensitivity of 2.7 deg.C to doubling of CO2). This is well in line with the results of computer models; the GISS modelE shows a sensitivity of 0.68 deg.C/(W/m^2).
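For reference, the per-doubling figure follows from the conventional estimate of roughly 3.7 W/m^2 of forcing for a doubling of CO2 (an assumption on my part, not a number quoted in the post):

    \Delta T_{2\times\mathrm{CO_2}} \approx 0.73 \; \mathrm{^{\circ}C\,(W/m^2)^{-1}} \times 3.7 \; \mathrm{W/m^2} \approx 2.7 \; \mathrm{^{\circ}C}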

If we fit a two-box model but leave out greenhouse-gas forcing we get this:

[Figure: 2boxNoGHG (two-box model, all climate forcings except greenhouse gases)]

Once again, without greenhouse-gas forcing the fit is dreadful. The only reason it matches recent warming is that it has turned the long-term forcing coefficient upside-down, and this model also shows a nonsensical negative climate sensitivity of -0.77 deg.C/(W/m^2). It emphasizes that without greenhouse-gas climate forcing (which is the result of human activity), climate models — even non-computer models like these — just don’t fit.

I’ll close with a two-box model which includes all climate forcings and also includes the southern oscillation index to mimic some of the short-term changes in temperature. The SOI isn’t really a climate forcing, since it doesn’t add or remove energy from the ocean-atmosphere system; it reflects an exchange between ocean and atmosphere. Nonetheless it is related to changes in surface temperature (which is what we’re modeling). I’ve allowed a 7-month lag for SOI to have an effect. I also computed a model allowing SOI to have a long-term effect, just as though it were a climate forcing — but the long-term impact of the SOI turned out not to be statistically significant. Here’s the result of the model including all forcings and short-term SOI effects:

[Figure: 2boxAllSOI (two-box model, all climate forcings plus short-term SOI term)]

The fit is excellent. The climate sensitivity indicated by this model is 0.69 deg.C/(W/m^2), extremely close to the sensitivity estimated from the (much more sophisticated) GISS modelE (computer) model of 0.68.
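A sketch of how the lagged SOI term described above might enter the regression as an extra column (monthly data assumed; the names are placeholders):

    import numpy as np

    def add_lagged_soi(X, soi, lag_months=7):
        """Append SOI, lagged by lag_months, as an extra regressor.

        X:   (n, k) design matrix of forcing-based regressors (monthly)
        soi: length-n monthly Southern Oscillation Index
        """
        lagged = np.empty_like(soi)
        lagged[:lag_months] = soi[0]            # pad the first few months
        lagged[lag_months:] = soi[:-lag_months]
        return np.column_stack([X, lagged])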

Many of you have seen the graphs (included in the IPCC reports) showing that using computer models, we can reproduce temperature history if we include human factors, but not if we omit them. That’s such powerful evidence that we’re the cause of global warming, it’s no wonder denialists have tried so hard to slander computer models and to insist that without them there’s no solid evidence of man-made global warming. The truth is that you don’t need computer models to show this. Even with very simple mathematical models (and these models are indeed simple) the result is the same. Without human causation, there’s no explanation for the global warming we’ve already observed. With human causation, there’s no explanation for a lack of global warming.

Categories: Global Warming

162 responses so far

  • lucia // August 17, 2009 at 11:33 pm | Reply

    Tamino–
    What did you select for the heat transfer coefficient between the two boxes (Beta in your previous post)?

    Also, what do you impose for the initial conditions for the temperature in both boxes?

    I’ve fiddled a bit with these but I’ve always wished I had ocean temperature data to compare the temperature of both boxes against. (That is, in my mind, I assumed the slow box represents the temperature of a “deep ocean” box while the surface behavior is modeled by the “fast” box.)

    Two boxes is more realistic than one, but you end up with quite a few fiddle factors with two boxes.

    (I do agree that no matter how many boxes we have, we can’t explain the overall trend without ghgs.)

    [Response: It isn't necessary to estimate the heat transfer coefficient or specific heats or heat contents to apply this model to observed data. The exact solution of the two-box model is that temperature of each box is a linear combination of the "prompt" forcing and the 30-year-time-constant integrated forcing (with different coefficients for the two boxes). The coefficients of the linear combination for surface temperature were determined by multiple regression of surface temperature on those two series. So there are no "fiddle" parameters at all except the time constants. For the rapid response, it hardly matters to the final answer as long as it's kept short. For the slow response I assumed 30 years (didn't try other possibilities) because the GISS modelE has an equilibration time scale of about 30 years.

    It seems to me that a reasonable interpretation is that the fast box is the atmosphere and the slow box is the upper ocean. I suspect the time scale for the deep ocean is much longer than 30 years.]

  • John Mashey // August 17, 2009 at 11:44 pm | Reply

    Great exposition, thanks.

  • greenfyre // August 18, 2009 at 12:43 am | Reply

    Unless I am seriously misguided (decidedly possible and testable as a hypothesis) or computer technology has gone in radically new directions that I am unaware of (also possible), there really is no such thing as a “computer model” per se, in that they don’t actually do anything that could not be done with an infinite number of monkeys and sufficient supply of pencils and paper.
    Granted the graphics are less impressive with crayons and flipping a pack of cards for animations, but we’re not actually talking about anything beyond doing an unbelievable number of calculations at very high speeds … ie they don’t actually add anything unique to the technology.
    The term “computer model” is merely a way of identifying the method by which calculations were done, probably because of the complexity and scale of the model (and the expense of maintaining sufficient numbers of monkeys), but nothing more; or am I seriously crazy on this one?

    [Response: I don't know if there's an "official" designation, but I interpret "computer models" to mean simulations of the operation of the laws of physics. It's certainly possible to compute them with pencil and paper, but the time required would be even more than prohibitive -- even with supercomputers the simulation runs take a very long time.]

  • David B. Benson // August 18, 2009 at 12:45 am | Reply

    Deep ocean residence times (from memory):
    Atlantic Ocean: ca. 250 years;
    Indian Ocean: ca. 150 years;
    Pacific Ocean: ca. 2000 years.
    However, this last (all from a fairly recent paper in J. Chem. Oceanography) cannot have properly taken the North Pacific into account as it appears much too large for what happens in the northern North Pacific.

    Anyway, in their comment on Schwartz’s paper, Knutti et al. have a figure using 500 years as the characteristic time for the deep ocean.

    The atmosphere only has a ‘memory’ of at most a few months, so I think using a one year characteristic time is too large. From what I am attempting to understand, the shallow ocean (down to the MLD) mixes in less than one year (I’m having a lot of trouble with this notion). The heat content of 2.5 meters of ocean is the same as that of the entire atmosphere; assuming an average MLD of 400 meters, that is a ratio of 160:1. I don’t know how fast the shallow ocean takes up or releases heat to the atmosphere.

    Of course, this is a multiple parameter estimation problem. Gradient descent might then find parameters which give a better fit. I use a variant of the direction set methods for my problem because the boundary conditions imply lack of first (and higher) derivatives at the boundaries. My reaction is that this method is rather slow (overnight on my 2.4 GHz machine), especially for six parameters to be estimated at the same time.

  • Ray Ladbury // August 18, 2009 at 1:22 am | Reply

    Computer models make easy targets for misinformation because they are a black box to laymen. So why shouldn’t they believe the nice man on the radio telling them comforting stories about how it’s all a hoax? Even this–refreshingly simple and clear as it is–is going to tax the patience of the average Survivor fan. We really need a book “Climate Change for Idjits”.

  • Didactylos // August 18, 2009 at 1:54 am | Reply

    The RealClimate folks seem to prefer the term “physics-based model” or “physical model”, meaning one that actually simulates physical processes. They would call Tamino’s work a statistical model. To me, “mathematical” and “computer” don’t help distinguish these different approaches.

    I was going to ask why you used SOI rather than MEI, but I see MEI is only available since 1949.

  • Joel Shore // August 18, 2009 at 2:29 am | Reply

    Didactylos,

    Not to get too hung up on terminology, but I personally would tend to call a model such as what tamino presents here a “phenomenological model” rather than a “statistical model”. This is because the model, while not treating the system in its full mechanistic details, does treat some of the basic phenomenology of energy balance and incorporates a differential equation to model the relaxation. By contrast, I would think of a purely statistical model as being, say, in a medical study where they simply regress a symptom like heart disease against all sorts of potential factors, such as smoking, fat intake, … without any real attempt to model even phenomenologically how the factors are causing the resulting disease.

  • Douglas Watts // August 18, 2009 at 3:06 am | Reply

    As an aside, geologists use models all of the time to test hypotheses regarding continental or ocean plate subduction. These models are constantly being “groundtruthed” by field geology because they make specific predictions about the specific type of mineral suite that would be expected by a certain temperature/pressure regime. Because the P/T conditions required to create certain mineral suites are quite well known, and in many cases, can be verified by laboratory tests, geologists can test a wide variety of formative models and throw out the ones that are not supported by field data. In general, the exercise follows the format of inquiry used here by tamino. As he shows well here, models are a great way to weed out weak models and determine which ones can withstand enough scrutiny to be worthy of further study.

  • Chad // August 18, 2009 at 3:07 am | Reply

    I wouldn’t think using MEI as a predictor variable would be wise. SOI is independent of temperature. It’s just a normalized pressure difference. Whereas MEI is based, in part, on SST and 2-m air temperature. Using MEI would be like sneaking in the dependent variable as an explanatory variable.

  • lucia // August 18, 2009 at 3:28 am | Reply

    Tamino–

    So there are no “fiddle” parameters at all except the time constants. For the rapid response, it hardly matters to the final answer as long as it’s kept short.

    I’m not sure I understand your response. You linked to this post containing a system of equations for a two box model. I’m cutting and pasting from the source:

    Each of these terms can be justified based on phenomenology; some values for alpha1, alpha2, F1, F2 and beta would not make sense, others would.
    Did you fit to these equations? Or not? If not, the following questions won’t make any sense, but I’d be interested to know what sort of equations you did fit since, in this case, “two box model” could mean any number of things.

    Now, assuming you used those equations, it’s pretty clear that two time constants alone are not sufficient to characterize the behavior of that general two box model. Unless you made some specific choices, we need to know the two time constants 1/alpha1 and 1/alpha2, the ratio of the heat capacities C1/C2, and the ratio beta/alpha1 (or alpha2, take your pick); plus we need to know how you partitioned the forcings between box 1 and 2, and whether you applied some constraint to the temperatures at any particular time. For example, maybe you set the temperatures to equilibrium values in 1880; that would make sense. But… did you do that?

    I don’t know what you did, but even given some set of values for the two time constants, the linkage between temperature in the boxes and the heat capacities will result in different solution for that system of equations.

    So, what did you do? Or, if you think you just got best fit value, what did you get for the parameters other than the time constants?

    [Response: Note that in that post, the response of a *one*-box model to a step-change in forcing is T = (F/alpha*C) (1-exp(-t/tau)). For a two-box model it would be a sum of two such terms. Given the value of F and a time series for T, and assuming a value for tau, one can estimate the coefficient (1/alpha*C) by regression.

    For a two-box model the solution for a single box is a sum of two such terms, proportional to the forcing, to physical constants, and to time-dependent terms which depend on the time constants. For a time-dependent forcing the terms (1-exp(-t/tau)) are replaced by integrals like the one in this post for the one-box model. There will be an equation (sum of two terms, one for each time constant) for the temperature in each box. We can compute the integrals (up to constant of proportionality) if we assume the value of the time constants.

    If we fit observed temperature to those integrals, then we can estimate the coefficients for the "box" we're modeling (in this case, surface temperature). The coefficients correspond to terms like (1/alpha*C) -- which certainly depend on physical properties, but we don't have to determine or estimate them to estimate the coefficients in the expansion, hence to model the temperature of a single box.]
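    In LaTeX form, the step response quoted above and the regression form it implies can be written roughly as follows (a reconstruction of the notation, not a verbatim formula from the post):

        T_{\mathrm{step}}(t) = \frac{F}{\alpha C}\left(1 - e^{-t/\tau}\right), \qquad
        T_{\mathrm{surf}}(t) \approx a\, F(t) + b \int_0^t F(t')\, e^{-(t - t')/\tau}\, dt'

    where a and b are the coefficients estimated by regressing observed surface temperature on the two series, and tau is the assumed 30-year time constant of the slow box.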

  • dhogaza // August 18, 2009 at 4:04 am | Reply

    Response: I don’t know if there’s an “official” designation, but I interpret “computer models” to mean simulations of the operation of the laws of physics. It’s certainly possible to compute them with pencil and paper, but the time required would be even more than prohibitive — even with supercomputers the simulation runs take a very long time.

    Tamino … the physics models used by the Manhattan project, while not computed pencil-and-paper, were computed by “computers”, which at the time meant technicians (many of them women) using mechanical calculating machines.

    Modern modeling such as is done in computer models follows very closely the modeling done by weapons physicists and engineers.

    The very basis – monte carlo modeling, with interpretation of many model runs, the basis for what we see today in climate models – came post-war (I think) when working on the hydrogen bomb. Von Neumann came up with the monte carlo method, while employed by the Manhattan Project or directly after at Los Alamos (postwar).

    I think this stuff is much older than you think, in terms of approach (details changing as more aggressive computer power has become available).

    And of course, that greedy desire for more computing power by the nuclear engineers fed Moore’s Law in the early years.

    I believe that a good response to the anti-modeling crap is to point directly at our nuclear engineering successes, where modeling results, even in the 1940s, done by mechanical calculators driven by human (mostly women) “computers”, showed that a simple gun design for a plutonium bomb could not work.

    We don’t test any more. We just run models. And, if you step back and look, they’re no different than the models used to make sure our nuclear arsenal works.

  • dhogaza // August 18, 2009 at 4:06 am | Reply

    Meant to say that *climate* models are, in essence, no different than those used to make sure our nuclear arsenal works.

    The physics in the latter is more tightly bounded, but the principle is identical.

  • Stephen Spencer // August 18, 2009 at 4:22 am | Reply

    Following up on Douglas Watts’s comment, another area where computer models are used is in astronomy, with the study of galaxies. From my lay perspective, it appears that many areas of science use computer models to probe the science.

    I would like to find an article describing the use of computer models generally in science, as another means of making the point to denialists that modeling is genuine science.

    I made a comment along these general lines on Deltoid this morning, and John Mashey referred me to a book written in 1993. Is there anything more recent?

  • Michael Tobis // August 18, 2009 at 5:42 am | Reply

    I have just been raving about this article. Nicely done!

    But Lucia’s confusion is understandable in the light of the linked post. It might help if you showed explicitly the equations you used as an appendix.

  • Eric L // August 18, 2009 at 5:57 am | Reply

    Correct me if I’m wrong, but your forcing data here includes 1) the greenhouse effect from water vapor, which increases with temperature, 2) the increase from reduced snow albedo, which also changes in response to temperature. So if you used this to estimate the climate response to anthropogenic GHG emissions, you would underestimate the response because you would miss the effects of these feedback mechanisms? Is there a simple way to correct for this? Does your sensitivity number include the effect of other feedbacks like clouds?

  • Dano // August 18, 2009 at 6:01 am | Reply

    This is an excellent use for scenario analysis.

    That is: the 4-quadrant Cartesian plane and the projection of 2 variables on that plane, in order (for this discussion) to elicit policy.

    This is a good extension of how folk do it today, and I’ll pass along. That is: it is good enough to judge the outcomes of policy in a ‘plan/check/do’ reiterative model.

    Thank you. This is an improvement.

    Best,

    D

  • jyyh // August 18, 2009 at 6:53 am | Reply

    Anyone who’s ever done a regression should be able to get this one, thank you, bookmarked.

  • jyyh // August 18, 2009 at 7:39 am | Reply

    The fit I once tried produced a reasonably good fit (for 1920 onwards) with just SOI, sunspots (with a 5.5-year lag, as though the rise in sunspots were a driving force), and CO2 concentration; however, the correlation breaks down prior to the 1920s, indicating there are other factors at play.

  • Mark // August 18, 2009 at 8:21 am | Reply

    “[Response: I don't know if there's an "official" designation, but I interpret "computer models" to mean simulations of the operation of the laws of physics.]”

    Computer games give a good demarcation.

    Flight sims to be specific.

    Flight Simulator from Microsoft uses a computer model, not a simulation. Most flight sims do, too.

    It has a long list of regimes of input and how the plane will act under them.

    It’s all lookup tables and fit to curve.

    The eternal attempt by denialists to get a non-log response to CO2 for temperature (and put in a reducing exponential) is an example. They want to fit the graph and want a maximum CO2 condensation effect (at around 500ppm) so that they can say “we might as well keep using fossil fuels because we are nearly at the maximum effect possible for CO2 as a warming agent”.

    There was a company called Looking Glass studios who produced another flight sim. It simulated the physics like a climate model does.

    It had tables on how the engine worked, what power you got out and (IIRC) even what the engine torque change on the airframe would be. But as to how this force change resulted in aircraft attitude change, it used the Navier-Stokes equations.

    Not a lot was left of the CPU to make for computer AI. And it wasn’t as “fun” as the other models based on tables of results.

    So their next flight sim was a model based on lookups.

  • P. Lewis // August 18, 2009 at 9:05 am | Reply

    We really need a book “Climate Change for Idjits”

    There is one (well, more than one actually), it’s Heaven and Earth: Global Warming — The Missing Science by Plimer.

    Now, The Complete Idiot’s Guide™ to Climate Change would be something different, and one probably needs to look no further than Spencer Weart’s Discovery of Global Warming on that score.

  • DavidCOG // August 18, 2009 at 9:25 am | Reply

    Superb analysis – this URL has found a home in my Denier Debunker armoury.

  • jyyh // August 18, 2009 at 11:41 am | Reply

    sorry, of course the offset was not 5.5 years but about 8.25 (3/4 of the solar cycle).

  • PI // August 18, 2009 at 12:18 pm | Reply

    Lucia:

    “I’ve fiddled a bit with these but I’ve always wished I had ocean temperature data to compare the temperature of both boxes against. (That is, in my mind, I assumed the slow box represents the temperature of an “deep ocean” box while the surface behavior is modeled by the “fast” box.)”

    Probably the easiest data to work with for this purpose (calibrating two-layer box models) is the integrated ocean heat content anomaly time series (e.g., Levitus, Domingues).

  • PI // August 18, 2009 at 12:27 pm | Reply

    Eric L:

    Tamino’s model has a generic “total feedback” factor which is supposed to encompass all the feedbacks in the system (water vapor, snow albedo, clouds, whatever). It’s not calculated from first principles but fit from the temperature response data. This assumes that all feedbacks respond directly to temperature and don’t change in time, and that all relevant feedbacks are active in the historical record.

    One way to improve Tamino’s analysis is to allow a separate multiplicative factor to quantify the aerosol forcing, which is highly uncertain due to the aerosol indirect effect. This was done in, e.g., Forest et al. (2002) in Science. (They also estimate the climate sensitivity and the ocean response time.) The indirect effect isn’t really a feedback to climate change, because it doesn’t act directly on temperature changes, but you can sort of think of it as an extra “feedback” factor that specifically modifies the aerosol forcing.

  • DavidK // August 18, 2009 at 2:13 pm | Reply

    Pity McLean, de Freitas & Carter couldn’t have done something like this before they mooned.

    Well done.

  • lucia // August 18, 2009 at 2:25 pm | Reply

    PI–

    Probably the easiest data to work with for this purpose (calibrating two-layer box models) is the integrated ocean heat content anomaly time series (e.g., Levitus, Domingues).

    Yes. Using Levitus would be useful if we wanted to really do a real two box model.

    [Response: Let's be perfectly clear: the exact solution to the two-box model is of the form which is input to the regression in these analyses. So it's a *real* two-box model.]

  • Timothy Chase // August 18, 2009 at 3:09 pm | Reply

    DavidCOG wrote:

    Superb analysis – this URL has found a home in my Denier Debunker armoury.

    You know — that is a really nice avatar.

  • Timothy Chase // August 18, 2009 at 3:24 pm | Reply

    Ray Ladbury wrote:

    Computer models make easy targets for misinformation because they are a black box to laymen. So why shouldn’t they believe the nice man on the radio telling them comforting stories about how it’s all a hoax? Even this–refreshingly simple and clear as it is–is going to tax the patience of the average Survivor fan. We really need a book “Climate Change for Idjits”.

    Well, I believe greenfyre has more or less hit the nail on the head as far as that is concerned: computer models perform exactly the same calculations humans would, only far faster. And they won’t forget to carry a digit or misplace a decimal place.

    Quite literally, they are embodiments of our knowledge of physics — in all likelihood more than it is possible for any one human to understand — and the mathematical calculations that result from that knowledge of physics — carried out to a degree that far surpasses what all of humanity could achieve in any remotely comparable period of time. They are the best estimates that science has to offer.

  • PI // August 18, 2009 at 5:02 pm | Reply

    Tamino,

    “I don’t know if there’s an `official’ designation, but I interpret `computer models’ to mean simulations of the operation of the laws of physics.”

    By that standard your two-box energy balance model is a computer model. Energy balance is a law of physics. All “physical models” are really approximations since they can never truly simulate everything from first principles. An AOGCM doesn’t simulate molecular dynamics, or any sub-grid scale physics. At best, it parameterizes it.

    Perhaps the distinction you’re looking for in this case is between circulation models that simulate fluid dynamics and simpler models which don’t have fluid dynamics. If you want to call GCMs “computer models” and simpler models “not computer models”, you can, but I’m not sure how useful the distinction is.

    In the end, all we have in science are models (and the data to compare them to). Some models are simpler than others. I think your real point is that when it comes to the attribution question of AGW, even simple models capture most of the relevant behavior: it’s not dependent on detailed physical assumptions.

  • george // August 18, 2009 at 5:20 pm | Reply

    Tamino:

    What’s the climate sensitivity if you run the latter analysis (forcings + SOI) just on the data from 1975-present?

  • lucia // August 18, 2009 at 5:34 pm | Reply

    Tamino–
    I get that this is an eigenvalue problem. I get that you can do a curve fit with two time constants and solve for them. I get that the two time constants are insufficient to specifically determine something like the heat transfer coefficient.

    What I don’t understand is whether you tested to see if the best fit values for the time constants you got map into a set of physically realistic values for the parameters (alpha1, alpha2, beta1, beta2, F etc.) for a two box model in which each box could conceivably represent some portion of the earth’s climate system.

    Basically: have you done anything to check that your specific solution for the values of the two time constants maps into a space that does not violate the second law of thermodynamics? Of those that don’t violate the 2nd law, does a subset have the box that might be thought of as “the ocean” with a much larger heat capacity than the atmosphere? Of those that are ok on both those counts…. etc.

    There are a number of questions that could be asked, and they can only be answered if you were to try to map your time constants back to the values of α and β while scrutinizing the solution to see whether those two time constants map back into a set of physically realistic two-box models.

    I have no particular reason to believe your results cannot be made to map into physically realistic space. However, it would be useful to know. Because if, by chance, the values you obtained do not map back into any physically realistic space, what we have here is an interesting math problem with a mathematical two box model. However, it would not be a real two box model, because real two box models must not, for example, violate the 2nd law of thermo, and should not have an utterly implausible collection of values for the parameters.

    [Response: Either you know perfectly well that these solutions are not unphysical, or ...

    You just like to create doubt. That seems to be a recurrent theme for you, from your very first interactions here to your attempts to "falsify" IPCC projections based on a dishonest claim about what the IPCC projections are.]

  • Eric L // August 18, 2009 at 5:38 pm | Reply

    PI,

    Tamino’s forcings include snow albedo and stratospheric H2O. If you look at the raw data for these they have trended upward over the past century, almost certainly as a result of temperature changes. If Tamino is treating these feedbacks differently than GHGs and aerosols and solar activity, that hasn’t been made clear. Likewise, there is H2O in the troposphere, which I assume is included in GHGs, but I might be wrong about that.

  • Mark // August 18, 2009 at 5:47 pm | Reply

    “I think your real point is that when it comes to the attribution question of AGW, even simple models capture most of the relevant behavior: it’s not dependent on detailed physical assumptions.”

    Bingo.

    And the computer models aren’t doing anything mystical. They’re doing exactly the same thing that Newton did to work out the equation for the force of gravity. Or what Kepler did with Brahe’s data to find that the orbits of the planets were ellipses with the sun at one focus.

  • Douglas Watts // August 18, 2009 at 6:21 pm | Reply

    The “take home point” that I receive from Tamino’s example here is the many meanings people attach to the word “computer.” An abacus is a computer. Learning your times tables by rote is basically programming your mind to be a “computer.” A computer is “that which computes,” or more accurately, a person or device which accepts a set of inputs, performs a highly defined series of operations on the input, and in strict accordance to the input and operations conducted upon it, produces an output. All algorithms which attempt to mimic natural behaviors are “models” and algorithms are what “computers” use to convert input data to output data. What I like about Tamino’s presentation is how he shows by very simple methods that there is no way to fit the existing data to any model if the documented effects of humans are ignored. An interesting corollary would be if someone were to deny that volcanoes have any effect. Try fitting those Mount Pinatubo spikes into an explanation which says volcanoes have no effect. This, if I am not wrong, is the basic point Tamino is trying to make. As Henry Thoreau said, “sometimes circumstantial evidence is very strong, as when you find a trout in the milk.”

  • DrC // August 18, 2009 at 6:51 pm | Reply

    This is a bankable post. Thanks. When I teach about modeling I differentiate between conceptual models, statistical models, and process models and then describe all three in terms of climate. This is a great example of how a conceptual model, planetary physics, informs statistical and process models. They all come together beautifully!

  • Michael hauber // August 18, 2009 at 11:32 pm | Reply

    Can this model test the idea that the response to solar forcing may have a climate sensitivity different from that of other climate forcings (e.g. the Svensmark cosmic ray/cloud link)?

    I assume that if you allow the sensitivity to solar forcing to be different you could get a better fit as the worst possible case would be the same sensitivity for solar and the same fit. Can you say anything about whether any improvement in fit is in some way statistically significant, or only what you would expect from having one extra tuning knob in the model?

  • David B. Benson // August 19, 2009 at 12:07 am | Reply

    Michael hauber // August 18, 2009 at 11:32 pm — Tung & Camp (2008) determine the response of global temperature to solar variations using 50+ years of data. In the appendix a ‘model’ is proposed which requires no exotic aspects whatsoever.

  • David B. Benson // August 19, 2009 at 12:49 am | Reply

    Michael Tobis applauds this thread:
    http://initforthegold.blogspot.com/2009/08/tamino-rocks.html

  • Dan Satterfield // August 19, 2009 at 1:25 am | Reply

    Absolutely excellent- Will link it on my blog as well.

  • Chris Colose // August 19, 2009 at 3:02 am | Reply

    Tamino, please correct any misunderstanding I may have here:

    It’s generally well accepted that using the 20th century to put constraints on climate sensitivity is not a good idea, for two large reasons: 1) The climate is not currently at equilibrium; 2) The 20th century forcing is not known to good accuracy. In fact, the anthropogenic forcing uncertainty band given in the AR4 is a large 0.6 to 2.4 W m^-2, mostly due to aerosol uncertainties (both the direct effect and the secondary effects on clouds). The time-evolution of forcings is also not well understood for non-GHGs, a problem of uncertainty not captured in your figure 1 or the analysis in general.

    [Response: The fact that climate is not in equilibrium doesn't invalidate sensitivity estimates from this model; if the model and forcing data are correct this analysis will give the right answer. Uncertainty in forcing estimates does of course create uncertainty in estimates using this model. And there's also the fact that this model is really too simple; I think GCMs give the best sensitivity estimates.]

  • David Horton // August 19, 2009 at 5:36 am | Reply

    Good stuff, as always Tamino. I keep niggling away, here http://www.blognow.com.au/mrpickwick/Climate_change/, trying to understand the denialist mindset, a search that leads to depression and despair. I think you are too logical and too kind to them in your opening paragraph. What you are showing is that simple calculations and graphical representations, not incomprehensible calculations in the black box of a super computer, are enough to demonstrate the real world of GHG-induced global warming. Excellent, and beautifully illustrated, but of absolutely no interest to the denialist. What they (including, I think, Plimer, see http://www.intellectualactivist.com/php-bin/news/showArticle.php?id=1120) seem to believe is that the WHOLE of the AGW theory is generated from computer programs, inside computers. That is, it is all just a mathematical construct which can be accepted or not as a matter of faith. The idea that there is real world data out there that runs in parallel to the regression lines escapes them. To the denialist there is no planet on which ice is melting, animal and plant distributions are changing, oceans acidifying, storms increasing, droughts worsening. These things are just weather, and bear no relation to climate change. Climate change, conversely, is just an imaginary computer construct. I find it difficult to believe two contradictory things before breakfast, but in the world of denialism that is a doddle.

  • Gavin's Pussycat // August 19, 2009 at 9:21 am | Reply

    Tamino,

    why didn’t you include the number of pirates as an independent variable for regression? Obviously your analysis is incomplete without it.

  • Mark // August 19, 2009 at 9:50 am | Reply

    “Try fitting those Mount Pinatubo spikes into an explanation which says volcanoes have no effect. ”

    But Marcus and Dhog have said that the CO2 went *down*.

    Therefore adding Pinatubo in would be different from adding CO2 in, since just the volcano adds CO2.

  • Barton Paul Levenson // August 19, 2009 at 10:47 am | Reply

    Lucia writes:

    have you done anything to check that your specific solution for the values of the two time constants maps into a space that does not violate the second law of thermodynamics?

    Mention of the 2LOT is usually a red flag for a posting by a pseudoscientist. The creationists started it but the AGW deniers have picked it up big-time. See, e.g., here:

    http://BartonPaulLevenson.com/JJandJ.html

  • PI // August 19, 2009 at 11:24 am | Reply

    Chris,

    “It’s generally well accepted that using the 20th century to put contraints on climate sensitivity is not a good idea”

    I don’t think that’s “generally well accepted” at all, and there has been a fair amount of effort in this direction (see IPCC AR4 WG1 9.6 and the Nature Geosci. review by Knutti and Hegerl).

    As Tamino says, the climate doesn’t have to be in equilibrium to estimate an equilibrium climate sensitivity. You’re really estimating a feedback factor and using that to extrapolate the equilibrium behavior.

    (How useful “equilibrium climate sensitivity” is as a concept, when the climate is never in perfect equilibrium, is somewhat more debatable.)

    As for there being large uncertainties, sure there are. But not really larger than the uncertainties you get using any other method (paleoclimate estimates or first principles calculation using GCMs).

    I think the “20th century climate sensitivity constraints aren’t useful” idea mainly comes from GCM modelers who trust their models more. But the observational estimates make fewer structural assumptions than the GCM estimates, so they’re a useful independent check. (e.g., if the GCMs left out or misquantified some big feedback, an observational estimate might be able to notice). And the answer you get is pretty much the same range as the GCMs, so I don’t understand why they’re supposed to be so much worse.

    As Knutti and Hegerl conclude, “The well-constrained lower limit of climate sensitivity and the transient rate of warming already provide useful information for policy makers. But the upper limit of climate sensitivity will be more difficult to quantify.”

  • lucia // August 19, 2009 at 12:45 pm | Reply

    [edit]

    Look. Have you checked? I know that it is possible for those curve fits to experimental data to map into something unphysical, that violates the 2nd law of thermo, or does other things that would not make any sense.

    [edit]

    [Response: Yes I checked. You didn't. I guess doing the work to find out before shooting your mouth off would get in the way of your modus operandi: FUD. Barton is right, you just mentioned the 2nd law of thermodynamics because it's such a popular way to confuse the ignorati.]

  • Mark // August 19, 2009 at 1:01 pm | Reply

    Another point for PI’s message above is the query: would a different sensitivity make the temperature explainable without human caused CO2?

    I don’t think so, do you?

    So if not, does it REALLY matter if the sensitivity is wrong? Whatever it is, it isn’t going to be orders of magnitude different (or different in sign) and that’s the only way you can figure out the records without human CO2 included.

  • Mark // August 19, 2009 at 1:02 pm | Reply

    “why didn’t you include the number of pirates as an independent variable for regression? Obviously your analysis is incomplete without it.”

    Hey, I’m seeding as much as I can on BitTorrent.

    It’s slowed down the temperature somewhat, but we need more pirates…

  • Ray Ladbury // August 19, 2009 at 1:59 pm | Reply

    Lucia,
    Based on my understanding, the one-year and 30-year responses would tend to be physical. Indeed, it would seem that they would also tend to impose realistic sizes for the 2 heat reservoirs. Can you elaborate on your concerns with the 2nd law, as I don’t see how that would be a serious concern with this model (e.g. heat will flow from the warm to the cold reservoir, regardless of whether the warm one is the large or small one)?

  • Slioch // August 19, 2009 at 5:50 pm | Reply

    Barton
    – just to point out a typo in
    http://bartonpaullevenson.com/JJandJ.html

    First sentence should be “If the Earth’s surface had the same albedo (reflectivity) as the present Earth WITHOUT an atmosphere…”

    Cheers

  • clazy // August 19, 2009 at 7:54 pm | Reply

    Tamino,

    This discussion was interesting until you insulted Lucia. She asked whether you’d done a certain test, and you insulted her and said it wasn’t necessary. When she asked you the question again, you insulted her again, but you did at least answer the question, saying you’d done the test. So I’m puzzled. Was the test necessary or not? If not, why did you do it?

    [Response: Lucia has a history. The first time she appeared here she tried very hard, repeatedly, to show that one of my posts was seriously in error. She was wrong. The reason behind her false criticism was that she hadn't bothered to read the post with anything like the level of attention she devotes to finding fault. That's what she does.

    She has also devoted an immense amount of time on her own blog trying to "falsify" the IPCC projections, utterly failing even though her attempts are based on a dishonest statement about what the IPCC projections are. Downright dishonest. So I have a nickname for her: the falsifier.

    When she shows up it's usually for the purpose of nit-picking and/or fault-finding. She's not very good at it. It so happens that in the models shown here, as long as both time constants are positive and both regression coefficients turn out to be positive (which for the main model they do), they don't violate basic physics (including the 2nd law of thermodynamics). I knew that when I started. Note that for the models with negative coefficients I stated "Of course this model is nonsense."

    Lucia appears to have the skill to figure this out. But rather than do the work, she prefers to come here and plant that idiotic "violates the 2nd law of thermodynamics" meme that we've all heard a thousand times from more denialist idiots than the planet has room for. She's a petulant child, one who won't be commenting here again.]

  • Deep Climate // August 19, 2009 at 9:59 pm | Reply

    The fit of the model is quite remarkable for the recent period and appears to degrade somewhat as one goes back in time. Is this because forcings and/or temperature observations are known with greater certainty than for past periods, or because of a clearer GHG signal? Or some combination of factors?

    Great post as always …

  • naught101 // August 20, 2009 at 2:54 am | Reply

    When more energy flows through the system the planet heats up; with less energy flow the planet cools down

    Wouldn’t it be more accurate to say that “if the energy inflow increases relative to the outflow, the planet heats up”?

    [Response: Yes.]

  • george // August 20, 2009 at 3:33 am | Reply

    Lucia says

    I know that it is possible for those curve fits to experimental data to map into something unphysical, that violates the 2nd law of thermo, or does other things that would not make any sense.

    Like an apparent flat or even negative “trend” in global temperature since 2001 given the fact that the yearly increase of atmospheric CO2 is actually larger today than the average rate over the past 3 decades during which global temperature increased at nearly 0.2 C per decade?

  • Nathan // August 20, 2009 at 5:01 am | Reply

    George

    Are you suggesting that the flat ‘trend’ since 2001 doesn’t make sense?

    I would suggest that it doesn’t make sense to you, because for some reason you expect the temp to go up in step with CO2. That is a pretty nonsensical assumption.

  • Mark // August 20, 2009 at 10:12 am | Reply

    “Like an apparent flat or even negative “trend” in global temperature since 2001 ”

    Uh, you mean a POSITIVE trend since 2001, george.

    It’s only negative if you run your eye over it where your brain expects it to continue up.

    And we have less solar radiation because we’re at a solar minimum. We’ve got a strong La Niña cooling in there after a strong El Niño warming just before.

    So those would explain why the graph isn’t going up quite so fast as it was without those changes.

    Now go back, george, and see what the temperature was the last time we had a strong ENSO effect AND a solar minimum.

    What was the temperature?

    Lets compare apples to apples, hmm?

    PS don’t forget: it’s a POSITIVE gradient since 2001 and the last 10 years is on average 0.17C warmer than the 10 years previous to that.

    Doesn’t sound like cooling, does it…

    • KenM // August 21, 2009 at 1:40 am | Reply

      Hey Mark, Help me understand your comment re: positive trend since 2001.
      I see -.01 per decade for GISS since 2001 and -.15 for RSS over the same period.

      [Response: Using just the GISS data since 2001, I get -0.01 +/- 0.26 deg.C/decade. Note that the 95% confidence interval easily includes the estimated trend since 1975, so there's no evidence of any change in the prevailing trend since that time. Same deal for RSS: -0.16 +/- 0.37. The probable error is so large that trying to estimate the trend from such a short time span is futile; you might as well try to estimate the global warming trend using just the data since last Tuesday.]
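      A minimal sketch of how such a short-span trend and confidence interval might be computed (annual anomalies assumed; this simple version ignores autocorrelation, which widens the true interval):

          import numpy as np
          from scipy import stats

          def trend_with_ci(years, anomalies, conf=0.95):
              """OLS trend in deg C per decade, with a confidence interval.

              Ignores autocorrelation, so for climate data the real
              uncertainty is larger than the half-width returned here.
              """
              fit = stats.linregress(years, anomalies)
              tcrit = stats.t.ppf(0.5 + conf / 2, len(years) - 2)
              return fit.slope * 10, tcrit * fit.stderr * 10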

      • KenM // August 21, 2009 at 5:46 am

        Thanks Tamino – I understand your point – I thought maybe there was more to it. Can’t say I’d seen anyone declare a positive trend since 2001 before.

  • george // August 20, 2009 at 11:42 am | Reply

    Mark:

    Like temperature trends, comments are sometimes not what they appear to be on their face.

  • P. Lewis // August 20, 2009 at 1:20 pm | Reply

    I found george’s scornfulness quite sensical.

    “Like” what’s sauce for the goose is sauce for the gander almost.

  • Mark // August 20, 2009 at 1:23 pm | Reply

    george: then illuminate your comment so that it may be seen for what it is, rather than inferred into what it isn’t by the shadows it creates.

  • Hank Roberts // August 20, 2009 at 4:51 pm | Reply

    > more accurate to say that “if the energy inflow increases relative to the outflow, the planet heats up”?

    > [Response: Yes.]

    Although for present circumstances, energy reaching the planet from the sun hasn’t changed; it’s the rate of outflow that’s been suppressed by increasing greenhouse gases.
    That’s #1:
    http://www.skepticalscience.com/solar-activity-sunspots-global-warming.htm

  • chriscolose // August 20, 2009 at 6:06 pm | Reply

    PI,

    I agree mostly with you, and I agree that the AR4 range of 2 to 4.5 C is most plausible. The paleoclimate record clearly cannot support the very low sensitivity argued for by Lindzen, Spencer, etc. (or presumably a very high one, but the mathematics of feedbacks is such that higher values are more difficult to rule out).

    My main point was that the uncertainty in total forcing provided by aerosols (and to a lesser extent, other non-GHG factors like land use, solar, etc) means that the total net forcing over the 20th century has a huge spread, and what’s more, the remaining “heating in the pipeline” is not independent of the sensitivity. So while researchers have looked at the 20th century (and individual events like the response to Pinatubo) to constrain sensitivity, it is quite limited in practice and for models such as this one.

  • theguppy // August 21, 2009 at 10:38 am | Reply

    Lucia appears to have the skill to figure this out. But rather than do the work, she prefers to come here and plant that idiotic “violates the 2nd law of thermodynamics” meme that we’ve all heard a thousand times from more denialist idiots than the planet has room for. She’s a petulant child, one who won’t be commenting here again.

    That’s a shame, IMHO, because posting here takes her out of her comfort zone, and shows her up for what she is.

  • wagdog // August 21, 2009 at 1:16 pm | Reply

    I’d consider this work to be a climate model also. It is not, however, a finite-element simulation like the ones Gavin Schmidt runs, but it is still a physically based model.

    What is strange is when denialists flock to computer models that confirm their beliefs, like global warming being caused by sun spots interacting with cosmic rays. In such a case the models have a very tenuous relationship to the physical world, often going no further than suggesting that there might be a correlation that warrants further investigation. By no means is there a well developed modelling of all the physical mechanisms linking sun spots -> cosmic rays -> cloud formation -> global warming, that can be coded up and run on a computer as a simulation. Exactly how tenuous is this link? See http://www.pik-potsdam.de/~stefan/Publications/Journals/rahmstorf_etal_eos_2004.pdf

    By contrast, the physics-based link between CO2 and global temperature is very well modelled on computers.

    For examples of non-simulation non-physical models see this parody on RealClimate:
    http://www.realclimate.org/index.php/archives/2007/05/fun-with-correlations/
    When a model is not limited to physical laws, one can predict anything. The extreme case of this is the Bible Code: http://www.awitness.org/essays/bibcode.html

  • george // August 21, 2009 at 3:32 pm | Reply

    Tamino says

    [Lucia ] won’t be commenting here again.

    I agree with theguppy that this is too bad.[first time I ever agreed with a guppy on anything]

    It’s probably the one place where Lucia actually gets challenged when she makes bullshit comments and where her silly word games won’t work.

    Tamino: I think the most effective way to counter the BS is simply to do what you (and Barton) did above. Point out that the “possible violation of the second law of thermodynamics” is usually a ploy used to create doubt and means absolutely nothing in and of itself. (ie, unless one can actually show that something does violate the second law, saying that it “might” is really an empty statement. I can say that flying saucers from Alpha Centauri “might” be real, but if I can provide no evidence, it’s a vacuous statement)

    Of course, lots and lots of mathematical results have little if anything to do with reality. That’s hardly a profound observation. That’s why we have physicists, chemists, biologists, climate scientists, etc to do reality checks on the mathematicians!
    (no offense Tamino, but you strike me as more of a physicist than pure mathematician)

  • Mark // August 21, 2009 at 4:30 pm | Reply

    “It’s probably the one place where Lucia actually gets challenged when she makes bullshit comments and where her silly word games won’t work.”

    But no improvement is seen in the patient.

    And no matter how patient the people correcting lucia are, letting her continue to get airtime is a waste of everyone’s time.

    Since the one putting in the time here is the blog owner, and Lucia has been given a lot of chances to show some porosity to new ideas but has failed to show any retention of facts counter to a denial-of-AGW position, it’s fairly reasonable to give the plank the boot.

  • george // August 21, 2009 at 5:47 pm | Reply

    Mark,

    I don’t actually expect that Lucia will change her tune due to what Tamino or anyone else says here (or anywhere else, for that matter).

    That actually has nothing to do with why I think Tamino should continue to allow her to comment. Changing her mind is not the point.

    Sometimes the most illuminating/educational thing to do (for others) is to allow someone to express their contrary views in an environment where they do not set the “tone” of the conversation and “control the argument”.

    She has been making her claims (eg, the “IPCC projections Falsified” one) to largely “receptive” audiences primarily on her own blog (and on Climate Audit, for example) and while some (Gavin Schmidt, Tamino and others) have addressed the central problems with her claims, it has been largely “once removed” and she is free to “rebut” their posts on her own blog at her leisure and essentially free from any significant challenge (certainly free from any devastating challenge).

    That’s not to say that no one challenges her there. Some may do so, but critically, she controls who gets through and what gets allowed. This makes it much easier for her to continue the BS than if she actually had to “face the music” (here, for example.)

    Finally, I would put Lucia in a totally different class than someone like Anthony Watts (who IS a waste of time, in my opinion and for whom the best response is simply derision).

    Lucia actually has a technical/mathematical background, which makes it all the more important to point out where her “arguments” break down and where she is simply using obfuscation.
    .

  • Gavin's Pussycat // August 21, 2009 at 7:20 pm | Reply

    Actually I also agree someone with theguppy. Refute, don’t suppress (though it gets tiresome the seventeenth time around).

    george, Lucia is actually in _precisely_ the same class as Watts, but more science-smart and thus more dangerous. Refuting Watts is fish in a barrel.

  • Gavin's Pussycat // August 21, 2009 at 7:21 pm | Reply

    s/someone/somewhat/

  • t_p_hamilton // August 21, 2009 at 8:06 pm | Reply

    I just wonder if the Lucias of this world will apologize in 10-20 years for their role in delaying action.

  • erlhapp // August 22, 2009 at 2:01 pm | Reply

    From the post:

    “Changes in the energy flow constitute climate forcings. We know of many, including greenhouse gases, solar changes, ozone, snow albedo, land use, aerosols (both from volcanoes and from industrial processes), etc.”
    Tamino, you missed one and it happens to be the most important one. See: http://climatechange1.wordpress.com/

    And here is the bit that particularly pertains to you:

    “The atmosphere is not amenable to modeling that treats the globe as a closed system. Our understanding of atmospheric processes is elementary. Mathematicians who do not appreciate that the basic parameters driving climate are externally imposed and forever changing, are a hindrance to progress and best employed elsewhere. ”

    Sorry to be so aggressive but the sanctimonious crap gets to me.

    The debate as to whether people should be allowed to comment sets the tone.

    Who was it who said “I don’t like what you say but I will defend to the death your right to say it”?

    That my friends is the true spirit of democratic debate and you should buy it.

    [Response: You're as full of it as McLean, de Freitas, and Carter. Your sanctimonious crap is a pollutant to the discussion. I'm not the least bit sorry to be so aggressive.

    It's one thing to claim the right to say what you want. It's quite another to insist that you have the right to spout your deceitful stupidity in my house.

    Get back in your crib and let the adults converse uninterrupted.]

  • Ray Ladbury // August 22, 2009 at 3:50 pm | Reply

    Erl, Tamino has discussed ENSO in detail. ENSO, however, is not a forcing as it doesn’t have a trend. I would suggest that a competent person would realize this, but there is some question about whether you would be included in this group.

  • Gavin's Pussycat // August 22, 2009 at 4:39 pm | Reply

    I just wonder if the Lucias of this world will apologize in 10-20 years for their role in delaying action.

    Nah… they’ll find a way to blame the scientists. I mean, we were the ones that created electric light and appliances, cars, aircraft etc., right? And if disaster somehow can be averted, the story will be that there wasn’t any problem anyway to begin with, and all the alarmism was for nothing, typical environazi crap. Cf. the ozone hole.

  • TAG // August 24, 2009 at 4:00 pm | Reply

    I mean, we were the ones that created electric light and appliances, cars, aircraft etc., right

    As a matter of historical record, neither the Wright brothers (aircraft) nor Thomas Edison (light bulb and appliances) were scientists.

    The Wright brothers did their own research into aerodynamics. They attended a professional conference once and found that the professional scientists were far behind them and could offer nothing of use to the development of aircraft.

  • David L. Hagen // August 24, 2009 at 8:43 pm | Reply

    Tamino
    You may wish to cite the two parameter model published by Nicola Scafetta (2008):

    N. Scafetta, “Comment on ‘Heat capacity, time constant, and sensitivity of Earth’s climate system’ by Schwartz,” J. Geophys. Res., 113, D15104, doi:10.1029/2007JD009586 (2008). PDF

    Here he finds tau1 = 0.4 years and tau2 = 12 years.

    Scafetta has now published an update comparing two slower time parameters with the fast parameter model:

    N. Scafetta, “Empirical analysis of the solar contribution to global mean air surface temperature change,” Journal of Atmospheric and Solar-Terrestrial Physics (2009), doi:10.1016/j.jastp.2009.07.007. PDF

    The solar contribution to global mean air surface temperature change is analyzed by using an empirical bi-scale climate model characterized by both fast and slow characteristic time responses to solar forcing: tau1 = 0.47 ± 0.1 yr, and tau2 = 8 ± 2 yr or tau2 = 12 ± 3 yr.

    We look forward to your comparisons of your models with those of Scafetta.
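
    For anyone who wants to experiment with this kind of model, a minimal sketch of a two-time-constant (“bi-scale”, or two-box) response calculation is given below. The time constants, sensitivities and toy forcing series are placeholders chosen purely for illustration — they are not Scafetta’s fitted values, and this is not Tamino’s code.

        import numpy as np

        def two_box_response(forcing, tau1=1.0, tau2=30.0, s1=0.1, s2=0.2, dt=1.0):
            """Temperature anomaly from a two-time-constant ("two-box") model.

            Each box relaxes toward its equilibrium response s_i * F with time
            constant tau_i (years).  The update uses the exact exponential decay
            over one step, so it stays stable even when dt > tau_i.
            All parameter values here are illustrative placeholders.
            """
            a1 = 1.0 - np.exp(-dt / tau1)
            a2 = 1.0 - np.exp(-dt / tau2)
            T1 = T2 = 0.0
            response = []
            for F in forcing:
                T1 += a1 * (s1 * F - T1)   # fast box (atmosphere-like)
                T2 += a2 * (s2 * F - T2)   # slow box (ocean-like)
                response.append(T1 + T2)
            return np.array(response)

        # Toy usage: a step forcing of 1 W/m^2 switched on at year 20 of a 124-year run
        forcing = np.where(np.arange(124) >= 20, 1.0, 0.0)
        temperature = two_box_response(forcing)

    With a real forcing series (the GISS data, say) one would fit tau1, tau2, s1 and s2 to the observed temperature record rather than assume them.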

  • Mark // August 25, 2009 at 10:28 am | Reply

    “As a matter of historical record, neither the Wright brothers (aircraft) nor Thomas Edison (light bulb and appliances) were scientists.”

    TAG, also note that they didn’t add to the science. All they did was solve the ENGINEERING problems.

    Incandescence is ancient knowledge and temperature-dependence of resistance was also well known.

    The only problem (and it’s an ENGINEERING one, not a science one) is how to stop the resistor burning in the atmosphere.

    Similarly the weight problems: the Wright Brothers were SECOND. An Italian (IIRC, could have been a Scot) invented it first, including the tension wire wheel (bicycle wheel) that was light enough to lift but strong enough to work. All he was missing was a power source.

    That required the invention of the ICE.

    Which likewise was an engineering problem, not a science one.

  • Kevin McKinney // August 25, 2009 at 8:09 pm | Reply

    And the Wright brothers learned quite a lot from the experiments of Samuel Langley, who was a scientist. (Astronomer, actually.)

    More on this when I publish my little article on Arrhenius (there’s a link there which some here will already be aware of!)

  • Eli Rabett // August 26, 2009 at 1:42 pm | Reply

    The Wright brothers were far more sophisticated than some of the comments above indicate.

  • Eli Rabett // August 26, 2009 at 3:19 pm | Reply

    FWIW
    http://www.solarnavigator.net/inventors/wright_brothers_wind_tunnel.htm

  • Mark // August 26, 2009 at 4:03 pm | Reply

    They were sophisticated engineers solving a difficult engineering problem.

    When I get back to my library, I’ll get the name of the bloke who solved the problems before the Wright Bros.

  • Geoff Wexler // August 27, 2009 at 2:57 pm | Reply

    Interesting piece of work. Why not publish it?

    Given that he suggested that you (Tamino) would be better employed elsewhere, I think the man from the vineyards has been treated quite generously, since he has been allowed to advertise his web site on this thread. Here is what Barry Brook thinks:

    http://bravenewclimate.com/2008/11/11/response-to-a-wine-industry-climate-change-skeptic/

    Incidentally, who would have thought that any system with incoming and outgoing energy was being treated as “closed”?

  • Richard Treadgold // August 30, 2009 at 11:44 pm | Reply

    [edit]

    [Response: It's too bad that your "climate conversation" is nothing of the kind; it's just another source of denialist claptrap. A sure sign is your banner which reads "Apparently we destroy the climate with our carbon dioxide, which makes up only 0.00008 of the atmosphere. But the lunacy is in believing that such a trace amount might dominate the climate."

    I enforce a "stupid threshold" for this blog. You didn't make the cut. Feel free to comment about how mean and unfair I am -- on your blog. Rest assured your comments will not appear here.

    P.S. Try to learn the difference between "forcing" and "feedback."]

  • Andy Pandy // September 22, 2009 at 6:23 am | Reply

    Mr T,

    Do you have a list of the output of your 2-box model available? A CSV file would be good. I’m just mucking around testing some models and it would be good to include yours.

    Thanks

  • dhogaza // September 24, 2009 at 3:30 am | Reply

    I still haven’t received a reply — is there something wrong with my request?

    Tamino posted recently that he’d suffered a serious wrist injury and would not be typing much.

    That could be a reason.

  • Andy Pandy // September 28, 2009 at 8:39 am | Reply

    Sad to hear about Mr T. Hopefully when he can tap again he’ll post the output in an easy-to-use form.

  • Andy Pandy // October 9, 2009 at 1:15 am | Reply

    Mr T,

    Glad to see you back and blogging — any chance of getting the model data?

  • Andy Pandy // October 29, 2009 at 4:55 am | Reply

    Mr T — how’s the wrist? Any chance of posting that data?

  • Hank Roberts // October 29, 2009 at 2:40 pm | Reply

    Isn’t that pandy’s fourth strike?
    By now, he could post his own for scrutiny.

  • Andy Pandy // October 30, 2009 at 2:31 am | Reply

    Hank — I just want to see the data; I have no intention of upsetting anybody. If Mr T doesn’t want to give me the data he can let me know and I’ll go back to lurking. I was just given the impression that the reason I hadn’t seen any data was Mr T’s wrist injury — is there any other reason?

  • Hank Roberts // November 3, 2009 at 9:15 am | Reply

    You’re asking _me_? I’m just a reader here.
    Could be you’re not high enough up on his priorities list yet, and perhaps if you were to post your own work and show something you know, you’d be more interesting. Or maybe not, I can’t tell. Lots of people bother scientists for their work. They tend to help those who’ve shown they know what to do with the material, seems to me. But like I said, I’m nobody, just a reader here. You asked, that’s my opinion.

  • Andy Pandy // November 3, 2009 at 11:13 pm | Reply

    Hank I was just answering your comment. I’m sure Mr T is busy but I can’t post work on his model until I have the data. The graphs are useful but if I try to estimate the data from them I’m sure to make a mistake. Maybe I am too insignificant (yes that is a stats joke) and too low on his priority list as you suggest but rather than take your guess for it I await Mr T’s response.

  • ABG // November 9, 2009 at 10:17 pm | Reply

    Excellent demonstration Tamino – I plan to use it as an example. Would you be willing to post the code for these analyses (although it seems that I could probably figure it out myself)? Was this done in Matlab or R?

    Cheers

  • ZT // December 19, 2009 at 8:34 pm | Reply

    Any predictions for the future using the best available model?
    A thirty year prediction would be really fascinating.

  • dhogaza // December 19, 2009 at 9:59 pm | Reply

    ZT’s been trolling Real Climate, I suggest ignoring him.

    Though my own 30 year prediction is that 30 years from now, ZT will still be denying science.

  • David B. Benson // December 19, 2009 at 10:11 pm | Reply

    ZT // December 19, 2009 at 8:34 pm — The problem with predicting is knowing the emission scenario. But what the h**l, just assume as much warming as in the past thirty years. Then check Mark Lynas’s “Six Degrees” to obtain some sort of estimate of just how hellish that will be…

  • ZT // December 20, 2009 at 12:24 am | Reply

    Is there a prediction assuming the current exponential rate of increase in CO2 concentration?

  • David B. Benson // December 20, 2009 at 1:17 am | Reply

    ZT // December 20, 2009 at 12:24 am — Yes, I just gave it to you: the next thirty years warm as much as the last thirty did.

  • Ray Ladbury // December 20, 2009 at 2:05 am | Reply

    ZT, Look at the IPCC scenarios. Really, you won’t catch cooties from things written by the IPCC. I promise.

  • ZT // December 20, 2009 at 4:19 am | Reply

    Thanks – the next 30 years will have as much warming as the last 30. Got it. I wasn’t sure if the first message was a joke.

    Is it possible to get a computer model like the one described here to provide a year-by-year prediction for the future, with all the year-by-year variation seen on the model produced graphs on this page?

    The IPCC reports are jargon rich and information poor as far as I can see. Perhaps I haven’t downloaded the correct sets of PDFs yet.

    [Response: Sure. As soon as you provide the next 30 years of volcanic forcing data, as well as the multivariate el-nino index for the next 3 decades, we can give you the detailed prediction you want.]

  • ZT // December 20, 2009 at 4:23 am | Reply

    Can someone point me to a very simple mathematical model that shows there’s no explanation for the global warming other than human causation?

    [Response: Sure. Here.]

    • Ray Ladbury // December 20, 2009 at 5:50 pm | Reply

      ZT, A serious question: Are you really this ignorant or are you just bored and trolling? If you are this ignorant and really want to learn, this is the place for you. If you are a troll, you are coming very, very close to the stupid event horizon.

  • Didactylos // December 20, 2009 at 1:14 pm | Reply

    ZT, how about a 21-year-old climate model that successfully modelled the 21 years that followed? 21 years later, we can compare the model output (call it a “prediction” if you must) with observed temperatures.

    There is an excellent discussion of Hansen’s 1988 results here: http://logicalscience.com/skeptic_arguments/models-dont-work.html

    You will see there is a slight disagreement around 1991-1995. This is because Hansen allowed for a volcanic eruption in 1995, but in reality, Mt Pinatubo erupted in 1991.

    You will also see that the trajectory of the temperature rise is dependent on the actual emissions.

    All this means that your question “Any predictions for the future” is faulty – model results are only valid if the assumptions about the future that go into the model are valid. Therefore, model results cover a wide range of scenarios.

    Result: there is no single “prediction” for temperature.

  • Didactylos // December 20, 2009 at 1:29 pm | Reply

    ZT said:

    Can someone point me to a very simple mathematical model that shows there’s no explanation for the global warming other than human causation?

    There are two approaches that can be taken here. You can use a physics-based model, and run it on a computer, or you can use a statistical model, and use statistics to fit the different forcings.

    Here are the results of a physics-based model: http://maps.grida.no/go/graphic/modeled_temperature_compared_to_observed_temperature_for_the_last_150_years

    For a statistical model, I can’t do better than Tamino’s model at the top of this page. Just like for the physics-based model, he tries the experiment with just greenhouse gases, and with just natural forcings, and finally with both.

    For both the statistical model and the physics-based model, the results are painfully clear: greenhouse gases alone don’t explain everything, natural forcings alone don’t explain everything. Greenhouse gases and natural forcings combined provide the best fit.

    You want a “very simple mathematical model”. Well, this is as simple as it gets. Climate really isn’t simple, despite what so many deniers are fond of saying.
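
    If anyone wants to see roughly what that comparison involves in practice, here is a minimal sketch (my own illustration, not Tamino’s actual code) of regressing a temperature series on different subsets of forcings. The file names are placeholders for whatever aligned annual series you use, e.g. the GISS forcings and a temperature record such as GISTEMP.

        import numpy as np

        # Placeholder file names: equal-length annual series on the same years.
        ghg = np.loadtxt("ghg_forcing.txt")
        natural = np.loadtxt("natural_forcing.txt")
        temp = np.loadtxt("temperature_anomaly.txt")

        def r_squared(predictors, y):
            """Fraction of variance in y explained by an OLS fit on the predictors."""
            X = np.column_stack(predictors + [np.ones_like(y)])  # add an intercept
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            fitted = X @ coeffs
            return 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)

        for label, preds in [("GHG only", [ghg]),
                             ("natural only", [natural]),
                             ("GHG + natural", [ghg, natural])]:
            print(f"{label}: R^2 = {r_squared(preds, temp):.3f}")

    The point of running all three fits is exactly the pattern described above: neither subset of forcings alone explains the record as well as the combination does.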

  • dhogaza // December 20, 2009 at 1:46 pm | Reply

    Can someone point me to a very simple mathematical model that shows there’s no explanation for the global warming other than human causation?

    This is very low-level trolling. What purpose is served by humoring him?

    • Didactylos // December 20, 2009 at 3:08 pm | Reply

      It’s snowing, and I’m bored?

      Seriously, you should know as well as anyone that it is very difficult to leave a question unanswered, and even if ZT learns nothing, some other seeker of wisdom may at least not be misled.

      Besides, I think the UNEP/GRID-Arendal graphics should get a lot more prominence. http://maps.grida.no/theme/climatechange

  • luminous beauty // December 20, 2009 at 2:02 pm | Reply

    ZT,

    Yes. The one at the top of the page.

  • Hank Roberts // December 20, 2009 at 4:56 pm | Reply

    > provide a year-by-year prediction for the
    > future, with all the year-by-year variation

    You want a 30-year weather forecast.

    What you can get is a large number of scenarios, each with some possible variation year by year.

    What you are having trouble getting is the distinction between what you want and what you can get.

    It’s not important to be able to do a 30-year weather forecast before doing anything about climate change.

  • dhogaza // December 20, 2009 at 5:37 pm | Reply

    Seriously, you should know as well as anyone that it is very difficult to leave a question unanswered

    True, I preach better than I practice :)

  • Scott A. Mandia // December 20, 2009 at 7:57 pm | Reply

    Speaking of weather:

    26 inches (66 cm) of snow here and 4 1/2 hours of shoveling.

    SNOW is a 4-letter word. :(

  • Deech56 // December 21, 2009 at 12:21 am | Reply

    We have 14-18 inches here and I’m not done shoveling (and I do mean shoveling). To this Buffalo ex-pat it’s like old times, but with 8X the soreness.

    Scott – I’ve always liked your site and when pointing someone with questions to good climate starting places, noticed that RC links it in the “Start Here” page. Good work.

  • David B. Benson // December 21, 2009 at 12:25 am | Reply

    New evidence confirms land warming record:
    http://www.metoffice.gov.uk/corporate/pressoffice/2009/pr20091218b.html

  • Ray Ladbury // December 21, 2009 at 12:29 am | Reply

    We got nearly 2 feet, and we’ve got a REALLY long driveway. Good exercise, and we got things cleared by late morning and let the Sun do its work.

  • dhogaza // December 21, 2009 at 2:02 am | Reply

    Yes, David, but look:

    This conclusion is in contrast to a recently released study by the Institute of Economic Analysis (IEA) think tank based in Moscow.

    We really are seeing a “denial of service” attack on climate science …

    And the other aspect here, I’m sure, is partly due to the claims that Climategate “shows” that CRU is “fraudulent” and therefore HadCRUT is fraudulent. The “actually, all other sources show the same trend or worse” defense.

  • Sekerob // December 21, 2009 at 8:38 am | Reply

    David B. Benson // December 21, 2009 at 12:25 am

    Funny how I recently posted three land-temperature numbers, with the global value the exact average of the NH and SH land values, and my questioning of this got zero response. They were from CRUTEM3v. How can that be when 2/3 of the land is in the NH? Would Phil Jones know?

    This is from a site a poster here recently linked to:

    Quiet Sun Means Cooling of Earth’s Upper Atmosphere

    and also allows to validate…

    “We suggest that the dataset of radiative cooling of the thermosphere by NO and CO2 constitutes a first climate data record for the thermosphere,” says Mlynczak.

    The TIMED data provide a climate record for validation of upper atmosphere climate models, which is an essential step in making accurate predictions of climate change in the high atmosphere. SABER provides the first long-term measurements of natural variability in key terms of the upper atmosphere climate.

    “A fundamental prediction of climate change theory is that the upper atmosphere will cool in response to greenhouse gases in the troposphere,” says Mlynczak. “Scientists need to validate that theory. This climate record of the upper atmosphere is our first chance to have the other side of the equation.”

    Looking forward to hearing of further corroborations that reality does match the broad scientific consensus.

  • Gavin's Pussycat // December 21, 2009 at 11:03 am | Reply

    A foot here too…

  • Scott A. Mandia // December 21, 2009 at 12:38 pm | Reply

    Thank you for your kind words Deech56. :)

  • Timothy Chase // December 21, 2009 at 9:32 pm | Reply

    Ray Ladbury wrote:

    We got nearly 2 feet, and we’ve got a REALLY long driveway.

    I just gotta ask: how can there be global warming if we are getting dumped with that kind of snow?

    Ray Ladbury wrote:

    … we got things cleared by late morning and let the Sun do it’s work.

    Now what have they been saying!? It’s been the sun all along.

    Time to move on.

  • Igor Samoylenko // December 22, 2009 at 5:25 pm | Reply

    Ray wrote: “We got nearly 2 feet”

    We got ~20cm of snow in Reading, UK on Monday! Everything was at a standstill… Beautiful though! Reminds me of Ukraine :-)

    Timothy wrote: “I just gotta ask: how can there be global warming if we are getting dumped with that kind of snow?”

    You can just imagine how many “sceptics” are going around the blogosphere claiming that this snow debunks the theory of AGW. Snow in the winter – who would have thought? :-)

    • Ray Ladbury // December 23, 2009 at 1:09 am | Reply

      It’s now looking like we’ll get an icy Xmas–freezing rain on top of snow and then freezing over again. It’ll be slicker than snot on a glass door knob, and Washingtonians are not known for their winter driving skills…or summer, truth be told. When JFK called DC a city of Northern Charm and Southern efficiency, he pretty much nailed it.

      • Deech56 // December 23, 2009 at 4:47 pm

        Ray, I was being a bit facetious; my correspondent seems to be along the lines of those intrepid souls who “expose junk science.” I can explain in more detail, but am already seriously OT. If you want, drop me a line at deech56 at yahoo dot com.

    • Deech56 // December 23, 2009 at 2:09 am | Reply

      Oh, this should be fun. Fortunately, we still have a good supply of milk, bread and toilet paper from when we stocked up for the last storm. Of course, living within walking distance of a grocery store helps, too.

      Ray, a local radio personality (the son of this one) keeps challenging me to a debate, thanks to some arguments we’ve had on a local County Comish’s FB page. Want me to give him your name? LOL.

      • Ray Ladbury // December 23, 2009 at 12:06 pm

        I tend to favor quiet education over debate, and in any case, I am hardly a climate expert. I’d be happy to exchange emails with him–at least as long as he’s not a tin-foil hat and black helicopter conspiracy theorist.

  • Timothy Chase // December 22, 2009 at 6:35 pm | Reply

    Igor Samoylenko wrote:

    You can just imagine how many “sceptics” are going around the blogosphere claiming that this snow debunks the theory of AGW. Snow in the winter – who would have thought? :-)

    With regard to snow, for a while at least some scientists had been expecting Antarctica to neither gain nor lose mass. The greater melting would presumably have been compensated for by increased snowfall. At lower temperatures you have less precipitation. And from about -40°C and above you roughly double the absolute humidity for every 10°C.

    Well, East Antarctica seemed to be holding its own until a few years ago — and now it’s losing mass. Melting has overtaken whatever increase in precipitation resulted from warmer temperatures.

    Still, heavier snow shouldn’t be that odd for most of the world which gets it. Extreme precipitation events are supposed to be more extreme. Precipitation during the winter is supposed to increase somewhat but fall during the summers. Winter snow packs will melt earlier in the spring, though, and like absolute humidity, the rate of evaporation roughly doubles for every 10°C — which will prove to be something of a problem during the growing season.
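
    As a rough sanity check on that doubling, one can use the common Bolton (1980) approximation for saturation vapour pressure. The little calculation below is just a back-of-envelope illustration, not anything from the post:

        import math

        def sat_vapor_pressure_hpa(t_celsius):
            """Saturation vapour pressure (hPa), Bolton (1980) approximation."""
            return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

        # Ratio of saturation vapour pressure across a 10 deg C warming
        for t in (-40, -20, 0, 20):
            ratio = sat_vapor_pressure_hpa(t + 10) / sat_vapor_pressure_hpa(t)
            print(f"{t:4d} -> {t + 10:3d} deg C: factor {ratio:.2f}")

        # Prints roughly a factor of 2 per 10 deg C, somewhat more at very cold temperatures.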

    Furthermore, Stu Ostro, a senior meteorologist at The Weather Channel (not a climatologist, so take what he says in this area with a grain of salt), suggests that some of the weather extremes associated with summers and winters, including extreme cold spells, may result from more intense subarctic high-pressure and low-pressure ridges.

    Please see:

    A Connection Between Global Warming and Weather
    Posted on September 27, 2007 at 12:56 pm ET
    Stu Ostro, Senior Meteorologist
    http://climate.weather.com/blogs/9_13685.html

  • Igor Samoylenko // December 22, 2009 at 11:05 pm | Reply

    Timothy Chase wrote: “Well, East Antarctica seemed to be holding its own until a few years ago — and now it’s losing mass.”

    Isn’t it too early to say definitively that it is, though (the BBC article is about this paper by Chen et al. (2009))?

  • Riccardo // December 23, 2009 at 5:35 pm | Reply

    Igor Samoylenko,
    on antarctic melting check this one too:

    http://www.agu.org/pubs/crossref/2009/2009GL040222.shtml

  • Arthur Von Neumann // December 24, 2009 at 4:57 am | Reply

    Dhogaza, the energy yields calculated for the Manhattan project using Monte Carlo methods were off by 10-20% in the event.

    I believe this is a relevant point.

  • dhogaza // December 24, 2009 at 12:52 pm | Reply

    Dhogaza, the energy yields calculated for the Manhattan project using Monte Carlo methods were off by 10-20% in the event.

    I believe this is a relevant point.

    And current models give us a sensitivity to a doubling of CO2 of 2-4.5C with a most likely value of 3C. Greater uncertainty than the 10%-20% you cite. 2C would still be a major problem, and this is a relevant point, too.

  • Ray Ladbury // December 24, 2009 at 3:49 pm | Reply

    Arthur von Neumann says, “…the energy yields calculated for the Manhattan project using Monte Carlo methods were off by 10-20% in the event. I believe this is a relevant point.”

    By all means it is relevant. It shows that with very primitive computing equipment, a bunch of smart guys could still solve very complicated hydrodynamic calculations and get an answer within 20%.

    Also relevant is how the errors are distributed and how different lines of evidence converge on a range of answers. In the case of climate change, we have more than 10 independent lines of evidence ALL favoring a sensitivity of 3 degrees per doubling. Moreover, the evidence shows that if we are wrong in this estimate, it is much more likely that our estimate is too low than that it is too high. Not comforting.

  • Timothy Chase // December 24, 2009 at 6:59 pm | Reply

    Riccardo wrote:

    Igor Samoylenko,
    on antarctic melting check this one too:

    http://www.agu.org/pubs/crossref/2009/2009GL040222.shtml

    You might want to check out the post:

    Greenland and Antarctic ice sheet decay, continued
    October 13, 2009
    http://thingsbreak.wordpress.com/2009/10/13/greenland-and-antarctic-ice-sheet-decay-continued/

    In fact, I would recommend getting as far as at least the second entry in the comment section.

    The authors of the paper suggest that the ice mass loss from Greenland and from Antarctica is now, in both cases, best described by a quadratic formula — giving rise to that familiar “I threw my keys straight out and they fell to the ground” parabola that so mystifies the “skeptics” — the same sort of thing over here:

    Global Glacier Thickness Change
    http://nsidc.org/sotc/images/glacier_thickness.gif

    … from:

    State of the Cryosphere: Glaciers
    http://nsidc.org/sotc/glacier_balance.html

    … but Greenland and Antarctica appear smoother.

    Oh, and assuming that third, fourth or fifth terms don’t rear their heads — an assumption which, I can tell you from personal experience, isn’t worth a plug nickel — we are looking at roughly 1.4 meters from Greenland and Antarctica combined by the end of the century. Simply going on this, on the rule of thumb that 1% of the population will be displaced for every meter of sea level rise, and on a global population leveling off at around 11 billion, we are looking at about 150 million people being displaced by the rise in sea level from these two sources alone. However, Greenland and Antarctica might face a good run for their money from glaciers elsewhere.
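
    For what it’s worth, the back-of-envelope arithmetic behind that 150 million figure, with the assumed round numbers spelled out (these are the rough figures quoted above, not published projections):

        sea_level_rise_m = 1.4      # rough Greenland + Antarctica contribution by 2100
        displaced_per_metre = 0.01  # rule of thumb: 1% of population displaced per metre of rise
        population = 11e9           # assumed late-century global population

        displaced = sea_level_rise_m * displaced_per_metre * population
        print(f"~{displaced / 1e6:.0f} million people displaced")  # ~154 million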

  • Timothy Chase // December 24, 2009 at 7:07 pm | Reply

    I had ended my comment above with:

    However, Greenland and Antarctica might face a good run for their money from glaciers elsewhere.

    This is, however, only during this century. In the following century I believe the expectation is that Greenland and Antarctica will overtake the rest of the world’s glaciers in terms of mass loss.

  • Timothy Chase // December 24, 2009 at 7:44 pm | Reply

    Igor Samoylenko quoted me where I wrote:

    Well, East Antarctica seemed to be holding its own until a few years ago — and now it’s losing mass.

    … then responded:

    Isn’t it too early to say definitively that it is, though (the BBC article is about this paper by Chen et al. (2009))?

    From Chen et al (2009):

    For the EAIS, our estimate is -57 ± 52 Gt/yr, whereas the InSAR estimate is far smaller, at -4 ± 61 Gt/yr, more similar to previous GRACE estimates.

    p. 2, J.L. Chen et al. (22 Nov 2009), “Accelerated Antarctic ice loss from satellite gravity measurements,” Nature Geoscience 2, 859–862
    ftp://ftp.csr.utexas.edu/pub/ggfc/papers/ngeo694.pdf

    Point taken.

  • tom // December 27, 2009 at 3:00 am | Reply

    Very interesting material. I’d really like to hear your thoughts on Climategate.

  • Ray Ladbury // December 27, 2009 at 3:23 am | Reply

    Climategate? You mean the theft of emails and the deliberate but unsuccessful attempt to distort the record and defame climate scientists to derail the Copenhagen Summit?

  • Mark Raynes // December 31, 2009 at 1:39 pm | Reply

    Yeah Ray, I think that’s what he meant :)

  • Hardly Done // January 1, 2010 at 12:02 am | Reply

    Ray — can you provide a link to the British court report that found the emails were stolen rather than released by a whistleblower? Over the holidays I must have missed the report.

  • dhogaza // January 1, 2010 at 1:11 am | Reply

    Ray — can you provide a link to the British court report that found the emails were stolen rather than released by a whistleblower? Over the holidays I must have missed the report.

    UEA CRU is treating it as a break-in, as are the police.

    There are these things known as “access logs”. It’s been long enough that I should think the police would’ve questioned anyone with legal access to the server. And of course anyone who would legally qualify for whistle blower protection would be seeking it … it’s preferable to a felony conviction, you know.

    The only whistle blower whining is coming from the denialsphere because, well, as usual they’re friggin’ liars without any sense of decency, morality, or honor.

  • Ray Ladbury // January 1, 2010 at 2:55 am | Reply

    Hardly Done,
    A “whistleblower” who steals private correspondence and selects a tiny portion to release out of context with the sole objective of character assassination is not worthy of that title any more than is a person who refuses to acknowledge the evidence is worthy of the term “skeptic”.

  • Hardly Done // January 1, 2010 at 4:07 am | Reply

    Well I suppose I have 2 points:
    1. Until there has been a court finding, it is inaccurate (and close to illegal) to say “stolen”, “hack” or “whistleblow” without some qualification such as “alleged” or “IMO” or “as far as I can tell”, etc. British law is clear on this, as the Sun newspaper has found out over the years — sure, UEA may be calling it a hack, but the police have made no statement (and they never do in these cases until the investigation is complete), so maybe we should be a bit more temperate in our statements.
    2. Even if illegally released, are we happy with what has been revealed? BTW, I agree it was a distraction at COP15 and may have been part of the reason for the lack of binding targets, and we can be unhappy about that — but are we happy with the way Jones etc. have acted as revealed in the emails (and remember, nobody from UEA has denied the validity of the emails, only the legality of their release)? Also, the Russell review is into the actions of CRU as revealed in the emails, not into how they were released or the legality of the release.

    Oh, and Ray, how do you know only a tiny portion was released?

    • Marco // January 1, 2010 at 9:51 am | Reply

      I’m sorry, but your reference to the Russell review is just…well…bizarre. Legality can only be determined by a court.
      And while one can question certain aspects shown in the e-mails, there is absolutely no evidence that any of the science was crooked. Oddly, similar (and often much worse) behavior by some of the supposed ‘skeptics’ goes completely unchallenged by most of the media. Just for the ‘fun’ of it, you should read what Lubos Motl writes about certain people. The poor science of Willie Soon and coauthors has been exposed many times, but where’s the discussion in the media of his behavior? McKitrick has been caught with pretty shoddy science and has posted claims of scientific fraud by the IPCC… and is not being scrutinized. I could go on with a list of names. And it is very relevant, since these are the same people who make the claims that the UEA e-mails show something iffy.

  • dhogaza // January 1, 2010 at 6:22 am | Reply

    1. Until there has been a court finding, it is inaccurate (and close to illegal) to say “stolen”, “hack” or “whistleblow” without some qualification such as “alleged” or “IMO” or “as far as I can tell”, etc. British law is clear on this, as the Sun newspaper has found out over the years

    Thank God I don’t live in the UK, and thank God I don’t share your perverted sense of morality (which I know the UK populace as a whole do not share).

    I’ve been robbed before where insufficient evidence to bring a case against any perp was ever uncovered, but never did the police, press, populace at large, or anyone dispute that I was robbed.

    To claim that catching a perp and proving in court that they’re guilty is necessary to substantiate a claim of theft is perverse.

    And, I’m sure, not true in English law.

  • dhogaza // January 1, 2010 at 6:23 am | Reply

    UEA may be calling it a hack, but the police have made no statement (and they never do in these cases until the investigation is complete), so maybe we should be a bit more temperate in our statements.

    Actually the police have said they’re treating it as a hack.

    Quit f***ing lying.

  • dhogaza // January 1, 2010 at 6:26 am | Reply

    Oh, and Ray, how do you know only a tiny portion was released?

    Because the notion that all of the researchers at CRU send and receive an average of less than two e-mails per week is fantasy.

    Doesn’t even pass the sniff test.

  • Ray Ladbury // January 1, 2010 at 12:23 pm | Reply

    Hardly Done,
    First, I am not a newspaper. I hold no official title, and so I am not bound by niceties like having to say “alleged” any more than I was when I said OJ was guilty as sin. Second, such niceties apply to a particular person accused of a particular crime, not to the legality or illegality of the act. In my opinion, given the subsequent actions of the hackers (e.g. trying to hack into websites to post the materials), one would have to be incredibly gullible or mendacious to allege anything but a crime here.

    Second, you seem to see nothing unusual in there being only 50 MB of emails for a period of over a decade. My email server informs me on a regular basis that I often get this amount of email in a day. Even in the late ’90s it was common to have more than 50 MB of email storage PER PERSON on an email server.

    You also seem to find nothing unusual about the release of the emails out of context — the most notable examples being the decision to replace the proxy data with instrumental data post-1960 (the TRICK) and the Trenberth email (TRAVESTY). That so much hype should have been generated in the denialosphere over these two plainly innocuous emails is a clear indication that they were taken out of context. You seem to see nothing untoward in this? Shall I put you down for gullible, then?

    Third, the most striking thing about the emails is that there is nothing striking in them. Some emails propose some crazy ideas, but there is no evidence they were ever acted upon or ever constituted more than blowing off steam. Some are petty, but hell, they were written by the great ape Homo sapiens. I have seen absolutely nothing in the entire release that could rise to the level of scientific misconduct, except in the most jaundiced of eyes or paranoid of minds.

    What I see are people doing science and feeling frustrated by the efforts of people engaged in the most sordid sort of anti-science.

    In my mind, the episode constitutes a nontroversy, as evidenced by the failure of the denialosphere’s persistent efforts to fan the dead coal into a flame. Not that it was a total failure for them, mind you. I have no doubt that when the hackers got their trove, they were disappointed by the utter lack of dirt. Still they edited (heavily) and released whatever could be taken out of context. And it has bolstered the self-delusion of the denialosphere and given spineless politicians a fig leaf behind which to hide their pandering to energy interests.

    Most important for the long term, though, it has left the mountains of evidence of anthropogenic causation untouched. Eventually, unless we forsake science entirely, we must again turn our attention to that evidence. The pile is only growing.

  • Gavin's Pussycat // January 1, 2010 at 1:14 pm | Reply

    Until there has been a court finding, it is inaccurate (and close
    to illegal) to say “stolen”, “hack” or “whistleblow” without some
    qualification such as “alleged” or “IMO” or “as far as I can tell”

    I say “stolen”. Stolen, stolen, stolen. “Whistleblower”, stand up and sue Ray or me (my identity can be gleaned from Tamino’s access log, in response to a subpoena of course).

  • Gavin's Pussycat // January 1, 2010 at 1:28 pm | Reply

    … and by the way, I have it on good authority (but am not at liberty to disclose) that NINE MILLION emails were stolen.
    Merely going through those, even for a Unix-scripting-savvy crook, would take several months minimum. I suspect the so-called McIntyre “mole” in July 2009 was actually this same cracker, and that the server had been 0wned since before that. That would give at least five months for the selection job.

    You may cite this as “rumor, anon.” for now. Remember me when it becomes official.

  • Gerry Quinn // January 1, 2010 at 5:10 pm | Reply

    Hi Tamino,

    I downloaded the NASA-GISS data you used for your regressions. One thing I am not clear about is the exact meaning of the data points. They are referred to as ‘Global Mean Effective Forcing (W/m^2)’. Am I correct in assuming that these are basically physics-based estimates of the effective extra radiation compared to 1880, given average conditions? (For example, I’d expect the GHG column to constitute the effects of greenhouse gases based on theoretical infrared absorption and the measured concentration of these gases over the period in question, perhaps modified by measured cloud cover, for example, if that has a known effect on net infrared absorption.) Or are they actually something different?

    And which greenhouse gases are included? CO2 and probably Methane, I assume – but what about water vapour?

    I found your article a nice summary of the different kinds of ‘model’ used. It touches on a point that concerns me:

    The ‘regression models’ as described above, and the ‘physical models’ seem to give an enhanced effect from CO2 that is two or three times greater than the basic radiative effect. But this enhanced effect must come via some physical process which should have some measurable proxy; increased or decreased cloud cover or whatever.

    Are you saying that these things must be inferred because there are no good measurements over the period in question? Because in principle, if they are measurable, you should expect to be able to do a regression including all such effects, and there would then be no amplification expected because it would be accounted for elsewhere.

    So which possible forcings are missing from the list because they can’t be – or haven’t been – measured?

  • Hank Roberts // January 1, 2010 at 7:58 pm | Reply

    > how do you know only a tiny portion
    > was released?

    What? You don’t _trust_ the anonymous people who posted what _they_ said was only a “random” sample of what they stole?

    If you can’t trust people like that, who _can_ you trust? WTF?

  • Gareth // January 2, 2010 at 12:20 am | Reply

    I’ll second GP’s comment above: my sources also confirm that the entire email database was stolen.

    We then have to consider how the choice of emails and files to “release” was made. It certainly wasn’t “random”. It was a careful selection of correspondence related to issues that were either already “hot” in sceptic circles, or could be spun to be damaging. As GP suggests, sorting through millions of emails to find this sort of stuff is not a trivial undertaking, and it must have been done by people having an in-depth knowledge of both the “controversies” and the sort of stuff that would play well in PR terms.

    So we have: someone savvy enough to be able to hack into a network and copy files at will; someone who knows/follows the sceptic nitpicking at mainstream climate science; and someone with the PR & marketing smarts to start and run a campaign. If the hacker was not also an enthusiastic sceptic with a detailed knowledge of the sorts of things routinely discussed at Climate Audit, then they would likely have been paid to do the work. The selection and editing must have taken a lot of time — even an enthusiast might have sought some compensation. The source of the money? It’s not drawing a long bow to suggest that we have our third (PR) man to thank.

    So in the spirit of cui bono, we ask who might have the funds to run this operation? Who are the people already funding attacks on climate science? They would seem to be the logical place to start. Time for full disclosure, I would have thought…

  • Ray Ladbury // January 2, 2010 at 2:12 am | Reply

    Gareth,
    My suspicion is that the hacker need not have been intimate with the details of the rantings in the denialosphere. He would have found many willing aides in determining which emails to select. Alternatively, a few weeks perusing certain anti-science sites would have been more than sufficient to “educate” the hacker. It is not as if the denialosphere has a complicated repertoire of points. Motivation could have been either ideological or mercenary. Speculation serves no purpose at this point.

  • Gareth // January 2, 2010 at 4:31 am | Reply

    Speculation serves no purpose at this point.

    If deniers can confabulate a worldwide conspiracy to impose green socialism, I feel free to speculate on a much smaller, but potentially provable conspiracy to steal and defame…

    • Ray Ladbury // January 2, 2010 at 3:23 pm | Reply

      The problem is that with so few facts to constrain the hypotheses, it’s too easy to descend into conspiracy theories whose sole purpose is to cast aspersions rather than elicit understanding.
      The greatest enemy of the denialists is their own paranoia as they try to understand their failure to vanquish physical reality. Let’s not relinquish the high ground.

      • Gareth // January 2, 2010 at 8:35 pm

        Ray, you’re both right and wrong. There’s value in the moral high ground (the emails were stolen), but the other side aren’t playing by any rules of engagement. They do what works – and that includes stealing, lying and smearing. That has to be addressed somehow. My view is that the Hoggan/Oreskes approach — laying out the evidence of well-funded PR campaigns against action — needs to be given much more prominence, supported by further investigation into who is driving the current denier strategy. And that includes working out who stole the emails, who edited them for release, and who funded the whole show. As GP says, we probably don’t need to look much further than the usual suspects…

    • Ray Ladbury // January 2, 2010 at 9:37 pm | Reply

      Gareth, The denialosphere thrives on persecution. I agree by all means that Oreskes, Deepclimate, Hoggan, Lippard, Skeptical Science, et al. have the right approach. However, they remain 100% evidence based. It’s the crucial distinction we have between us and the denialists. I still believe that Mark Twain was on to something when he said, “If you tell the truth, you’ll eventually be found out.”

      Our problem is that we aren’t peddling what the people want to hear. So I suspect they’ll keep going to those selling them sugary poison awhile longer. We have to keep selling the antidote, and the more poison people consume the more important it becomes that we not dilute the antidote.

      • Gavin's Pussycat // January 3, 2010 at 12:24 pm

        But Ray, you cannot stop us — and we are evidence based. Even our speculations (while speculative) are, and I for one am not apologizing.
        While I greatly admire British detection, they owe a debt of gratitude to the amateurs — S. Holmes, Ms. Marple — in whose proud footsteps we try to tread ;-)

  • Gavin's Pussycat // January 2, 2010 at 9:31 am | Reply

    Gareth:
    > email and files
    According to Frank Bi, the files are all email attachments, which Eudora stores in a separate directory.

    About the selection job: would it be too much to speculate that it was ‘outsourced’ to a sympathetic think tank?

    If so, we should perhaps analyze which think tank was best equipped, immediately after the release, to take up the propaganda effort. And then read up on FOI legislation in time zone UTC-0400 and UTC-0500 countries ;-)

  • Candies Shoes // January 2, 2010 at 7:17 pm | Reply

    I really had challenges to understand this debates about the climate. One side is telling something and the other one just gets right back with many more “facts” – do not know what to believe, but based on this page and so many responses I have to say that that makes sense. Will bookmark this as well…

  • Johnmac // January 2, 2010 at 8:03 pm | Reply

    Candies Shoes,
    I suggest you spend your time reading the sites that are inhabited by climate scientists, like this one (and there is a useful list on the front page here, down on the right).

    As a non-scientist, I found it clear very quickly where the evidence and facts lie, and it is not with those who are campaigning with every tactic possible to confuse the issue and raise doubts.

    Most of the “debate” has little to do with genuine scientific research and discussion, and a lot to do with business, politics and ideology.

  • Ray Ladbury // January 2, 2010 at 8:13 pm | Reply

    Candies Shoes, Where possible, stick with what the scientists are telling you–the ones actually doing the research. Try out realclimate.org as well. And don’t hesitate to ask questions.

  • Hank Roberts // January 2, 2010 at 9:01 pm | Reply

    The results of your Turing Test are in, and the prognosis is not good. It’s a bot.

    > Results … about 17,900 for “candies shoes”.
