The Bob-bot strikes again
At the end of last year, Bob Ward had this to say about a new paper on climate sensitivity:
A new paper on climate sensitivity that @mattwridley and other peddlers of denial will pretend does not exist: https://t.co/RJWJFNODkR
— Bob Ward (@ret_ward) December 31, 2015
In fact, far from being ignored by the sceptic community, the paper in question, by Marvel et al., turned out to be something of a car-crash and was the source of a steady stream of more or less amused blog posts in the months that followed.
This morning I couldn't help but wonder if someone has replaced Bob with a bot, preprogrammed to issue identical tweets in response to any new paper on climate sensitivity:
New paper on climate sensitivity that @mattwridley and other climate change 'sceptics' will pretend does not exist: https://t.co/nXqOqxkz3D
— Bob Ward (@ret_ward) April 8, 2016
The paper, by Tan et al., looks as though it's a GCM-with-observational-constraints effort:
Global climate model (GCM) estimates of the equilibrium global mean surface temperature response to a doubling of atmospheric CO2, measured by the equilibrium climate sensitivity (ECS), range from 2.0° to 4.6°C. Clouds are among the leading causes of this uncertainty. Here we show that the ECS can be up to 1.3°C higher in simulations where mixed-phase clouds consisting of ice crystals and supercooled liquid droplets are constrained by global satellite observations. The higher ECS estimates are directly linked to a weakened cloud-phase feedback arising from a decreased cloud glaciation rate in a warmer climate. We point out the need for realistic representations of the supercooled liquid fraction in mixed-phase clouds in GCMs, given the sensitivity of the ECS to the cloud-phase feedback.
I can't imagine quite how large the gap between a 5.7°C-ECS climate simulation and the historical temperature record is going to be. I wonder what fairy story will be conjured up to explain that away.
Reader Comments (76)
Apr 9, 2016 at 5:30 AM | stewgreen
It is not just a flagrant misrepresentation of the participants by Amelia Sharman, but of what the "debate" is all about. People with PhDs in many other sciences who back the alarmism are allowed to express vague opinions on their beliefs, but sceptical scientists speaking in their specialist fields (e.g. Richard Lindzen, the late Bob Carter) are shouted down. Even worse, the alarmist climatologists are suddenly treated as experts on matters of forecasting, public policy, economics, ethics and lifestyle choices simply because they can play around with some computer models.
Lomborg had an interesting WSJ article the other day, ridiculing the Anointed One for his pseudo-scientific stance on the many, many warming deaths to come in the USA.
Models deviate more and more from reality, but those who believe in models don't much care.
https://mobile.twitter.com/kate_sheppard/status/718152319603490820
She says we're fxcked, because the models contained a now-supposedly-fixed bias. No, that is just an error in models, one of many.
Frank wrote: "The authors argue that the observed SLF (satellites) is much greater than assumed".
If correct, the models will have to be re-tuned. No one can predict how re-tuning will change ECS. One way to correct for this problem may raise ECS; that way may create other problems that need correction and lower ECS. Parameters interact with each other in unpredictable ways. A prestigious scientific journal has no business allowing authors to speculate about the outcome of re-tuning. Those experiments haven't been done. The "news" is that another important fundamental aspect of models needs fixing.
The fundamental problem with models is that there are an infinite number of combinations of parameters, and many combinations do a reasonable job of reproducing today's climate. A recent paper described re-tuning one parameterization in the GFDL model and lowering ECS by 1 K without any deterioration in the model's performance.
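The degeneracy Franktoo describes can be illustrated with a toy zero-dimensional energy-balance model; this is only a sketch with made-up forcing histories and parameter values, nothing to do with the GFDL model itself: two parameter pairs that hindcast roughly the same historical warming while implying very different ECS.

# Toy illustration of parameter degeneracy (illustrative numbers only):
# zero-dimensional energy balance, C*dT/dt = F(t) - lam*T, with ECS = 3.7/lam.
import numpy as np

years = np.arange(1880, 2016)
co2 = np.interp(years, [1880, 2015], [290.0, 400.0])            # crude CO2 history, ppm
f_ghg = 3.7 * np.log2(co2 / co2[0])                             # GHG forcing relative to 1880, W/m2
f_aer = -np.interp(years, [1880, 1970, 2015], [0.0, 1.0, 1.0])  # crude aerosol forcing shape

def hindcast(lam, aer_scale, heat_cap=8.0):
    # lam: climate feedback parameter (W/m2/K); heat_cap: mixed-layer heat capacity (W*yr/m2/K)
    temp = np.zeros(len(years))
    for i in range(1, len(years)):
        forcing = f_ghg[i] + aer_scale * f_aer[i]
        temp[i] = temp[i - 1] + (forcing - lam * temp[i - 1]) / heat_cap
    return temp

low_ecs = hindcast(lam=2.0, aer_scale=0.3)   # ECS = 3.7/2.0 ~ 1.9 K, weak aerosol cooling
high_ecs = hindcast(lam=1.0, aer_scale=1.0)  # ECS = 3.7/1.0 = 3.7 K, strong aerosol cooling
print("1880-2015 warming: %.2f K vs %.2f K" % (low_ecs[-1], high_ecs[-1]))

Two very different sensitivities, roughly similar hindcasts - which is exactly why re-tuning one parameterization can move ECS without obviously degrading performance.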
Dear ole Bob, having another predictable hissy fit.
As a reasonably intelligent layman, I feel that obvious twerps such as 'Retward' give Alarmism a bad name.
He reminds me of the multitudes of hopeful swains who once filled the lonely hearts columns of the Times of India, who described their educational status as 'failed BA'.
@Franktoo: you are not ambitious enough. The Climate Model problem can be easily solved, just by replacing R D Cess' 1976 'effective emissivity' argument, which is scientific nonsense because it presumes emissivity is the ratio of radiant exitances at different temperatures, with the truth. Real CO2 Climate Sensitivity in a cloud free atmosphere is ~0.85 K. Clouds reduce it to exactly zero because that's their purpose, hence the planet's incredible present thermal stability.
Jeremy Grantham, Bernie Madoff, Tesla and Enron: is this the last time you will see them in one sentence?
The Bob-bot and the Microsoft offensive-tweet-bot that had to be removed... are they that different?
Oh yes MS has some humility: Microsoft apologizes for 'offensive and hurtful tweets'
NCC 1701E wrote: "Franktoo: you are not ambitious enough. The Climate Model problem can be easily solved, just by replacing R D Cess' 1976 'effective emissivity' argument, which is scientific nonsense because it presumes emissivity is the ratio of radiant exitances at different temperatures, with the truth. Real CO2 Climate Sensitivity in a cloud free atmosphere is ~0.85 K. Clouds reduce it to exactly zero because that's their purpose, hence the planet's incredible present thermal stability."
Our planet's climate doesn't have great thermal stability: ice ages change GMST by about 5 degC - despite the fact that the global solar forcing from Milankovitch cycles is trivial compared with that of increasing GHGs. Before ice ages, there were other major changes, such as glaciation of Antarctica. Let's not ignore the evidence for instability and stability.
Do you have any references concerning Cess's work? One can calculate an "effective emissivity" for large objects such as our planet, but there is no way for simple calculations to determine how that emissivity will change. We can calculate how emission changes for a simple blackbody near 255 K: about 3.8 W/m2 more per degK of warming. The earth is not a simple blackbody. Taking an ECS of 3.7 K/doubling and dividing by 3.7 W/m2/doubling gives ECS in different units: 1 K/W/m2. Taking the reciprocal gives what is commonly called the climate feedback parameter, 1 W/m2/K. This is a type of "effective emissivity", and those who believe in high ECS believe the earth's effective emissivity will be dramatically decreased by feedbacks: when the earth warms 1 degK, it will emit only 1 W/m2 more LWR to space, not 3.7 W/m2. Actually, we need to worry about how both LWR and SWR change, so technically we say that net radiation to space (radiative cooling) increases by only 1 W/m2/K (for an ECS of 3.7 K/doubling).
To get an ECS near 0 K/doubling, the climate feedback parameter needs to be infinite. In other words, no matter how much warming, the earth radiates the same amount of energy to space. This appears absurd.
The planet warms about 3.5 degK every year during summer in the NH. If the planet behaved like a simple blackbody, it would radiate 13 W/m2 more LWR to space. This doesn't happen. Through clear skies, the planet radiates only about 8 W/m2 more LWR - a big difference that is measured reliably: about 2.2 W/m2/K. Climate models predict that OLR will be reduced by about 2 W/m2/K due to increasing humidity and increased by about 1 W/m2/K because increasing humidity will decrease the lapse rate (causing greater warming higher in the atmosphere than at the surface). Those same models predict that without feedbacks the planet should emit about 3.2 W/m2/K more LWR, similar to that expected for a blackbody but slightly different for complicated reasons I won't go into. So GCMs appear to get the increased emission through clear skies about right - a net reduction in LWR of about 1 W/m2/K relative to the no-feedback case.
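To make those unit conversions concrete, here is a small worked check; the 3.5 K and 8 W/m2 figures are simply Frank's numbers from above, not independently verified, and the rest is textbook arithmetic:

# Worked check of the blackbody response, the ECS <-> feedback-parameter
# conversion, and the seasonal-cycle estimate quoted above.
SIGMA = 5.67e-8                          # Stefan-Boltzmann constant, W/m2/K^4
planck_response = 4 * SIGMA * 255.0**3   # blackbody near 255 K: ~3.8 W/m2 per K

F2X = 3.7                                # forcing per CO2 doubling, W/m2
for ecs in (1.8, 3.7):                   # two illustrative sensitivities, K/doubling
    lam = F2X / ecs                      # climate feedback parameter, W/m2/K
    print("ECS %.1f K/doubling -> feedback parameter %.1f W/m2/K" % (ecs, lam))

# NH seasonal cycle: ~3.5 K of warming, ~8 W/m2 more clear-sky LWR observed.
print("blackbody: %.1f W/m2/K, seasonal clear-sky estimate: %.1f W/m2/K"
      % (planck_response, 8.0 / 3.5))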
The situation with respect to reflected SWR is more complicated, but LESS SWR is reflected by clouds when it is warmer. GCMs do a poor and mutually inconsistent job of predicting the observed changes in SWR from both clear and cloudy skies.
http://www.pnas.org/content/110/19/7568.full.pdf
Whatever Cess said about effective emissivity a half-century ago is an obsolete fantasy today. It is perfectly plausible to believe that ECS is 1.6-2.0 K/doubling (Lewis and Curry, Otto), meaning a climate feedback parameter of 2 W/m2/K. IMO, it is absurd to believe climate sensitivity is near zero. IMO, it is also absurd to believe in anything models say about cloud feedback and therefore their quantitative projections are meaningless.
When climate science sums match up to reality, I will take notice again.
As climate scientists now deny having predicted anything, why did I ever bother in the first place?
Golf Charlie: Since 1970, when sulfate emissions stabilized, warming has been consistent with a best estimate for ECS of about 2.0 K/doubling. Sulfate emissions produce aerosols, which are the biggest source of uncertainty in anthropogenic forcing. Since 1880, observed warming has been most consistent with a best estimate for ECS of 1.6 K/doubling. That period covers two cycles of the AMO, the biggest known source of long-term variability. There has been significant variability in the rate of warming in response to rising GHGs (i.e. unforced variability), which the IPCC has chosen to downplay.
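For anyone wanting to see how such observational best estimates are arrived at, the standard energy-budget recipe (in the style of Otto et al. and Lewis & Curry) is simple; the numbers below are round illustrative values, not the figures from either paper:

# Hedged sketch of the observational energy-budget method:
# ECS ~ F_2x * dT / (dF - dN), differences taken between two base periods.
def energy_budget_ecs(delta_t, delta_f, delta_n, f2x=3.7):
    # delta_t: observed warming (K); delta_f: change in forcing (W/m2);
    # delta_n: change in planetary heat uptake (W/m2)
    return f2x * delta_t / (delta_f - delta_n)

print("ECS ~ %.1f K/doubling" % energy_budget_ecs(delta_t=0.75, delta_f=1.9, delta_n=0.4))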
Which climate scientists have denied predicting anything? In general, projections from models have exceeded observed warming by almost a factor of 2X since 2000, but there is a lot of uncertainty and noise in warming over this period. Climate scientists have not dealt well with the over-confident projections that they have made for political reasons.
Frank
A bit of a side step, but ..
I watched an astronomy TV programme in which a galaxy appeared twice on a photographic plate due to an (apparent) powerful gravitational lens effect. There are examples of multiple images of distant galaxies on the same image. That is in addition to all the other mind boggling potential distortions out there.
http://csep10.phys.utk.edu/astr162/lect/galaxies/lensing.html
At this point, I would sell Hubble to the Russians and take up water paints. What they actually did was to 'normalise' the situation by adding numerous dark matter areas that explained the supposed gravitational lensing effect.
I've lost my key, I've looked everywhere. There must be a ghost in the house. Give me a grant to hunt it down.
Modern science seeks to explain away its deficiencies by inventing new factors that must be investigated (at a price). Keeping oneself in a job seems to be the primary motivation. We are in an era of tackling impossible complexity (e.g. climate) and running away from the reality that we don't have the tools to deal with it.
Astronomy TV programme (UK only)
http://www.bbc.co.uk/iplayer/episode/b075dxsq/the-beginning-and-end-of-the-universe-2-the-end
BBC mockumentary ridiculing dark matter science
http://www.dailymotion.com/video/x2jw29g
Frank, they have all denied responsibility for predicting anything. This is apparent when failed predictions pass their 'sell-by' date, and nothing has happened in accordance with the predictions.
If there has been a more recent instruction stating that climate scientists should now accept responsibility for their failed predictions, I must have missed it, but I don't normally get access to climate science instructions, or their admissions of blame or failure.
Pielke Jr spent a long time looking at how we might differentiate skill from chance, using football scores* as the subject matter.
http://rogerpielkejr.blogspot.co.uk/2010/07/skill-in-prediction-part-i.html
What I took from it is that given the narrow range of possible global temperatures and knowledge of how they changed in the past, computer climate models aren't really predicting anything. They aren't doing much more than guessing. Like the previous astronomy example, the laws of physics involved are completely valid, but the complexity of the system defeats any attempt to apply them skillfully.
* He ran an experiment with a number of his blog participants to predict the final positions of English premier league teams. The winner was moi, yes, little old me. LOL !
The point being, I had very little skill in predicting the outcomes of those games. No more than the average octopus, anyway.
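Pielke's skill-versus-chance point is usually formalised with a skill score against a naive baseline; here is a minimal sketch with made-up numbers (neither real forecasts nor real observations):

# Mean-squared-error skill score relative to a naive "no change" baseline.
def mse(pred, obs):
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(pred, baseline, obs):
    # 1 = perfect, 0 = no better than the baseline, < 0 = worse than the baseline
    return 1.0 - mse(pred, obs) / mse(baseline, obs)

observed    = [0.40, 0.43, 0.54, 0.62, 0.57]   # hypothetical temperature anomalies, K
forecast    = [0.45, 0.55, 0.65, 0.75, 0.85]   # hypothetical model projection
persistence = [observed[0]] * len(observed)    # naive baseline: no change from the start
print("skill vs persistence: %.2f" % skill_score(forecast, persistence, observed))

A forecast only counts as skilful when it beats a baseline anyone could have written down in advance - which is the octopus point.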
@Frank: the emissivity of Earth's atmosphere is very near unity**. This is by definition: self-absorption IR Physics means that spectral temperature for each band is set by the emitting species being space filling at that altitude/temperature. All you need to prove this is a simple differential equation borrowed from analytical spectroscopy.
The planet self controls because as CO2 IR to space falls, H2O IR increases. The cloud system separates surface temperature from OLR control. It's all so easy once you realise clouds are a dependent variable of the thermodynamic system, acting to keep net surface IR warming/cooling of the atmosphere at zero, also minimising radiation entropy production rate, as required by the 2nd Law of thermodynamics for an open thermodynamic system.
**The only doubt is the Atmosphere Window IR from the surface, which could be 0.97 emissivity. PS, Cess 1976 was and remains absolutely wrong because his claims imply >50% more atmospheric heating than reality. Later in 1976, the GISS modellers put out a paper backing up Cess, but it included 'negative convection' to offset the imaginary extra energy. That was scientific fraud; 'negative convection' does not exist. 24 years later, Hansen admitted this to an AIP interviewer; all of this is fully documented. I feel very sorry for those who have not realised that a simple energy balance destroys IPCC/GISS Climate Alchemy, but any professional should have checked this basic issue.
NTZ explained the point that @NCC_1701E mentioned above
Models, schmodels!
How about a paper that deals with actual data?
Or would that not fit the narrative? Oh come on now, surely the data can be changed post-hoc so that it does fit the narrative? What's that you say? Oh...
I was amazed that anyone agreed to do this in the first place. This is the Victor Meldrew effect in action. These people are genuinely daft in the head and twice as pompous..
"COUNCIL bosses are under attack for BINNING weekly food waste collections after spending hundreds of thousands of pounds on the service.
More than 110,000 plastic bins were doled out to homes as part of the flagship environmental service which was rolled out between September and November 2013. Renfrewshire Council also paid out a fortune on the six Isuzu Farid 7.5 ton refuse collection vehicles used to operate the kerbside collection scheme across the local authority area.
However, under proposals going to the Leadership Board on Wednesday, bosses are dumping the service and merging food waste collection with garden refuse pick- ups. The move has sparked anger, with Renfrewshire councillors scratching their heads over the cost to the public purse in setting up a scheme which lasted barely two years.
SNP group leader Brian Lawson, member for Paisley East and Ralston, said: “This means 110,000 plastic bins, two per household, will be literally going in the bin themselves. “I wonder if the council has plans to recycle them? Add to this the cost in purchasing food waste bin collection vehicles, and the associated costs in staffing the service too.
http://www.dailyrecord.co.uk/news/local-news/fury-renfrewshire-council-bins-food-6927416#xfQCUAbGl7RJK4h7.99
NCC 1701E: Emissivity is a tricky (and unnecessary) concept to apply to the Earth's atmosphere. First, the atmosphere's temperature varies by 100 K (a factor of 3X after converting to power emitted via oT^4). What temperature do you use? Second, emissivity (emittance) is I/B(lambda,T). B(lambda,T) arises from Planck's Law - which is derived by assuming that radiation reaches EQUILIBRIUM with "quantized oscillators", now known to be quantized energy levels in molecules. Thermodynamic equilibrium means that the number of photons emitted (which depends on temperature) is equal to the number of photons absorbed (which depends on the radiation intensity). Black cavities (hohlraums) with a pinhole were the devices first used to study thermodynamic equilibrium between radiation and matter. Unfortunately, even thermal infrared often does NOT come into thermodynamic EQUILIBRIUM with the molecules in the atmosphere on its way to space.
When we talk about an "optically-thick layer" of atmosphere, we are saying that the radiation emitted by that layer has come into thermodynamic equilibrium with gas molecules at a particular temperature. Such layers emit radiation of blackbody intensity and their emissivity is unity. The lower atmosphere is optically thick at some wavelengths, but not others. Go high enough and there will always be some altitude with too few GHG molecules to allow equilibrium between absorption and emission at the different temperature for that altitude.
Solids and liquids generally contain radiation of blackbody intensity due to the same equilibrium, but some of that radiation is reflected as it leaves the surface. This reflection produces emissivity less than 1 (explaining why emissivity equals absorptivity). Since gases lack a surface, their emissivity is 1.
When we talk about an optically-thin layer of atmosphere, we are admitting that absorption and emission have not come into thermodynamic equilibrium before radiation exits that layer. In that case, some people say that the emissivity of the gas is proportional to the density of GHGs it contains and its thickness. I don't like using the same term for an intrinsic and extrinsic property of a material.
In summary, emissivity means different things depending on whether one is discussing an optically thick or thin layer. Some locations and wavelengths in the atmosphere are optically thick and some are optically thin. The temperature is not the same everywhere. So the emissivity of the atmosphere is a meaningless concept to me. (The other dimensionless terms, transmittance and absorbance, cause endless confusion when applied to an atmosphere that emits thermal IR, but are extremely useful when emission is negligible.)
For optically thin layers of atmosphere where thermodynamic equilibrium between GHGs and radiation can't be assumed, we use the Schwarzschild eqn to calculate how radiation changes as it travels through a layer:
dI = n*o*B(lambda,T)*dz - n*o*I*dz
The incremental change in spectral intensity (dI) at a given wavelength as radiation of spectral intensity I passes an incremental distance dz through any non-scattering medium is determined by the emission (first term) and absorption (second term) that occurs along a path through the layer. n is the density of the GHG, which depends on the density of the atmosphere and mixing ratio of the GHG. o is the absorption cross-section of the GHG at that wavelength (which is also the emission cross-section and quantifies strength of the interaction between radiation and gas molecules). B(lambda,T) is the Planck function. Radiation transfer is calculated by integrating the Schwarzschild eqn along a path (say from the surface of the earth to space for OLR or vice versa for DLR) and then integrating over all relevant wavelengths. Stack enough optically thin layers on top of each other and you may or may not produce an optically thick layer. Just do the math and you don't need to worry about emissivity or the inhomogeneity of the atmosphere. (Terms for scattering can be added.)
Given its reliance on the Planck function, the Schwarzschild eqn is only appropriate for gases when the Boltzmann distribution correctly predicts the fraction of molecules in an excited state. Below about 100 km, collisions relax excited GHGs and create excited states much faster than they can emit and we call this situation "local thermodynamic equilibrium". It is inaccurate at higher altitudes.
In the laboratory, we use light sources that are several thousand degK. In that case, the emission term is negligible. n and o are usually constant and integration gives Beer's Law.
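Spelling out that limit (a standard textbook step, using the same symbols as above): with the emission term dropped and $n$ and $o$ constant, $dI = -\,n\,o\,I\,dz$, which integrates over a path of length $L$ to $I(L) = I(0)\,e^{-noL}$ - Beer's Law.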
If dI/dz = 0, then either I = B(lambda,T) and we have a layer emitting blackbody radiation at that wavelength, or n or o = 0.
Basically, the Schwarzschild eqn says that the radiation that emerges from a layer is closer to blackbody intensity than when it entered, and the rate (dI/dz) at which it approaches blackbody intensity is proportional to the density and absorption coefficient of the GHG. If numerical integration is done with layers thin enough to have a constant temperature, pressure, GHG mixing ratio, absorption cross-section and radiation intensity (I is much greater than dI), then one doesn't need to worry about the fact that these parameters are all functions of altitude (z): n(z), T(z), I(z) and o(z).
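A minimal numerical sketch of that layer-by-layer integration, using made-up profiles and a single illustrative absorption cross-section (this is not MODTRAN or any real line-by-line code, just the bookkeeping Frank describes):

# Toy upward integration of dI = n*o*B(lambda,T)*dz - n*o*I*dz at one wavelength.
import numpy as np

C_LIGHT, H_PLANCK, K_B = 3.0e8, 6.626e-34, 1.381e-23

def planck(wavelength, temp):
    # Planck spectral radiance B(lambda, T), W m-2 m-1 sr-1
    a = 2 * H_PLANCK * C_LIGHT**2 / wavelength**5
    return a / (np.exp(H_PLANCK * C_LIGHT / (wavelength * K_B * temp)) - 1.0)

WAVELENGTH = 15e-6                               # 15 micron band, m
SIGMA_ABS = 3.0e-26                              # absorption cross-section, m2 (illustrative)
z = np.linspace(0.0, 20000.0, 2001)              # 0-20 km in thin layers
dz = z[1] - z[0]
temp = np.maximum(288.0 - 6.5e-3 * z, 217.0)     # crude lapse rate, capped at a tropopause value
n_ghg = 400e-6 * 2.5e25 * np.exp(-z / 7000.0)    # GHG number density, molecules/m3 (rough)

intensity = planck(WAVELENGTH, temp[0])          # start with blackbody emission from the surface
for i in range(len(z)):                          # each layer absorbs and emits toward space
    intensity += n_ghg[i] * SIGMA_ABS * (planck(WAVELENGTH, temp[i]) - intensity) * dz

print("surface emission %.2f, emission to space %.2f  (W m-2 um-1 sr-1)"
      % (1e-6 * planck(WAVELENGTH, temp[0]), 1e-6 * intensity))

With these toy numbers the column is moderately optically thick, so the radiation reaching space ends up noticeably below the surface value, closer to the Planck intensity of the colder layers aloft.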
Programs like MODTRAN, and line-by-line codes built on the HITRAN database, calculate "radiation transfer" using the Schwarzschild eqn. AOGCMs use short cuts to reduce the amount of computer time needed to perform the calculations. One of the early coupled model intercomparison projects (CMIP) discovered that these shortcuts were producing significantly different results than integrating "line-by-line". These shortcomings were corrected. The water vapor continuum has also been refined.
On a different subject, I fail to understand why intelligent skeptics continue to refer to work a half-century old - like that of Cess discussed above. Disagreements about the GISS model in 1976 aren't relevant to the world today.
NCC 1701E: Your comments about the principle of maximum production of entropy are interesting. To my knowledge, no one has found a way to use it to predict climate sensitivity. See Judith Curry's post on this subject and the linked paper. And I gather that the general applicability of this principle remains unproven.
One way to calculate entropy production is using AOGCM's. Perhaps someday we can discover the best parameters for a climate model by adjusting them to produce maximum entropy.
@Frank/Franktoo: it's not maximum entropy, it's minimisation of the rate of radiation entropy production as the terrestrial open thermodynamic system, in steady-state equilibrium, transforms low-entropy solar photons to high-entropy IR photons.
Read Chris Essex, also Brookhaven Labs publications. My simple engineering approach is to show Climate Alchemists failed in 1976 by pretending they could create >53% more atmospheric warming than reality, then fraudulently offset it, firstly 'negative convection', now the UKMO's more plausible but still wrong Kirchhoff's Law of Radiation approach, based on incorrect cloud aerosol optical physics. This was and remains unprofessional but I'll probably be jailed for saying so!
NCC 1701E: I read Judith Curry's post and the review article she linked and at least the abstracts of the papers cited by the review article discussing climate sensitivity.
https://judithcurry.com/2012/01/10/nonequilibrium-thermodynamics-and-maximum-entropy-production-in-the-earth-system/
Although I don't understand everything I have read, I was left with the impression that the general applicability of the principle of maximum entropy production is unproven, that it is difficult to apply to climate and climate sensitivity, and that AOGCMs are being used to calculate entropy production in some cases. Paltridge did succeed in getting approximately the correct meridional temperature gradient from a very simple model, but that is miles away from an answer for ECS. Have I missed anything important?
(FWIW, I can't explain the relationship between minimizing entropy and maximizing entropy production. One visible photon clearly represents less entropy than 10 thermal IR photons with the same total energy. dS = dQ/T, and dQ can be radiation. After those basics, things get fuzzy. Are there any good videos on this subject?)
The indirect aerosol effect (the reduction in cloud droplet size caused by increased nucleation by anthropogenic aerosols) is undoubtedly real. Smaller droplets do reflect more SWR. We don't have any good way of determining how MUCH change anthropogenic aerosols have caused, and there is little evidence from observations.
@Frank: the indirect aerosol effect, smaller droplets for more [CCN], is real, but only for thin, fresh clouds. Once cloud droplets coarsen in optically thick clouds, albedo increases. The issue here is that van de Hulst (1967) and Hansen (1969) thought they had solved the problem of strong Mie forward scattering, but missed a second optical physics effect which increases albedo with time, paper in preparation.
The effect of higher [CCN] is to narrow the initial droplet size distribution and reduce coarsening kinetics, so it gives global warming by reducing mean cloud albedo. The evidence is the end of ice ages (this process amplifies Milankovitch warming), the 1980s and 1990s fast warming caused by Asian aerosols reducing cloud albedo, and the Arctic melt-freeze cycle, a mini ice age effect. CO2-AGW is near zero, kept there by another effect of the water cycle, and is probably slightly negative from biofeedback. Sorry, but when I destroy icons, there is little left but dust!
NCC 1701E: Thanks for the info. When I reviewed this subject before commenting, I was struck by the lack of observational data confirming that calculations about the effect of aerosols on cloud albedo have any relevance to what happens in the real atmosphere. Is there any evidence that clouds formed in the presence of more aerosols (say in the NH vs the SH) have higher albedo?
@Frank: new clouds with small droplets have a given albedo defined by tau proportional to LWC/Nc.r, usual terms. Increase [CCN] and tau rises, also albedo. However, follow a cloud's evolution and as droplets coarsen at constant LWC, albedo of an optically-thick cloud increases, yet r increases. Therefore the equation used by Climate Alchemy to relate tau and r is wrong when there is a significant dispersion of droplet size. The explanation is a bit of physics that Hansen missed in 1969, increase of geometrical optical backscattering, in effect increasing optical extinction coefficient.
A-train project observations (2007 -2008) showed average tau for low level oceanic drizzle clouds was twice the no-drizzle level. The evidence is very strong.
I think I've never heard so loud
The quiet message in a cloud.
=====================