The Carbon Brief Interview: Syukuro Manabe

Leo Hickman | 07.07.2015 | 11:45am

Syukuro Manabe is the senior meteorologist in the Program in Atmospheric and Oceanic Sciences at Princeton University. After completing his doctorate at the University of Tokyo in 1958, he began working as a research meteorologist at the US Weather Bureau. From 1963 to 1997, he was the senior research meteorologist at NOAA’s Geophysical Fluid Dynamics Laboratory. From 1997 to 2002, he was the director of the global warming research program at the Frontier Research Center for Global Change in Japan.

Syukuro Manabe. Credit: InterAcademy Council.

Yesterday, Carbon Brief published the results of its survey of climate scientists asking them to name the most influential studies of all time. The clear winner was a paper published in 1967 written by Syukuro Manabe and Richard T. Wetherald.

The interview began via email:

CB: Please can you compare, in lay terms, a climate model from that period [the late 1960s] to one today?

SM: Parameterisations of subgrid scale processes (eg, moist convection, cloud, land surface processes) were much simpler. The large-scale process (eg, the dynamics and thermodynamics of atmospheric circulation) is practically identical.

CB: Did you realise at that stage the importance they would end up having – and how much they would be scrutinised and debated – decades later?

SM: No. As the first step towards developing a 3D model of the atmosphere, I developed a 1D radiative-convective model of the atmosphere. As a by-product, I investigated the role of greenhouse gases for maintaining the vertical profile of temperature in the atmosphere and for causing it to change.

CB: In the 1960s, you were using computers to better understand how our climate worked. Now computers are used to create projections for emissions scenarios and the like?

SM: From the 1960s onward, we were interested in understanding as well as predicting climate change. Equal emphasis should be placed on both of these objectives.

CB: Are we too reliant on models in trying to form a policy response to climate change? Where should we be using models to our advantage? And where should we apply caution to their use?

SM: Models have been very effective in predicting climate change, but have not been as effective in predicting its impact on ecosystems and human society. The distinction between the two has not been stated clearly. For this reason, a major effort should be made to monitor globally not only climate change, but also its impact on ecosystems, through remote sensing from satellites as well as in-situ observation.

CB: What, in your view, have been the landmark moments in climate science throughout your career (both your own work and the work of others)?

SM: Listed below are the landmark studies of my choice.

  • Phillips, 1956: The general circulation of the atmosphere: A numerical experiment. Quart. J. Roy. Meteor. Soc., 82, 123-164.
  • Manabe and Wetherald, 1967: Thermal equilibrium of the atmosphere with given distribution of relative humidity. J. Atmos. Sci., 24, 241-259.
  • Manabe and Bryan, 1969: Climate calculation with a combined ocean-atmosphere model. J. Atmos. Sci., 26, 786-789.
  • Manabe and Wetherald, 1975: The effect of doubling CO2 concentration on the climate of a general circulation model. J. Atmos. Sci., 32, 3-15.
  • Hansen et al., 1988: Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. J. Geophys. Res., 93, 9341-9364.
  • Stouffer, Manabe and Bryan, 1989: Interhemispheric asymmetries in climate response to a gradual increase of atmospheric CO2. Nature, 342, 660-662.
  • IPCC, 1990: The first IPCC Scientific Assessment [Houghton et al. (eds)]. Cambridge University Press, Cambridge, UK, and New York, USA.
  • IPCC, 2007: The fourth IPCC Scientific Assessment [Solomon et al. (eds)]. Cambridge University Press, Cambridge, UK, and New York, USA.

CB: What, if anything, would you have done differently in your own career? And, more widely, what, if anything, should have been done differently in the field of climate science?

SM: In my opinion, climate scientists have done very well so far. Increased emphasis should be placed upon confirming the projections of climate change and its impact through satellite as well as in-situ observation.

CB: The issue of climate sensitivity is a hot topic at the moment, with some now arguing it is on the “low side” of what some earlier feared. Given your early work with Wetherald, how do you view the way this topic has played out – scientifically and politically – over the decades?

SM: It is highly desirable to continuously validate and update climate models through satellite, as well as in-situ observations.

The interview then moved to the telephone, with additional remarks and clarifications added by email…

CB: How did you come to work on climate modelling?

SM: I was a graduate student at the University of Tokyo in the early 1950s and I was very excited by the numerical weather prediction research at the Institute for Advanced Study [at Princeton]. Until that time, weather forecasting was more or less an empirical problem of looking at pressure maps and then extrapolating forward to try and predict the weather. It was primitive and non-scientific, in many ways. It is very difficult to extrapolate because cyclones develop very fast and very often. People always joked that the Farmers’ Almanac was better than the weather forecast for that day.

CB: How did the early computers help to improve forecasting?

SM: At the beginning, running the hydrodynamical models on the computers was hard because the computers were so slow – thousands, maybe hundreds of thousands of times slower than modern computers. This meant you had to make many additional assumptions. At that time, the empirical methods were better than the mathematical models. It took another decade before hydrodynamical models were finally accepted – and the models themselves had, of course, improved by that time. The skill of numerical weather prediction improved very gradually. A lot of scepticism was expressed at that time.

But as computers improved exponentially, so you could add finer, more accurate grid points onto which the meteorological data was placed. Actually, climate modelling started helping, too. Weather forecasting became remarkably better than before. The European mid-range forecasts in the UK today are remarkably good forecasts, except maybe for summer showers, compared to half a century ago.

My boss, Joseph Smagorinsky, who hired me in 1958, was working very hard to improve the hydrodynamical equation and to try to input heat, condensation, radiative transfer and so forth. He was trying to develop the general circulation model, which is now called the climate model. My first job in early 1960 was to put in the variable effects such as convection, the transfer of heat and moisture from the Earth’s surface to the atmosphere, and so on.

I think since that time the numerical computation of the basic hydrodynamical equation has improved, but basically the improvement is that as the computer gets faster and faster you use finer and finer numerical computations – the grid size becomes smaller and smaller. Moist convection, cloud formation and so on are much more complicated today. Back then you had to make the parameterisation as simple as possible, otherwise you could never complete a computation as it took too long. My main job was to make parameterisation as simple as possible. Now they can make those parameterisations very, very complicated and include just about every factor you can see from your window. Suppose you want to parameterise the land surface process, which controls the heat and water vapour exchange between the atmosphere and the land surface – you can make these processes as complicated as possible. The more you look out of a window, the more complicated it looks. So they keep putting more and more into the models. That means that qualitatively they are more sophisticated and complicated. However, there is no guarantee that the parameterisations are quantitatively realistic. You can never compete with nature in complexity.

CB: Were you constantly frustrated by how slow the computers were in those days? 

SM: My parameterisation back then was very effective and probably very competitive with current models in the simulation of the global distribution of precipitation, for example. I feel that I succeeded in simplifying these processes. Nevertheless, it took hundreds of hours of very expensive computer time to run these models back then. I created a one-dimensional, vertical column model – which we called the convective model – instead of running three-dimensional models. I felt that we got quite a realistic temperature distribution of our atmosphere at the equilibrium state. Then, in order to test the model, I decided to change various greenhouse gases, such as carbon dioxide, ozone etc, and then try to see the influence of these gases on the equilibrium temperature, which was called the radiative-convective equilibrium. We found that our model did produce the vertical structure of our atmosphere remarkably well. So I decided to change these GHGs by taking them all out – water vapour out, ozone out, etc – and realised that if I took them all out of the model the temperature came down by 30C compared to what we have on the ground now. If the average temperature was 15C, and I took out all the GHGs, the temperature fell to -15C. At that time, I had no idea how important the idea of greenhouse gases would become. I had no idea it would have such a great impact on society.
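The roughly 30C figure Manabe describes can be loosely reproduced with a textbook zero-dimensional energy balance – far cruder than his radiative-convective column model, which resolves the vertical structure of the atmosphere, but it illustrates the same point: without greenhouse gases, the planet’s effective emission temperature sits far below the observed surface temperature. A minimal sketch, where the solar constant and albedo are standard assumed values rather than numbers Manabe quotes:

```python
# Zero-dimensional energy-balance estimate of the greenhouse effect.
# Illustrative textbook calculation, NOT Manabe's 1D radiative-convective model.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2 (assumed)
ALBEDO = 0.3      # planetary albedo (assumed)

absorbed = S0 * (1 - ALBEDO) / 4      # mean absorbed solar flux, W m^-2
t_eff = (absorbed / SIGMA) ** 0.25    # effective emission temperature, K (~255 K)

t_surface = 288.0                     # observed global mean surface temperature, K (~15C)
print(f"Temperature without greenhouse gases: {t_eff - 273.15:.1f} C")  # about -18 C
print(f"Greenhouse warming: {t_surface - t_eff:.1f} C")                 # about 33 C
```

This crude balance gives a greenhouse warming of roughly 33C, of the same order as the ~30C cooling Manabe found when he removed all greenhouse gases from his column model.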

CB: Did you make the connection at that moment that mankind was adding greenhouse gases to the atmosphere?

SM: It was the British engineer Guy Callendar (1938) who discovered the potential climatic impact of carbon dioxide emissions into the atmosphere through industrial activity. Using a simple radiative energy balance model of the Earth’s surface, Callendar showed that the observed rate of increase in atmospheric carbon dioxide was large enough to account for the increase in temperature during the several decades around the turn of the 20th century. Callendar’s approach, however, had serious flaws, as noted by Fritz Möller. Reading Möller’s paper, I realised that my radiative-convective model was an excellent tool for studying the greenhouse problem.

CB: Did it have an impact straight away? Or did it take a while for other scientists to notice?

SM: At first, it didn’t have much impact in terms of general recognition. It took more than a decade before people began to realise the merit of my approach. My contribution, I feel, is that I helped to put the research into global warming back onto the right track, identifying a very promising avenue for climate modelling.

CB: What do we still not fully understand when it comes to climate models? Where can we improve our knowledge?

SM: As the models get ever more complicated – or, as some people say, sophisticated – no one person can appreciate what’s going on inside them. One modeller can be an expert in one component of the model, but doesn’t know the other parts; it is now impossible for any one person to understand the whole thing. If you make a prediction based on the model and you don’t understand it very well, then it is no better than a fortune-teller’s prediction.

What we have to do now is more of the things that I was doing in the old days, when I used a simpler parameterisation of the sub-grid scale processes but kept the basic physics such as the hydrodynamical equation, radiative transfer etc. That kind of model runs much faster than the so-called earth system models which they now use for the IPCC. Then, using a much faster computer, you can run a large number of numerical experiments in which you change one factor at a time, as if the model were a virtual laboratory. You can then see how the model is responding to that change. Taking advantage of fast modern computers to run a large number of experiments, you begin to gradually understand how the climate depends on various processes and what its various sensitivities are.

CB: Why do we still have a large range for climate sensitivity?

SM: One of the most challenging tasks of climate science is to determine the sensitivity of climate. It is often defined as the response of the global mean surface temperature to the doubling of the atmospheric concentration of carbon dioxide, given sufficient time. Unfortunately, there is a large spread among the sensitivities of climate models. The spread is attributable in no small part to the parameterisation of cloud processes, which has become increasingly detailed, introducing many parameters that are difficult to determine either theoretically or observationally. In order to solve this problem, it is desirable to constrain the parameterisation of clouds macroscopically, using satellite observation of the radiative flux at the top of the atmosphere. I have been working on this problem.
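To make the definition concrete, one common back-of-the-envelope framing (not one Manabe uses here) divides the radiative forcing from doubled CO2 by the net climate feedback parameter to get the equilibrium warming; the spread in that feedback parameter, driven largely by clouds, is what produces the spread in model sensitivity he describes. A minimal sketch, using the standard simplified logarithmic fit for CO2 forcing (Myhre et al., 1998) and purely illustrative feedback values:

```python
import math

# Simplified logarithmic fit for CO2 radiative forcing (Myhre et al., 1998):
# delta_F = 5.35 * ln(C / C0), in W m^-2.
def co2_forcing(c, c0=280.0):
    return 5.35 * math.log(c / c0)

f_2x = co2_forcing(2 * 280.0)   # forcing from doubling CO2, ~3.7 W m^-2

# Equilibrium warming = forcing / net climate feedback parameter (lambda).
# The values of lambda below are illustrative only; cloud feedbacks dominate
# the uncertainty in this quantity.
for lam in (0.8, 1.2, 1.6):     # W m^-2 K^-1
    print(f"lambda = {lam:.1f} W/m^2/K -> equilibrium warming {f_2x / lam:.1f} C")
```

With these illustrative numbers, the warming for doubled CO2 spans roughly 2-5C, which is the kind of spread among models that Manabe refers to.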

CB: So, looking back over your whole career, what lessons did you learn? What might you have done differently, knowing what you know now?

SM: Initially, Smagorinsky and I had to work extremely hard in order to construct a model that was realistic enough to study climate. Looking back over my long research career, which has lasted almost 60 years, I find it most enjoyable to conduct numerical experiments using a climate model as a virtual laboratory. For example, by increasing the concentration of atmospheric carbon dioxide, one can create the warm climate of the Mesozoic, when dinosaurs were roaming our planet. Or one can create the cold climate of the last glacial maximum by substantially reducing the atmospheric concentration of carbon dioxide. Removing the Tibetan Plateau, one can get rid of the arid climate of the Gobi desert. I hope that the new generation of climate scientists will conduct countless numerical experiments, unravelling the mysteries of the climate of the past, present and future.

CB: What things have surprised you over the past 50 years in terms of changes to the climate? For example, there has been recent debate about the slowing of the rate of temperature rise over the past 15 years or so. Has this surprised you?

SM: It did not. In my opinion, it is a manifestation of natural, unforced fluctuations of the global mean surface temperature on inter-annual, decadal and millennial timescales. We witness such fluctuations in multi-century integrations of coupled atmosphere-ocean-land models with and without thermal forcing, such as a gradual increase of atmospheric carbon dioxide. Similar fluctuations are also evident in the time series of the observed global mean surface temperature, as you know.

CB: Is there anything else – other than clouds – that you think we have limited knowledge about, or perhaps don’t really even know about yet? 

SM: I think the number one priority is ice sheet modelling. I had thought that it would take a very long time for the Greenland ice sheet to disappear, which would raise sea level by about 8 metres; recent observations suggest it may occur over a substantially shorter time. There is large uncertainty about the key processes, such as basal melting, basal sliding, percolation and runoff of water in and out of the ice sheet, ice streams and so on. In view of its importance, a major effort should be made towards the modelling and long-term monitoring of continental ice sheets.

As you know, global warming involves not only temperature change, but also changes in both evaporation and precipitation. As predicted by climate models, drought has become increasingly frequent in many arid and semiarid regions of the world, such as the southwestern region of the North American continent and Australia. Extremely heavy rainfall and major flooding have become increasingly frequent in Southeast Asia and Japan. In order to realistically simulate and predict precipitation that is affected by the topography of the continental surface, it is necessary to use climate models with very high computational resolution. Fortunately, very powerful computers have become available, as you know. It is very desirable to place major emphasis on the study of global change in water availability.

CB: Do you think we should now be considering some form of geo-engineering?

SM: I think it is a terrible idea, because the climate, even in the absence of global warming, changes – it can go up and down. So let’s suppose you put sulphates into the stratosphere to block the sunshine and the temperature still rises for a period. People will complain. And then the temperature starts to go down. People will then complain that it is too cool. This will continue. You can’t please everyone. And there will likely be huge litigation costs, too. They will blame you for whatever happens to the climate from that point on.

CB: Do we just have to try and adapt?

SM: Here in the US, it is impossible to get carbon trading, or what have you, through Congress. I think it is practically impossible to achieve the reductions in carbon emissions demanded by the IPCC scenarios. I think for the time being – and this will probably happen anyway – that we will use natural gas produced by fracking. It will buy us some time. Meanwhile, we should put a major emphasis on clean technologies and optimise our electricity grid systems so we use less fossil fuel. Basically, everything that has already been proposed. Over time, they [clean technologies] will take over. This is a more natural approach rather than trying to impose carbon trading, etc. It blows my mind how you might go about getting us off carbon fuels.

Main image: Film camera. Credit: wellphoto/Shutterstock.

(The interview was conducted via telephone and email by Leo Hickman and concluded on 26 June 2015.)
