Thomas Chrowder Chamberlin was tall and rugged, with the flowing beard and raucous mustache popular in the late 1800s. As a young geology professor, he hiked the flatlands of southeast Wisconsin, surveying tracks of long-gone glaciers. It was popular at the time to speculate on what caused the rise and fall of ice ages, and Chamberlin seized on one theory that pointed to a gas.
“The effect of the carbon dioxide and water vapor is to blanket the earth with a thermally absorbent envelope,” he wrote in 1899. He concluded that doubling the concentration of that gas in the atmosphere would raise the temperature of the Earth by 8 or 9 degrees Celsius.
This relationship between carbon dioxide and the Earth’s temperature came to be known as the greenhouse effect. Chamberlin was right about the linkage, though he was off in the numbers.
The numbers are still elusive.
“We’ve grown leaps and bounds in our ability to collect climate data, particularly in the last 30 years since we’ve had satellites,” says Zeke Hausfather, a climate scientist at the Breakthrough Institute, a think tank in Oakland, California. “But at the end of the day, we need to know what is likely to happen in the next few decades and the rest of the century and centuries to come. And for that, you need some sort of model.”
The science of climate modeling has produced 30 or more different models that try to predict how changes in the atmosphere will alter the climate.
All point in the same direction: Add greenhouse gases and warming will follow. But the details vary.
“There are some things where there are very robust results and other things where those results are not so robust,” says Gavin Schmidt, who heads NASA’s respected climate modeling program at the Goddard Institute for Space Studies. But the variances push skeptics to dismiss the whole field.
“There’s enough stuff out there that people can sort of cherry-pick to support their preconceptions,” says Dr. Hausfather. “Climate skeptics ... were arguing that climate models always predict too much warming.” After studying models done in the past 50 years, Dr. Hausfather says, “it turns out they did remarkably well.”
But climate modelers acknowledge accuracy must improve in order to plot a way through the climate crisis. Now, a team of climatologists, oceanographers, and computer scientists on the East and West U.S. coasts has launched a bold race to do just that.
They have gathered some of the brightest experts from around the world to start to build a new, modern climate model. They hope to corral the vast flow of data from sensors in space, on land, and in the ocean, and enlist “machine learning,” a kind of artificial intelligence, to bring their model alive and provide new insight into what many believe is the most pressing threat facing the planet.
Their goal is accurate climate predictions that can tell local policymakers, builders, and planners what changes to expect by when, with the kind of numerical likelihood that weather forecasters now use to describe, say, a 70% chance of rain.
Tapio Schneider, a German-born climatologist at the California Institute of Technology and Jet Propulsion Laboratory in Pasadena, California, leads the effort.
“We don’t have good information for planning,” Dr. Schneider told a gathering of scientists in 2019. Models cannot tell New York City how high to build sea walls, or California how much to spend to protect its vast water infrastructure.
They simply vary too much. For example, in Paris in 2015, 196 countries agreed there would be alarming consequences if the planet warms by 2 degrees Celsius above preindustrial levels. But when will we get there? Among 29 leading climate models, the answer ranges from 20 to 40 more years under current levels of emissions – almost the difference of a human generation. That range is too wide to set timetables for action, which will require sweeping new infrastructure, from replacing fossil fuels and switching to electric vehicles to elevating homes.
“It’s important to come up with better predictions, and come up with them fast,” Dr. Schneider says.
Most climate modelers use past data when they create a model. But that means there is a fire hose of fresh climate measurements that go mostly unused – from satellites, balloons, ships, planes, weather stations, and thousands of sensors floating in the seas. Dr. Schneider wants to plug into that stream and force a new model to learn from it.
“The crux is to use more data, period,” he says. He and his colleagues have spent two years figuring out how to do it.
“The original idea was not to start over,” Dr. Schneider says. He talked with a friend at the Massachusetts Institute of Technology, Raffaele Ferrari, an Italian researcher who, befitting his name, has a penchant for automotive analogies. They realized, Dr. Ferrari says, “that you can take a race car and start replacing parts but pretty soon it becomes easier to build a new race car.”
The two men had been friends since they met one summer while graduate students in Woods Hole, Massachusetts. Dr. Ferrari pursued oceanography, and helped build a much-used model for the oceans called the MIT General Circulation Model.
Oceans and the land are intimate partners with the atmosphere, but they often are studied separately. Dr. Schneider and colleagues at Caltech study the air; Dr. Ferrari and MIT researchers study the sea. Both men realized the advantage of joining forces.
In 2017 and 2018, Dr. Schneider convened a series of workshops at Caltech, grandly called the Future of Earth Systems Modeling. “We just invited the best people in the world” to hash through the topic, he says.
Their consensus was that “the development of climate models was struggling; something was not working,” Dr. Ferrari says. “They were looking for new ideas.” Gradually, they concluded they should build a new model. They named it the Climate Modeling Alliance – the acronym CliMA is “climate” in Italian and Spanish.
It is “scary to start with a blank slate,” Dr. Schneider says.
But Dr. Ferrari notes, “By starting from scratch, you can clean up a lot of what has happened over time.”
Melanie Stetson Freeman/Staff
MIT researcher Raffaele Ferrari is working with a team to create a new climate model.
Building a disruptor
It also is audacious. The group mapped out a project that will take at least five years of work by teams at Caltech, MIT, NASA’s Jet Propulsion Laboratory, and other institutions – tens of thousands of hours of research. It will take money – at least $25 million – that typically would come from government grants, but that appeared unlikely early on because of the Trump administration’s disaffection with science.
And it threatens to ruffle feathers in the climate science world, especially at the established modeling centers, like Dr. Schmidt’s NASA group at Goddard. “I think they have oversold what they can do,” Dr. Schmidt says. Is a new model needed? “They would say yes. I would probably say no.”
There are three main U.S. government-funded climate centers: in New York City; Boulder, Colorado; and Princeton, New Jersey. Rather than compete with the established centers for federal financing, CliMA turned to private money. Soon, it won a pledge from former Google Chief Executive Officer Eric Schmidt and Wendy Schmidt, whose philanthropy for the environment ranges from oil cleanup competitions to deep-sea submersibles. They pledged most of the funds needed for the first three years, and with smaller grants, CliMA launched on Sept. 11, 2018.
John Marshall, who developed the oceans model at MIT, says getting funds from outside the government is “a hugely important part of the project.”
“I see the project as a disruptor, like an Uber project,” he says. “Any organization which has been going for a long time, it kind of ossifies.”
The other distinguishing feature, Dr. Marshall notes, is those working on it. “The model is actually less important than the team of scientists that you have around it,” he contends. In fact, the 60 to 70 researchers and programmers in the CliMA group represent a veritable United Nations.
Somebody put a map on the wall at the CliMA house, a converted provost’s home at Caltech, and asked everyone to pinpoint their homes. “There were a lot of needles,” Dr. Schneider says.
Meltwater from the Laohugou No. 12 glacier flows through the Qilian Mountains in China’s Gansu province. Glaciers in the rugged region are rapidly disappearing as a result of global warming.
A climate model that “learns”
CliMA decided on an innovative approach: harnessing machine learning. Satellite and sensor information is freely available – much of it gathered for weather forecasters. Dr. Schneider envisions “training” their model with the last three decades of data, and then routinely feeding it the latest updates. The model itself could “learn” from the data and calibrate its performance with formulas refined by AI, even as the climate changes.
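The “learning” Dr. Schneider describes can be illustrated in miniature. The sketch below – a toy example in Python with invented numbers, not CliMA’s actual code (which is written in Julia) – calibrates a single unknown parameter of a simple warming formula against synthetic “observations” by gradient descent, the basic mechanism behind letting a model tune itself to data.

```python
import math

def toy_model(sensitivity, co2_ratio):
    """Hypothetical toy model: warming grows with the log of the CO2 ratio."""
    return sensitivity * math.log2(co2_ratio)

# Synthetic "observations": pretend the true sensitivity is 3.0 C per doubling.
observations = [(r, 3.0 * math.log2(r)) for r in (1.2, 1.5, 2.0)]

# Calibrate: start from a wrong guess and repeatedly nudge the parameter
# in the direction that shrinks the squared error against the observations.
sensitivity = 1.0
learning_rate = 0.1
for _ in range(200):
    grad = sum(2 * (toy_model(sensitivity, r) - obs) * math.log2(r)
               for r, obs in observations)
    sensitivity -= learning_rate * grad

print(round(sensitivity, 2))  # converges toward the "true" value, 3.0
```

A real climate model calibrates hundreds of such parameters at once, against streams of satellite and sensor data rather than three made-up points, but the loop – predict, compare, adjust – is the same idea.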
Climate models work by dividing the globe into a grid. That allows computers to replicate conditions by calculating atmospheric formulas for each grid cell. These are not simple equations that focus only on the level of carbon dioxide; models now handle hundreds of factors that influence climate, from solar radiation, volcanic particles, dust, and ocean spray to savannas, agricultural fields, and sea ice.
They do this in grid cells typically about 15 to 30 miles on a side and a few miles deep. CliMA strives for much smaller cells, but modeling Earth’s whole atmosphere at that resolution would require supercomputers thousands of times faster than any that exist. Instead, CliMA will drill down on a sampling of smaller grids – some little more than 100 feet square and 15 feet deep – and use AI to extend what the model learns from those samples to the rest of the grid.
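A back-of-envelope calculation (my figures, not CliMA’s) shows why the sampling approach is necessary. Comparing the number of surface cells at a typical 25-mile resolution against a 100-foot resolution:

```python
# Rough arithmetic: how many surface grid cells does Earth need at a
# coarse (~25-mile) versus a fine (~100-foot) resolution?
EARTH_SURFACE_SQ_MILES = 197_000_000  # approximate
FEET_PER_MILE = 5280

coarse_cell_area = 25 * 25                      # square miles per cell
coarse_cells = EARTH_SURFACE_SQ_MILES / coarse_cell_area

fine_side_miles = 100 / FEET_PER_MILE           # 100-foot cell side, in miles
fine_cells = EARTH_SURFACE_SQ_MILES / fine_side_miles ** 2

print(f"coarse cells: {coarse_cells:,.0f}")     # roughly 315,000
print(f"fine cells:   {fine_cells:.2e}")
print(f"ratio:        {fine_cells / coarse_cells:,.0f}x")
```

Shrinking the cell side from 25 miles to 100 feet multiplies the cell count by more than a million before even counting vertical layers and shorter time steps – far beyond any supercomputer, which is why CliMA models only a sample of fine cells and lets AI generalize from them.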
By focusing on this level of detail, the CliMA group hopes to pick up influences on the climate that are often just roughly estimated. Chief on Dr. Schneider’s list is to gauge the influence of clouds. Low, flat, stratocumulus clouds gird huge swaths of the planet at any given time. But they are so wispy there is no good way of including them in models.
They “are really important for Earth’s climate. They cool Earth by about 8 degrees Celsius globally, simply by reflecting sunlight,” Dr. Schneider says. Existing climate models underestimate their effect – he calls it a “blind spot” – creating large uncertainties in the models. At extreme levels of greenhouse gases, Dr. Schneider says, stratocumulus clouds could disappear entirely, jolting temperatures of the Earth. It might be one reason there were crocodiles in the Arctic 50 million years ago, the most recent hot period in Earth’s history.
But the team had a problem. While computers have gotten faster and faster, the mechanics modelers use to program them are creaky. They have to tell a computer what to do, step by step, in a “language” the computer can decipher.
Since 1957, scientists have often used a programming language called Fortran. It is fast: Once written, it causes computers to act with great efficiency. That efficiency is vital when the number crunching is as big and complex as it is in a climate model, carrying out trillions of calculations per second.
But Fortran is clunky and laborious to write. It must be further modified for today’s supercomputers. For younger programmers, it is kind of like ancient Latin. “If you tell undergrads you want help writing Fortran, nobody wants to get involved,” says Dr. Ferrari. “They think it’s the end of their career.”
Newer languages – there are dozens, with names like Python, C, and C++ – are easier to write, but they take more time to process in the computer. For the CliMA modelers, that was a dilemma.
The man with the answer occupied an office on the seventh floor of MIT’s “CSAIL” building – the Computer Science and Artificial Intelligence labs. Alan Edelman is known for bringing his pet corgi to classes and then dispatching students to find it when it wanders off. “He hasn’t figured out that the dog doesn’t sit there,” chuckles a colleague.
Dr. Edelman also is an award-winning mathematician who pondered the computer language conundrum, and developed a new language in 2009. He called it Julia, and it bridged the language gap, he says: as fast as Fortran, and easier to use than Python.
In September 2018, he got a “Dear Professor Edelman” email from Dr. Ferrari, and six hours later three fellow MIT professors who had been at the Caltech climate modeling sessions were perched on the narrow couch in his office.
“They said they thought they wanted to use Julia,” Dr. Edelman recalls. He immediately saw Julia and CliMA as a perfect match. “I was tickled pink, really.”
But Julia had relatively few users, and in California, Dr. Schneider was worried it might flop. “Everyone was excited about Julia – so much so that I was very nervous, because it felt like too much groupthink,” he recalls. “What we’re trying to do has its own risk. Do we really want to load the risk of a new language on top of it?”
But the simplicity of using Julia was a game-changer. When the CliMA group began to cautiously use the new language, Dr. Edelman suddenly realized that other scientists and younger graduate students were poking their heads into his lab to learn about this new whiz-fast programming tool. People from different disciplines were interacting. “I didn’t see this coming,” he says.
The group at CliMA was quickly convinced. “There was no way we could have done it with another language,” Dr. Ferrari says. “After three or four months, we realized there was no way we could go back.”
“Julia paid off for us better than they would have imagined,” Dr. Schneider admits.
With Julia, the team released CliMA 0.1, part of the first version of the model, in June. Dr. Schneider says their work is ahead of schedule, and he is encouraged.
The release is a step toward providing climate information that will be useful at a local or regional level, helping predict the frequency of droughts, extreme rainfall, heat waves, and major storms. Dr. Schneider even envisions a cellphone app that could give information to anyone contemplating, say, the purchase of a house or planning future crops for a farm.
“You need granular information on a local level,” Dr. Schneider says. “The challenge, in the climate area, is how to make information actionable. There’s a large gap between what we scientists communicate and what people can actually use.”
To bridge that, they are looking to weather forecasters. “When they tell you that tomorrow it might rain, you don’t know exactly what to do. You want to know whether the probability is 10% or 100%,” Dr. Ferrari says. “If it’s 10%, you’re going to get an umbrella; if it’s 100% you might not go for a hike. So knowing that ... is crucial.”
Ultimately, he and Dr. Schneider say, they expect to achieve that.
“We’ll see at the end,” Dr. Ferrari says. “It’s always a mistake to say that you shouldn’t try something new. Because that’s how you change the world.”