
The city of Delhi, India, is plagued by fog, air pollution, the burning of agricultural fields, very hot summers, and very cold winters. Together, these factors contribute to low visibility and have become a prevalent problem for Delhi’s Indira Gandhi International Airport, whose flights are often delayed due to fog.
However, the Gordon Group — led by Dr. Hamish Gordon, assistant professor of chemical engineering at Carnegie Mellon — in collaboration with the U.K. Met Office and the National Centre for Medium Range Weather Forecasting in Delhi, has worked to counteract these issues by creating a numerical weather model, dubbed the Delhi Model with Chemistry and aerosol framework (DM-Chem), to predict when fog will be particularly heavy in the area. This model can warn the airport ahead of time when flights are likely to be delayed so that reroutes can be prepared if needed.
The Tartan spoke to Dr. Gordon to learn more about the numerical weather model used, how the model improved weather forecasting predictions, and other projects the group has been working on.
Two of the most significant changes made in creating the DM-Chem model were implementing GLOMAP-mode, a simplified version of the GLobal Model of Aerosol Processes that simulates the size distribution of aerosol particles and the microphysical processes that control it within a two-moment framework, and improving the aerosol activation scheme based on an article on updraft velocity by Hayder Abdul-Razzak and Steven J. Ghan.
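The “two-moment” idea can be sketched briefly: each aerosol mode is tracked by just two quantities, number and mass, and the typical particle size is recovered from them. The following is a minimal illustration assuming a lognormal mode; the function, the example values, and the chosen particle density are illustrative, not GLOMAP-mode’s actual code:

```python
import math

def median_diameter(number_conc, mass_conc, density, sigma_g):
    """Recover the geometric-median diameter (m) of a lognormal aerosol mode
    from its two prognostic moments: number (m^-3) and mass (kg m^-3).
    The mean particle volume of a lognormal mode is
    (pi/6) * Dg**3 * exp(4.5 * ln(sigma_g)**2)."""
    mean_volume = mass_conc / (number_conc * density)
    return (6.0 * mean_volume
            / (math.pi * math.exp(4.5 * math.log(sigma_g) ** 2))) ** (1.0 / 3.0)

# Illustrative mode: 1e9 particles per cubic meter carrying 2e-9 kg/m^3 of
# material with density 1800 kg/m^3 and geometric spread 1.6
dg = median_diameter(1e9, 2e-9, 1800.0, 1.6)
```

Because only the two moments are carried between timesteps, the model stays cheap while still letting every process see a full particle size distribution.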
Although air pollution contributes to low visibility, researchers at the National Centre for Medium Range Weather Forecasting in Delhi did not take it into account in their national weather model, which is built on the U.K. Met Office’s Unified Model, a numerical model for weather and climate forecasting. The researchers altered the Unified Model so that it simulates winds and weather at high spatial grid resolution — an important factor to take into consideration — and dubbed their version the ‘Delhi Model.’
The Unified Model takes in data at several stages of the modeling process. When it is initialized, it needs to know the current state of the atmosphere, so it takes in factors such as temperature, pressure, wind, and humidity. It also needs boundary conditions — the weather at the edges of the region it is simulating. The user additionally has to supply surface and land data, such as soil moisture, urban structure, and vegetation type. The model takes in aerosol and emissions data where available, and it needs real-world observations against which its output can be compared so that the model can be fine-tuned.
What was particularly difficult in creating DM-Chem is the lack of quality datasets to implement in the Unified Model. “We don’t measure all the emissions anywhere, let alone in Delhi, where it’s particularly complicated, but we get an idea of the number of kilograms per square meter of particles coming into the atmosphere,” Gordon said.
Consequently, the Gordon Group used GLOMAP-mode, which has led to an improved simulation of the number concentration of particles that interact with clouds.
“Gas molecules condense onto the particles, the particles coagulate, they stick together, the particles can activate to make cloud or fog droplets and go into water droplets, and then the particles can be removed, either by sedimentation, settling, dry deposition, or by rain,” Gordon said. GLOMAP-mode handles these processes, and at the same time, the Unified Model is moving the particles around according to the wind, so the models communicate with each other frequently.
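The sequence of processes Gordon describes can be pictured as a loop in which each process updates the particle population in turn during every model timestep. In this toy sketch the aerosol state is just a number concentration, and every rate constant is an invented placeholder rather than a GLOMAP-mode value:

```python
import math

def condense(n, dt):
    # Condensation grows the particles but does not change their number
    return n

def coagulate(n, dt):
    # Coagulation merges particles, so number falls faster when n is high
    # (1e-4 is a made-up rate coefficient for illustration)
    return n / (1.0 + 1e-4 * dt * n / 1e9)

def activate(n, dt):
    # A small share of particles activates into fog droplets each hour
    return n * (1.0 - 0.01 * dt / 3600.0)

def deposit(n, dt):
    # Dry deposition and rainout remove particles on a ~1-day timescale
    return n * math.exp(-dt / 8.64e4)

def step(n, dt=600.0):
    """Advance the particle number concentration by one timestep,
    applying each process in sequence (operator splitting)."""
    for process in (condense, coagulate, activate, deposit):
        n = process(n, dt)
    return n
```

In the real system, the Unified Model’s transport by wind is interleaved with these microphysics updates, which is why the two codes have to exchange information so often.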
One dataset that provided noteworthy contributions to the study was that of the fog properties recorded during a foggy period in Paris back in the early 2010s, courtesy of the Paris Fog Field Campaign. “They had balloons going up, measuring the temperature at higher altitudes, lots of detailed measurements of the fog,” Gordon said. “Paris in 2011 had measurements of the properties of the fog, the number concentration, size distribution of the droplets, visibility, and the surface temperature.” The output of the numerical model could thus be tested against the dataset from Paris.
Updraft velocity is a result of atmospheric turbulence. “Essentially, you’ve got waves in the atmosphere, and then little waves build on bigger waves, and little waves build on littler waves until it becomes pretty chaotic,” Gordon explained.
This chaotic motion in the vertical direction moves air from high temperatures to low temperatures, and as water vapor in the air moves to lower temperatures, it tends to condense into droplets.
“As you cool water down, it condenses so that the cloud formation process is sensitive to this turbulence, and as you change the grid size of the model, the turbulence doesn’t always behave nicely,” Gordon continued. “So the updraft speed in the model should really stay the same no matter what the grid size is, but, unfortunately, that doesn’t happen.” He said that this is a continuous headache his group has to deal with at the moment.
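The link between cooling and condensation comes down to saturation vapor pressure falling steeply with temperature, so air that was just saturated becomes supersaturated when turbulence cools it. A short illustration using the standard Tetens approximation (the two temperatures chosen here are arbitrary examples):

```python
import math

def saturation_vapor_pressure_kpa(t_celsius):
    """Tetens approximation for saturation vapor pressure over water (kPa)."""
    return 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

# Air holding vapor at the 20 C saturation limit becomes supersaturated
# if an updraft or eddy cools it to 10 C, and the excess condenses.
e20 = saturation_vapor_pressure_kpa(20.0)
e10 = saturation_vapor_pressure_kpa(10.0)
```

A 10-degree drop roughly halves how much vapor the air can hold, which is why the cloud and fog formation process is so sensitive to simulated turbulence.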
The Delhi Model also assumed a standard value for the concentration of aerosols and air pollution and did not simulate how these particles evolved over time. The model also did not accurately represent the size a particle needed to be in order to become a cloud droplet.
“Only relatively big aerosols, particles of about 100 nanometers and higher in diameter, can usually activate to become cloud droplets,” Gordon added. The DM-Chem model offers a more sophisticated description of which particles become cloud droplets and how.
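To get a rough feel for that threshold: if a mode’s particle sizes follow a lognormal distribution, the fraction of particles above a cutoff diameter can be written with the complementary error function. This crude stand-in ignores everything the real Abdul-Razzak and Ghan scheme accounts for (updraft speed, particle chemistry), and the mode parameters below are made up:

```python
import math

def fraction_activated(d_median, sigma_g, d_cut=100e-9):
    """Fraction of a lognormal aerosol mode with diameter >= d_cut (here a
    fixed 100 nm cutoff standing in for the activation threshold)."""
    z = math.log(d_cut / d_median) / (math.sqrt(2.0) * math.log(sigma_g))
    return 0.5 * math.erfc(z)

# A mode centred at 80 nm with geometric spread 1.8: only its large tail
# sits above the 100 nm cutoff, so only part of the mode can activate.
f = fraction_activated(80e-9, 1.8)
```

Shifting the median up to 200 nm pushes most of the mode past the cutoff, which is why the particle size distribution matters so much for droplet numbers.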
While the team in Delhi worked on improving emissions datasets, the Gordon Group worked on how the model represents visibility in fog. Another challenging component in improving the numerical weather model was getting the simulation of fog visibility to take the emissions data into account.
Of course, the Gordon Group had improved the model to work at grid resolutions of 500 and 330 meters (1,640.4 and 1,082.7 feet, respectively, for all you Americans), which raised the question of how the model could be optimized to run at those resolutions.
The model essentially splits the Earth into boxes: the smaller the boxes, the more of them the model has to work through, and the quicker the wind — a constantly changing and moving parameter — crosses from one box to the next, forcing the model to take shorter time steps. “The time taken to run the model goes as the size of the boxes to the power of four. So that’s hard, but the solution to that is just to run the model for over periods that are less long, or to use smaller regions,” Gordon said. But these simulations can still take a bit of time to run and require the technological prowess of the Pittsburgh Supercomputing Center.
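Gordon’s power-of-four rule of thumb makes the trade-off concrete. Taking a hypothetical one-kilometer grid as the reference cost:

```python
def relative_cost(box_size_m, reference_m=1000.0):
    """Relative compute cost if runtime scales as (reference / box size)**4,
    per Gordon's rule of thumb: shrinking the boxes multiplies both the box
    count and the number of timesteps the model must take."""
    return (reference_m / box_size_m) ** 4

# Refining from hypothetical 1 km boxes to the article's grids:
cost_500 = relative_cost(500.0)  # 16x the 1 km cost
cost_330 = relative_cost(330.0)  # roughly 84x the 1 km cost
```

Halving the box size alone already multiplies the runtime sixteen-fold, which is why the group restricts its runs to smaller regions and shorter periods.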
“We just use the region of interest close to the measurements. We have a modeling system where it’s quite easy to change the location of the center of the region that we’re simulating,” Gordon said.
Students in the Gordon Group are working on testing the DM-Chem model more thoroughly by looking at other foggy regions of the world, which they can observe via satellite. “Other students in my group are simulating regions over the Atlantic Ocean where we’re trying to look at storms and stuff like this with the same model,” Gordon said.