Casey Dembowski
July 28, 2025
CAMx at 30: A look back on the creation and impact of the first publicly available photochemical air quality model
As the 30-year anniversary of CAMx’s release approaches, we talked with CAMx creators Ralph Morris, Gary Wilson, Greg Yarwood, and Chris Emery to discuss some of the key moments in the model’s history and how they’ve kept the model at the forefront of the air quality consulting industry for three decades.
The origins of photochemical modeling are rooted in the Los Angeles “smog” problem, which first garnered serious public health concern in the 1940s amid a massive surge in automobiles and urban sprawl. In the 1950s, scientists (notably Arie Haagen-Smit) discovered that ozone is a key component of smog, forming photochemically in the atmosphere from emissions of nitrogen oxides (NOx) and volatile organic compounds (VOCs).
In the 1980s, Ralph Morris, currently a principal at Ramboll, used one of the first-generation photochemical grid models (PGMs), the Urban Airshed Model (UAM), to study emission control strategies to reduce ozone in Los Angeles. In the late 1980s, Ralph demonstrated to USEPA how the UAM could be used for ozone air quality planning in five other cities across the country, and in 1990 USEPA recommended the UAM in new air quality modeling guidelines for ozone planning. A few years later, Ralph and his colleagues brought their diverse specialties – meteorology, chemistry, air quality, and mathematics – together to develop the Comprehensive Air quality Model with extensions (CAMx). They also introduced ozone source apportionment (OSAT), a powerful new tool for understanding how emissions from many sources and regions interact to cause regional air quality issues.
CAMx simulates the formation, transport, and fate of ozone, particulates, and toxics over scales ranging from neighborhoods to continents and from hours to years. CAMx operates much like a weather prediction model by simulating the evolution of air pollution analogously to the evolution of clouds and rain. It does this by tracking hundreds of compounds that interact through hundreds of non-linear chemical reactions.
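To make that idea concrete, the sketch below is a toy “box model,” not CAMx code and vastly simpler than the chemical mechanisms CAMx actually solves: it tracks just three species coupled through two reactions of the classic NOx and ozone photochemical cycle, and the product term NO × O3 is what makes the system non-linear. The names (J_NO2, K_NO_O3) and rate values are placeholders chosen purely for illustration.

```python
# Illustrative sketch only: a tiny photochemical "box model" tracking three
# species through two coupled, non-linear reactions -- in the spirit of, but
# far simpler than, the chemistry a grid model like CAMx solves in every cell.
import numpy as np
from scipy.integrate import solve_ivp

J_NO2 = 0.5     # NO2 photolysis frequency (1/min), placeholder daytime value
K_NO_O3 = 20.0  # NO + O3 rate constant (1/(ppm*min)), placeholder value

def chemistry(t, y):
    no2, no, o3 = y
    photolysis = J_NO2 * no2        # NO2 + hv -> NO + O3 (net)
    titration = K_NO_O3 * no * o3   # NO + O3 -> NO2 + O2 (the non-linear term)
    return [titration - photolysis,  # d[NO2]/dt
            photolysis - titration,  # d[NO]/dt
            photolysis - titration]  # d[O3]/dt

# Initial mixing ratios in ppm: fresh NOx, little ozone.
y0 = [0.08, 0.02, 0.01]
sol = solve_ivp(chemistry, (0.0, 60.0), y0, t_eval=np.linspace(0, 60, 7))

for t, (no2, no, o3) in zip(sol.t, sol.y.T):
    print(f"t={t:5.1f} min  NO2={no2:.4f}  NO={no:.4f}  O3={o3:.4f} ppm")
```

A full CAMx simulation couples hundreds of species and reactions like these with emissions, transport, and deposition in every grid cell of the modeling domain.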
For decades, CAMx has served the needs of governments, industry, and academia around the globe and remains an essential tool for achieving clean air to protect human and ecosystem health.
Upon its release in 1997, CAMx became the first publicly available photochemical air quality model for use by researchers and regulators. It was followed within a few years by USEPA’s own Community Multiscale Air Quality (CMAQ) model, but in that time CAMx had swiftly reached a global audience and even found its way onto personal computers.
We asked the team to walk us through what made the model unique at its release and how it was so swiftly adopted and evolved in those early years.
Ralph Morris: A unique aspect of the model, beyond being freely available, was that we had a GNU license, which meant that people could download it and use it, but they'd have to make any modified version of the model available for free.
Gary Wilson: Thinking back, we had a website in 1997 [a novel concept at the dawn of the internet]. The model was out there in the community because people could go to the website and see what it was about. We got a lot of interest from academia on it – graduate students had a tool that they could use to do their studies. And that access and interest was a big part of getting our user community built right from the start.
I also think about CAMx in conjunction with the development and general use of Linux [a free, open-source operating system emulating Unix]. When we first started, we had expensive Unix workstations, but with the advent of Linux, you could put our model on a PC running Linux.
Greg Yarwood: There's an interesting story in how we started using PCs to run this model. I was asked to go to Beijing to do CAMx training at Peking University in 1999. As Gary says, at that time we were running this model on $30,000 workstations, and at Peking University they were trying to do it on $1,000 PCs and they succeeded. I came back from that trip – Gary, I don't know if you remember this – and I’m like—
Gary Wilson: Have you ever heard of Linux?
Greg Yarwood: It was just such a big realization that PCs were as powerful as these workstations.
Chris Emery: Not long after that, in the early 2000s, Gary and Greg trekked off to South Africa to do some CAMx training with a research group, and since then we've been doing CAMx work and training in Europe. I ended up going to Colombia around 2010 to train a university group in Medellin.
Gary Wilson: We've done some in Mexico City.
Chris Emery: It’s being used in the Middle East, India, and Southeast Asia. It’s entirely global, and that's rewarding because I'm not sure we were thinking about that when we were developing the model. We certainly were thinking in terms of US applications, particularly to support government work and important research done by industry and academia. But the way it took off globally was a bit of a surprise, and it opened up a lot of interesting opportunities that I wouldn't have imagined back in 1997.
With the model in use, a user community in place, and technology evolving, CAMx and its creators faced two competing challenges – how to make the model faster and how to keep it up to date with the state of the science.
Chris Emery: Even with Linux, the model was running on a single CPU, and as the model got bigger and bigger, it got slower and slower.
Through collaboration with the University of Texas, we implemented the ability to run CAMx in parallel across multiple computer chips. Later, we implemented parallelization that spreads the load across a high-performance computing cluster. These gave us huge speed gains, and with that speed came more development, more capabilities, and bigger grids. We even developed a hemispheric version of the model that can run the entire northern hemisphere. We're still working on making it more efficient.
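As a loose illustration of that idea, and not CAMx's own parallel implementation, the Python sketch below splits a 2-D concentration field into slabs of rows, lets each worker advance its slab, and then reassembles the field. This is the basic domain-decomposition pattern that lets a grid model spread work across many cores or cluster nodes. All names, grid sizes, and the "work" done per slab are made up for the example.

```python
# Illustrative sketch only: domain decomposition in plain Python.
# Split the horizontal grid into slabs, advance each slab in a separate
# worker process, then stitch the field back together.
import numpy as np
from multiprocessing import Pool

NX, NY = 400, 300   # illustrative grid dimensions (columns x rows)

def advance_slab(slab):
    """Stand-in for one time step of chemistry/transport on a slab of rows."""
    # Placeholder work: a simple decay applied cell by cell.
    return slab * 0.99

def step_parallel(conc, n_workers=4):
    """Advance the whole 2-D concentration field by splitting it into slabs."""
    slabs = np.array_split(conc, n_workers, axis=0)  # split along rows
    with Pool(n_workers) as pool:
        updated = pool.map(advance_slab, slabs)
    return np.vstack(updated)

if __name__ == "__main__":
    concentrations = np.random.rand(NY, NX)          # e.g. an ozone field in ppm
    concentrations = step_parallel(concentrations)
    print(concentrations.shape)                      # (300, 400), reassembled
```

In a real model the slabs must also exchange boundary values with their neighbors each step so pollutants can be transported across slab edges; that communication is what shared-memory and cluster parallelization handle at scale.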
Greg Yarwood: There's always more science. There's always new science. There are always new things that you can add. It takes a lot of discipline to ask, “Do we really need that?” “Are we really confident that this new science is correct before it’s brought in?”
Every time you add something, it comes with a cost. And we talked about a focus on efficiency because that's what makes CAMx usable. There's a definite trade-off, and I think Ralph has done us a great service over the years by stressing the model to its very limits, doing giant applications that use all of CAMx’s capability and then some. That forces you to think: if we add more science and slow the model down, those kinds of applications are not going to be possible. So it comes back to, “Do we really need this?”
Gary Wilson: There’s also the more practical side: we constantly have to answer a new question, because we'll be given a project and the client wants to know, “What happens if we do this?” And we've never done it before. That's where Ralph's creativity comes in. I don't know how many times he's come to me and said, “Hey, can we do this?”
All this leads to lots of flexibility in the model and a lot of other groups have taken advantage of that kind of flexibility and come up with their own unique ways of applying the model.
Chris Emery: The biggest and fastest-moving science in particulate matter is secondary organic aerosols [SOA]. These are particles that typically form when emitted gases oxidize into compounds that can condense into an organic aerosol.
The science has been evolving rapidly over the last decade or more, and we're constantly finding something new about this complicated chemistry. It potentially involves thousands of emitted precursor chemical species and perhaps hundreds of different types of SOA particles. We're trying to take all that science and update the model with an approach that makes sense and remains flexible and efficient. That's a challenge for any model developer.
As we closed our conversation with CAMx’s creators, we asked Ramboll experts Jeremiah Johnson, John Grant, and Tejas Shah to discuss the future of the model.
Jeremiah Johnson: It will be important to run the model and get results more quickly. That's an area where AI can help, particularly given the complexity and computing time involved in characterizing the chemical reactions happening in the atmosphere. Ease of use is going to be key – making CAMx easier to set up and get a working simulation. Another priority is enhancing CAMx's cloud compatibility.
Tejas Shah: There is a lot of research happening in the field of atmospheric chemistry, and we like to stay on top of things and incorporate new science. That’s part of why CAMx is widely accepted: it always has the state of the science implemented in the model.
John Grant: I’d add that the model is being pressed to be more and more accurate about what's happening in the atmosphere: the suite of emissions, the air chemistry, and then the air quality results. There's a lot of work to better understand consumer products and volatile chemical products, things like hair sprays, that we wouldn't even have thought of 10 to 20 years ago when considering what's important in air quality. But now those sources are becoming more important because there's been so much control of other sources. We have projects to make the model better at handling them, so that we can better understand the air quality environment under the new paradigm of low emissions and very stringent standards that need to be achieved.
Want to know more?
Christopher A. Emery
Senior Managing Consultant
+1 415-899-0740
Felicia Chou
Public Relations and Media Outreach Manager
+1 703-516-2313