Postdoc Spotlight: Nate Sime uses math to study planetary interiors
Nate Sime is a computational physicist who joined Carnegie as a Postdoctoral Fellow in 2018 after receiving his Ph.D. in Mathematics from the University of Nottingham and working for a time as a Research Associate at the University of Cambridge. He builds scalable mathematical models and tools that help shine a light on the complex physical phenomena that built our planet.
In this postdoc spotlight, Sime explains how he uses computational modeling to study planetary physics and how his FEniCS project will ensure this work is scalable into the future. Plus, he shares how mathematics can help you bake the perfect cinnamon roll.
What is your area of research and why would you say it’s important?
The short answer: Quite simply, day-to-day, I compute scientific problems.
The more elaborate answer: I contemplate physical phenomena and try to come up with ways of simplifying their vast complexity. This simplification comes with the caveat that the most important physical information must not be lost, whilst still enabling us to simulate the phenomena tractably, in a reasonable time. We use mathematics to describe these models, which are typically formed from a number of conservation laws. In essence, these laws state: what comes in, must go out.
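To make "what comes in, must go out" concrete, here is a minimal sketch (plain Python, not taken from Sime's work; the function name and diffusion coefficient are illustrative assumptions) of a finite-volume diffusion update on a periodic one-dimensional domain. Because whatever crosses a cell face leaves one cell and enters its neighbour, the total amount of material is conserved exactly at every step.

```python
# Toy 1D diffusion on a periodic row of cells, updated with finite volumes.
# Every transfer across a face is subtracted from one cell and added to its
# neighbour, so the total amount of material never changes.

def diffuse(u, d=0.25, steps=100):
    """Apply `steps` explicit diffusion updates with coefficient d (d <= 0.5)."""
    n = len(u)
    for _ in range(n and steps):
        # Transfer across the face between cell i and cell i+1 (periodic).
        flux = [d * (u[(i + 1) % n] - u[i]) for i in range(n)]
        # Cell i gains what crosses its right face and gives up what
        # crosses its left face; the sum over all cells telescopes to zero.
        u = [u[i] + flux[i] - flux[i - 1] for i in range(n)]
    return u

if __name__ == "__main__":
    u0 = [0.0] * 10
    u0[3] = 1.0                      # a localized blob of material
    u1 = diffuse(u0)
    print(sum(u0), sum(u1))          # the totals agree: conservation
```

The blob spreads out (its peak drops), but the column total is unchanged to rounding error, which is the discrete analogue of the conservation laws described above.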
If we can find solutions to these equations, we will be equipped with tools enabling us to predict the evolution of these physical phenomena. We may also learn more about what goes on in regions in which we cannot take direct measurements. For example, we can take measurements of the Earth's composition at the surface, but we (as yet) cannot climb down a ladder to the Earth's core and check out the scenery.
Mathematicians and scientists have found solutions to these equations in idealized conditions, e.g. in spheres composed of a single material (see https://en.wikipedia.org/wiki/Spherical_cow). But the key problem is solving these equations in non-trivial geometries with material properties obtained from experiments.
For example: although the Earth can be approximated by a sphere, what if one wished to solve these equations in the geometry of Iceland, Mount Rainier, or the Mariana Trench?
Just as we approximated physical phenomena with models, we approximate the solution of their underlying equations. A popular method (which I use every day) is the finite element method.
This allows us to cut the computational geometry up into a number of polygonal shapes, which we call a mesh. We then simultaneously solve small equations which approximate our original model equations in each of these polygons. In fact, we can show that if our mesh were infinitely fine, i.e. composed of an infinite number of infinitely small polygons, we would recover the exact solution.
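The one-dimensional case can be sketched by hand (plain Python, not FEniCS itself; the helper names here are my own, and this is only an illustration of the mesh-refinement idea). We solve -u'' = 1 on (0,1) with u(0) = u(1) = 0 using linear elements on a uniform mesh, then measure the error at element midpoints against the exact solution u(x) = x(1-x)/2; refining the mesh shrinks the error, as the paragraph above describes.

```python
# 1D finite element method for -u'' = 1 on (0,1), u(0) = u(1) = 0.
# Linear "hat" elements on a uniform mesh give a tridiagonal system,
# solved here with the Thomas algorithm.

def solve_tridiag(sub, diag, sup, rhs):
    """Solve a tridiagonal system (sub[0] and sup[-1] are unused)."""
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def fem_poisson(n):
    """Nodal values of the FEM solution on a mesh of n equal elements."""
    h = 1.0 / n
    m = n - 1                       # number of interior nodes
    sub = [-1.0 / h] * m            # element stiffness assembles to
    diag = [2.0 / h] * m            # (1/h) * tridiag(-1, 2, -1)
    sup = [-1.0 / h] * m
    rhs = [h] * m                   # load vector for f = 1
    return [0.0] + solve_tridiag(sub, diag, sup, rhs) + [0.0]

def midpoint_error(n):
    """Max error vs u(x) = x(1-x)/2, sampled at element midpoints."""
    u, h, err = fem_poisson(n), 1.0 / n, 0.0
    for i in range(n):
        xm = (i + 0.5) * h
        approx = 0.5 * (u[i] + u[i + 1])   # linear interpolant at midpoint
        err = max(err, abs(approx - xm * (1.0 - xm) / 2.0))
    return err

if __name__ == "__main__":
    for n in (8, 16, 32):
        print(n, midpoint_error(n))  # error drops ~4x per refinement
```

Halving the mesh spacing cuts this error by a factor of four (it scales like h squared), which is the quantitative version of "an infinitely fine mesh recovers the exact solution".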
What’s the biggest challenge about this type of computational modeling?
If I’m being honest, I spend excessive lengths of time trying to figure out why physical things do what they do by modeling them with thought experiments. I use computers to help probe these ideas.
It can be a tedious process—imagine assembling a jigsaw puzzle with no reference picture to work with, where most of the pieces are misshapen or simply missing, and you have no idea whether the finished puzzle will be the size of a postage stamp or a field. But once you have a small section assembled, you stand back to see if you can get a sense of what you're doing. Even if you can only make sense of a small part of the big picture, it's so satisfying to see where the puzzle is going.
What are the larger implications of your work?
Every tool created since the dawn of humanity was fashioned with a less precise precursor. We started by hammering with rocks, and now we have spherical gyroscopes crafted to within 40 atoms of precision. The same is true of computational modeling: as the models become more complicated, so do their underlying equations and the tools required to solve them.
We rely on technological developments to equip us with ever more powerful computers. But if we're not careful, the cost of solving larger computational models will grow faster than the hardware can keep up. By this I mean that if we double the size of a computational problem, we cannot simply double the "size" of the computer on which we solve it. We therefore must augment our numerical toolbox with scalable means of approximating solutions, so that the relationship between computational problem size and computer size can remain linear.
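A back-of-the-envelope illustration of that scaling argument (the cost exponents here are assumptions chosen for the example, not figures from the interview): a dense direct solver's work grows roughly like n cubed, so doubling the problem multiplies the work by eight, while a method whose cost grows linearly in n only doubles it, keeping the problem-size-to-computer-size relationship linear.

```python
# Toy cost models, assumed for illustration only: a dense direct solve
# costs ~ n**3 operations; a scalable method (e.g. multigrid) ~ c * n.

def direct_cost(n):
    return n ** 3

def scalable_cost(n, c=100):
    return c * n

if __name__ == "__main__":
    n = 1_000_000
    # Work multiplier when the problem size doubles:
    print("direct:  ", direct_cost(2 * n) / direct_cost(n))      # 8.0
    print("scalable:", scalable_cost(2 * n) / scalable_cost(n))  # 2.0
```

With the cubic solver, a doubled problem demands an eight-times-larger computer; with the linear one, a doubled computer suffices.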
I'm part of a team of people who work on one of these tools—the FEniCS Project. This software project facilitates the computational approximation of physical phenomena by the finite element method. We are forever improving the components of the FEniCS Project so that we can compute the most recent and complicated models. Using FEniCS, we can scale from any old laptop up to the top supercomputer du jour.
This sounds very useful for geodynamicists. Who can use the FEniCS Project?
The FEniCS project is completely open.
You can go to fenicsproject.org and download the source code and run its various components completely for free. We also encourage anyone and everyone to take part (fenicsproject.org/people-of-fenics).
The FEniCS Project isn't limited to problems in geodynamics either. For example, in my time at Carnegie, I've collaborated with Zack Geballe to model the laser heating of pyrolite in a diamond anvil cell to help us understand the rate at which heat flows through the Earth's mantle. People around the world use FEniCS to help understand problems in medicine, biology, engineering, mathematics, and physics. We even hosted the FEniCS community on the EPL campus in 2019 for our annual FEniCS conference. Our next in-person conference will be at UC San Diego in 2022.
What is a project you’ve been working on recently?
Although I apply these computational tools to help us understand models of geodynamic processes, it's also important to verify that what we are doing is correct and physically valid. This has shaped my two most recent publications:
- Burying Earth’s Primitive Mantle in the Slab Graveyard
Due to aggressive mixing, the composition of the Earth should have become entirely homogeneous over its lifetime to the present day. However, we observe clear chemical signals indicating that some "primitive" portion of the mantle has survived this mixing process. Using geodynamical simulations, I worked with Tim Jones and Peter van Keken to hypothesize that some portion of the Earth's primitive mantle is buried within a "slab graveyard" at the core-mantle boundary.
This work was made possible by developing LEoPart, an add-on package for the FEniCS Project that tracks the movement of chemical composition data. As part of this numerical implementation, we found that the incompressible approximation of the mantle must be resolved very precisely.
- An exactly mass conserving and pointwise divergence free velocity method: application to compositional buoyancy driven flow problems in geodynamics
Following on from the previous work, where it was necessary for us to precisely approximate an incompressible mantle model, my colleagues Jakob Maljaars (TU Delft), Cian Wilson, and Peter van Keken and I devised and demonstrated a numerical method that satisfies this criterion exactly. We further highlight the importance of numerical methods that exactly conserve mass in order to avoid insidious, spurious results.
What inspired you to choose this field of study?
Simply, I enjoy making pretty pictures. There's a field called "computational fluid dynamics", which is sometimes (affectionately) referred to as "colorful fluid dynamics". This goes back to a quote from George Box: "All models are wrong, but some are useful." Meaning that if we're not careful, we are just creating colorful pretty pictures which have no physical significance. I'll be honest, I was initially drawn to those colorful pretty pictures.
Although I take much pride in creating these pictures, it's absolutely essential that the underlying physics is valid and meaningful. The picture I produced for my most recent publication (with Tim Jones and Peter van Keken) made the cover of the journal: https://agupubs.onlinelibrary.wiley.com/doi/epdf/10.1002/ggge.22218.
Here you can see a time snapshot of the temperature and material age of the Earth's mantle. Highlighted are the "slab graveyards" along the Earth's core I mentioned earlier.
How has your background influenced your research?
My parents encouraged me to use the home computer from an early age. I was allowed to play around with it whenever I wanted and was told to tinker. I remember faffing around with QBasic, an early "simple" programming language, making small games and drawing shapes. By the end of high school, this had evolved to the point where I wrote a simple program to simulate objects interacting with each other under the effect of gravity.
After watching those make-believe planets orbit and crash into each other, I was hooked.
Why did you choose the Earth and Planets Laboratory?
So, this answer probably won't be typical of other postdocs at Carnegie.
Coming from the field of applied mathematics and industrial engineering, I honestly hadn't heard of the Carnegie Institution for Science. After talking to colleagues in the Earth Sciences department at Cambridge, it became crystal clear that it would be a great fit for what I wanted to do. Almost instantly I wanted to work with the scientists here, who are at the top of their fields. I felt like I had a lot to contribute by bringing what I've learned from mathematics and computational modeling to the geophysics community.
What's next for you? Any projects coming up?
I'm working with Peter van Keken, Lara Wagner, and Cian Wilson to develop a toolbox for modeling subduction zones.
There's a lot to cover. We want to give the community the means to input their seismic data and obtain temperature and stress profiles with ease, whilst also giving those with experienced backgrounds in numerical methods and supercomputing the ability to run enormous computational simulations. Meanwhile, we have to devise new numerical methods to capture the physical phenomena in the models we are developing.
Do you have any advice for current graduate students?
If it's difficult, good. Otherwise, everyone else would be doing it! Of course, this comes with a caveat. Make sure you ultimately enjoy what you're doing.
When you’re not modeling the interior dynamics of our planet, what do you do for fun?
Does it count that I used pyadjoint combined with FEniCS to find the heat distribution required to bake the perfect cinnamon roll? My secret is to caramelize the sugar without overbaking the dough.
To learn more about Nate Sime and his work, visit his website at https://nate-sime.github.io/. If you're interested in an open-source (LGPLv3) computing platform for solving partial differential equations (PDEs), visit fenicsproject.org and read testimonials from the scientists who have used the FEniCS software.