The new Mac M1 processors are pretty cool, and are enough to convert this veteran Linux user to Mac. Since the M1, M2, etc. chips are all ARM-based, there are some differences to be aware of when installing your Python environment. Some instructions I have seen recommend installing the x86 (Intel) packages, but there is a performance penalty when x86 programs are translated by Rosetta 2 to run on the M1 Macs. Instead, these instructions will get you a fully functioning scientific Python environment running natively on the M1 architecture.
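One quick way to tell whether your Python interpreter is running natively on Apple Silicon or being translated by Rosetta 2 is to inspect the reported machine architecture. A minimal sketch using only the standard library:

```python
# Check whether this Python interpreter is running natively on Apple Silicon.
# On an M1/M2 Mac, a native build reports 'arm64'; an x86 build running
# under Rosetta 2 translation reports 'x86_64' instead.
import platform

arch = platform.machine()
print(f"Python is running on: {arch}")
```

If this prints `x86_64` on an M1 Mac, you have installed an Intel build and are paying the Rosetta 2 translation penalty.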
Dr Ben Mather and Prof Louis Moresi from AuScope’s Simulation, Analysis and Modelling (SAM) program have recently developed a novel, speedy and data-driven way to model the temperature of Earth’s crust in southeastern Australia. Their work has since attracted a grant from the International Association of Mathematical Geosciences (IAMG) to refine the new method and investigate earth temperatures further afield in Alaska.
Geodynamicists from Sydney and Australian National universities have developed Stripy, a software module that allows scientists to efficiently place GIS ‘wrapping paper’ around the spherical Earth ‘present’. It is the first module supporting such ‘wrapping’, or mapping, features to be built for a common scientific programming language like Python. Here, developer Dr Ben Mather explains Stripy’s key functions for the AuScope Earth modelling community.
A bit of a late-comer to this game, I’ve just discovered the merits of so-called “continuous integration”. In a Journal of Open Source Software (JOSS) review for stripy, one of the reviewers suggested Travis CI as a way to test whether the code is working correctly. I’d heard of CI before, but the learning curve to actually integrate it into my workflow seemed daunting.
In an Underworld release far, far away…
Geodynamicists struggle to model planetary dynamics due to the Cartesian Empire. Physical observations suffer inappropriate meshing and projections bend minds. The Underworld team builds the ultimate weapon to erase the Cartesian nightmare based on the ancient practice of the Cubed-Sphere mesh. A fight to modify a numerical implementation begins to bring peace and restore funding across the galaxy.
Most of the codes I develop run in parallel using MPI (Message Passing Interface) via the Python wrapper, mpi4py. There is a reason why highly scalable programs use this approach: each processor handles its own chunk of memory and communicates with other processors only when needed. PETSc, for example, is a behemoth computing framework built entirely around the MPI philosophy. Despite MPI’s efficiency, there are some barriers:
Make a palaeoclimate correction to heat flow data
A differentiable method to relate the spatial configuration of mesh nodes to lithology - an adjoint for the inversion of geological structure.
Computing Curie depth from the magnetic anomaly
This is how you go from a JPEG to a CSV.
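The basic idea can be sketched with Pillow and NumPy: open the image, convert it to a greyscale pixel array, and dump the values to CSV. The filenames here (`map.jpg`, `map.csv`) are placeholders, and the random image stands in for a real scanned figure.

```python
# Sketch: JPEG pixel values -> CSV, using Pillow and NumPy.
# 'map.jpg' / 'map.csv' are placeholder filenames.
import numpy as np
from PIL import Image

# Create a small placeholder JPEG (stand-in for a real scanned image).
Image.fromarray(
    np.random.randint(0, 255, (32, 32), dtype=np.uint8)
).save("map.jpg")

# Open the JPEG, convert to greyscale, and view it as a 2D array of pixels.
img = Image.open("map.jpg").convert("L")
data = np.asarray(img, dtype=np.float64)

# One CSV row per image row, one column per pixel.
np.savetxt("map.csv", data, delimiter=",")
```

From there, the pixel values can be rescaled to whatever physical quantity the original figure encoded.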