Weeknotes: 8th September 2025
Last week
Video playback mysteries
Last week I posted a video of an animation I made with Claudius; it worked fine in Firefox, but Safari refused to play it. I was short on time when I posted the weeknotes, so I had to ship it that way, but afterwards I had a look at what was going on. It turned out Safari was making an initial request for just the first two bytes of the video, and my hacked-up static file serving code in OCaml wasn't respecting the Range HTTP header in the request, instead just returning all the data. As a result, Safari gave up (despite my giving it a superset of what it asked for).
A relatively easy fix, but a reminder that as much as I like Dream, the OCaml web framework I've built my sites upon, I do wish it did a little more in the way of taking care of things like this.
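For the curious, here's a minimal sketch of the kind of fix involved, using Dream's request header accessor. This isn't my actual code: it assumes the whole file is already in memory as a string, only handles the simple single-range "bytes=start-end" form, and skips error handling on the integer parsing.

```ocaml
(* A minimal sketch of honouring a Range header in Dream, assuming
   [body] holds the entire file contents as a string. *)
let serve_with_range request body =
  match Dream.header request "Range" with
  | None -> Dream.respond body
  | Some range -> (
      match String.split_on_char '=' range with
      | [ "bytes"; spec ] -> (
          match String.split_on_char '-' spec with
          | [ first; last ] ->
              let len = String.length body in
              let start = int_of_string first in
              (* An omitted end, as in "bytes=0-", means "to the end". *)
              let finish =
                if last = "" then len - 1
                else min (int_of_string last) (len - 1)
              in
              (* Reply with 206 and just the requested slice, rather
                 than the whole body, which is what Safari objected to. *)
              Dream.respond ~status:`Partial_Content
                ~headers:
                  [ ( "Content-Range",
                      Printf.sprintf "bytes %d-%d/%d" start finish len ) ]
                (String.sub body start (finish - start + 1))
          | _ -> Dream.respond ~code:416 "")
      | _ -> Dream.respond ~code:416 "")
```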
LIFE
Instead of my planned work on using Tessera for generating habitat maps, this week I've been coding up a revision of the maps Tom Ball made for his food-impact work, which combine the Jung habitat map we use in LIFE with more detailed data on Global Agro-Ecological Zones (GAEZ) from the UN and the HYDE anthropogenic land-use estimates. Rather than just re-implementing Tom's existing work, I'm building a revised version based on discussions in the LIFE team about taking a more detailed approach to integrating the two.
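To give a flavour of the shape of that integration, the core of it is a per-pixel decision combining the fine-resolution habitat class with the coarse-resolution agricultural data. The class code and threshold below are invented for illustration; the actual rules the team are working through are more involved than this.

```ocaml
(* A hypothetical per-pixel combination rule: the class code and the
   threshold are made up, not the real LIFE pipeline values. *)
let cropland_class = 1401

let combine ~jung_class ~gaez_is_cropland ~hyde_crop_fraction =
  if jung_class = cropland_class then
    (* Jung already marks this pixel as agriculture. *)
    jung_class
  else if gaez_is_cropland && hyde_crop_fraction > 0.5 then
    (* The coarse data says this area is mostly farmed, so reclassify. *)
    cropland_class
  else
    (* Otherwise trust the original habitat class. *)
    jung_class
```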
This is a before map showing the Jung habitat classification:

And this is the after once I've added in the GAEZ/HYDE data:

The lighter areas in the second map show an increased impact of farming across a lot of the planet.
The main challenge with this is that both the GAEZ and HYDE 3.2 datasets come at a resolution of 10 km per pixel at the equator, while the original Jung map is at 100 m per pixel, so each GAEZ/HYDE pixel covers a 100 × 100 block of Jung pixels. If you look in detail, there's a fair bit of artefacting as a result:

For our particular use case we can live with these quantisation artefacts, as the final dataset will be quite coarse itself, but it's interesting to keep track of this sort of issue when combining different datasets.
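The blockiness is just nearest-neighbour quantisation: every fine pixel within a coarse pixel's footprint gets the same value, and mapping one to the other is only integer division on the pixel coordinates.

```ocaml
(* At the equator, 10 km / 100 m = 100, so 100 x 100 = 10,000 fine
   pixels share each coarse value -- hence the visible blocks. *)
let scale = 100

let coarse_of_fine (x, y) = (x / scale, y / scale)
```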
One interesting side note is that this is the second section of the LIFE pipeline where I've written a lot of unit tests, as I felt the risk of a small mistake getting lost in data that is hard to view was quite high. That said, I was trying out the latest beta builds of Apple Silicon native QGIS on my Mac Studio, and it handled the 100 m TIFs really well, which was nice to see. Even so, I'm very glad the tests are there!
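As an illustration of the style of test I mean (using Alcotest and the toy indexing function above, not the pipeline's actual tests), per-pixel rules like these are easy to pin down at their boundaries with a handful of known values:

```ocaml
let coarse_of_fine (x, y) = (x / 100, y / 100)

(* The block boundaries are exactly where off-by-one mistakes hide. *)
let test_boundaries () =
  Alcotest.(check (pair int int)) "99,99 is still in block 0,0"
    (0, 0) (coarse_of_fine (99, 99));
  Alcotest.(check (pair int int)) "100,0 starts block 1,0"
    (1, 0) (coarse_of_fine (100, 0))

let () =
  Alcotest.run "habitat-maps"
    [ ("indexing",
       [ Alcotest.test_case "boundaries" `Quick test_boundaries ]) ]
```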
Topogmesh
This was the last week this summer working with UROP student Finley Stirk on Topogmesh. The task I set Finley was to build a tool that lets me more readily generate 3D-printable models from open data, and he's not only done that but, as I wrote last week, by integrating OpenStreetMap data he's made it quite easy for anyone to start building colour-printable models from surface-elevation data.
Last week I showed a section of Liverpool we printed, but despite the name, OSM data goes well beyond streets; to prove that, here is a print of Mount Rinjani, a volcano on the Indonesian island of Lombok:

What I really like about this print is that you can already see a lot more from the 3D model than you can from a top-down map of the same data on a screen: for instance, you can see the crevice in the side of the volcano that limits the size of the water pooling in the crater.
Finley's final act has been to add some documentation to the code repo and write some weeknotes on why you might want to use it, so with luck it'll be of use to more people than just me!
Learning about digitising moths
I got to visit the work of some other UROP students from the computer lab, this time helping out with a project for the Museum of Zoology that is trying to find a simple way to crudely but quickly scan insects.

They do have an existing rig for doing this, the open-source scAnt device, which generates high-resolution 3D images of the moths and butterflies in the collection, but it takes several hours to do so:

As good as the scAnt device is, when you have millions of specimens a crude 3D model that is quick to generate will probably be good enough to let people know what is in the collection, ready for items of interest to be scanned in higher detail later. To this end, the student group working on this project have been trying to make a 3D-printable jig for holding a phone at a set of orientations, so that the resulting images can be put together using a Gaussian Splatting tool like Polycam (isn't "Gaussian Splatting" a great term?).

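As a rough illustration of what "a set of orientations" means in practice, such a jig essentially has to sample viewpoints on a hemisphere around the specimen. The ring counts and radius below are invented for the sketch, not the students' actual design.

```ocaml
(* A hypothetical sketch of where a jig would hold the phone: rings of
   azimuth angles at a few elevations around the specimen. *)
let viewpoints ~radius ~elevations ~per_ring =
  List.concat_map
    (fun elev ->
      List.init per_ring (fun i ->
          let az =
            2.0 *. Float.pi *. float_of_int i /. float_of_int per_ring
          in
          (* Spherical to Cartesian, with the specimen at the origin. *)
          ( radius *. cos elev *. cos az,
            radius *. cos elev *. sin az,
            radius *. sin elev )))
    elevations

let () =
  (* e.g. three rings of eight shots at 15 cm: 24 photos per specimen. *)
  let pts =
    viewpoints ~radius:0.15 ~elevations:[ 0.3; 0.7; 1.1 ] ~per_ring:8
  in
  Printf.printf "%d viewpoints\n" (List.length pts)
```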
Anil brought along lab alumnus Alex Ho, now a professional photographer, and myself (as I guess a quasi-professional 3D-print expert) to give feedback, and we had a great couple of hours chatting with the students about the practicalities of what they were proposing.
Often my job feels like I could be doing software for any organisation - I write software for a living now just as much as I did before I re-joined the University - but it's afternoons like this when you really appreciate what a unique environment we actually get to work in.
This week
- Generate some new LIFE results using the GAEZ/HYDE-based habitat maps so we can review them
- Think about the talk I'll need to give in a month for PROPL
- Continue to push at the validation implementation I started a couple of weeks ago before I got pulled onto these habitat maps
Tags: topogmesh, 3D-printing, life, yirgacheffe