Weeknotes: 8th September 2025

Last week

Video playback mysteries

Last week I posted a video of an animation I made with Claudius, which worked fine in Firefox but not in Safari, which refused to play it. I was short on time when I posted the weeknotes, so I had to ship it that way, but afterwards I had a look at what was going on. It turned out Safari was making an initial request for just the first two bytes of the video, and my hacked-up static file serving code in OCaml wasn't respecting the HTTP Range header in the request and was just returning all the data. As a result, Safari gave up (despite me giving it a superset of what it asked for).
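Concretely, what Safari wants back is a 206 Partial Content response containing just the requested bytes, along with a Content-Range header. Here's a minimal sketch of the idea using Dream's header and respond helpers; this isn't my actual serving code, and it skips bounds validation, suffix ranges like `bytes=-500`, multi-range requests, and the 416 error path:

```ocaml
(* Minimal single-range handling: parse "bytes=start-end" (or
   "bytes=start-" for an open-ended range) and return just that slice
   of the file with a 206 status and a Content-Range header. *)
let respond_with_range request data =
  match Dream.header request "Range" with
  | None -> Dream.respond data
  | Some range ->
    let total = String.length data in
    let start_, end_ =
      try Scanf.sscanf range "bytes=%d-%d" (fun s e -> (s, e))
      with Scanf.Scan_failure _ | End_of_file ->
        Scanf.sscanf range "bytes=%d-" (fun s -> (s, total - 1))
    in
    Dream.respond ~status:`Partial_Content
      ~headers:
        [ ("Content-Range", Printf.sprintf "bytes %d-%d/%d" start_ end_ total);
          ("Accept-Ranges", "bytes") ]
      (String.sub data start_ (end_ - start_ + 1))
```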

A relatively easy fix, but a reminder that as much as I like Dream, the OCaml web framework I've built my sites upon, I do wish it did a little more in the way of taking care of things like this.

LIFE

In lieu of my plans to use Tessera for generating habitat maps, this week I've been coding up a revision of the maps Tom Ball made for his food impact work, which combine the Jung habitat map we use in LIFE with more detailed data on Global Agro-Ecological Zones (GAEZ) from the UN and the HYDE anthropogenic land use estimates. Rather than just re-implementing Tom's existing work, I'm building a revised version based on discussions in the LIFE team about using a more detailed approach to integrating the two.
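I won't reproduce the actual rules here, but to give a flavour of the shape of the combination: per pixel, the Jung habitat class is kept unless the agricultural layers say the cell is substantially farmed. The class code and threshold below are entirely made up for illustration, not the values the pipeline uses:

```ocaml
(* Hypothetical per-pixel combination rule: keep the Jung class unless
   the GAEZ/HYDE layers report the cell as substantially farmed, in
   which case reassign it to a cultivated class. *)
let combine ~jung_class ~farmed_fraction =
  let cultivated = 1401 in  (* made-up class code *)
  if farmed_fraction > 0.5 then cultivated else jung_class
```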

This is a before map showing the Jung habitat classification:

Screenshot of QGIS software displaying a grayscale world map showing habitat classifications based on Martin Jung's work. The map uses varying shades from white to black to represent different habitat types.

And this is the after once I've added in the GAEZ/HYDE data:

Screenshot of QGIS software displaying a modified version of the habitat map that incorporates farming land use data from GAEZ and HYDE 3.2. The map shows similar grayscale patterns to the first image but with much of the world covered in lighter areas that show a greater impact for farming.

The lighter areas in the second map show an increased impact of farming across a lot of the planet.

The main challenge with this is that the GAEZ and HYDE 3.2 datasets both come at a resolution of 10 km per pixel at the equator, while the original Jung map is at 100 m per pixel, so if you look in detail there's a fair bit of artefacting as a result:

Screenshot of QGIS software displaying a zoomed-in view of the updated habitat map, revealing pixelated grid patterns where the coarser resolution GAEZ and HYDE datasets have been integrated with the higher resolution Jung dataset, creating visible quantisation artefacts in the grayscale land use classifications.
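The grid pattern suggests what's effectively nearest-neighbour resampling: every fine pixel inherits the value of the coarse cell it falls inside, so a single 10 km GAEZ/HYDE cell becomes a 100 × 100 block of identical 100 m pixels. A toy illustration (not the pipeline's actual code):

```ocaml
(* Nearest-neighbour upsampling: each fine pixel takes the value of
   the coarse cell it falls inside, producing the blocky artefacts
   visible in the screenshot above. *)
let upsample_nearest coarse scale =
  let rows = Array.length coarse in
  let cols = Array.length coarse.(0) in
  Array.init (rows * scale) (fun y ->
      Array.init (cols * scale) (fun x -> coarse.(y / scale).(x / scale)))

(* Going from 10 km to 100 m pixels is a factor of 100 per axis. *)
let _fine = upsample_nearest [| [| 0.2; 0.7 |]; [| 0.5; 0.1 |] |] 100
```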

For our particular use case we can live with these quantisation artefacts, as the final dataset will be quite coarse itself, but it's interesting to keep track of this sort of issue when combining different datasets.

One interesting side note is that this is the second section of the LIFE pipeline for which I've written a lot of unit tests, as I felt the risk of a small mistake getting lost in data that's hard to view was quite high. On the plus side, I was trying out the latest beta builds of Apple Silicon native QGIS on my Mac Studio, and it handled the 100 m TIFs really well, which was nice to see. That said, I'm still very glad the tests are there!
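To give a flavour of the sort of thing worth testing (this is a made-up helper, not the pipeline's real code): mapping a fine-resolution pixel index back to its source coarse cell is exactly the kind of arithmetic where an off-by-one would be invisible when eyeballing a planet-sized raster, and it's cheap to assert:

```ocaml
(* Hypothetical helper mapping a fine pixel index to its coarse cell. *)
let coarse_index ~scale fine = fine / scale

let () =
  (* every fine pixel within the first coarse cell maps to cell 0... *)
  assert (coarse_index ~scale:100 0 = 0);
  assert (coarse_index ~scale:100 99 = 0);
  (* ...and the next pixel starts coarse cell 1 *)
  assert (coarse_index ~scale:100 100 = 1)
```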

Topogmesh

This was the last week of this summer's work with UROP student Finley Stirk on Topogmesh. The task I set Finley was to build a tool that lets me more readily generate 3D-printable models from open data, and he's not only done that: as I wrote last week, by integrating OpenStreetMap data he's made it quite easy for anyone to start building colour-printable models from surface elevation data.

Last week I showed a section of Liverpool we printed, but despite the name, OSM data goes well beyond streets; to prove that, here is a print of Mount Rinjani, a volcano on the Indonesian island of Lombok:

A photo of a 3D printed circular topographical model sitting on a wooden surface. The model depicts a mountainous volcanic landscape with a crater lake, using a colour-coded system where red/orange represents the ground, green shows forested areas on the side of the volcano, and blue depicts a crater lake at the centre.

What I really like about this print is that you can already see a lot more from the 3D model than you can from a top-down map of the same data on a screen: for instance, you can see the crevice in the side of the volcano that limits the size of the water pooling in the crater.

Finley's final act has been to add some documentation to the code repo and write some weeknotes on why you might want to use it, so with luck it'll be of use to more people than just me!

Learning about digitising moths

I got to go and visit the work of some other UROP students from the computer lab, this time helping out with a project for the Museum of Zoology that's trying to find a simple way to crudely but quickly scan insects.

Photo of a museum display case containing pinned butterfly specimens arranged in rows, showing various species of blue butterflies (likely from the Lycaenidae family) with pale blue wings marked by dark spots, alongside cream-colored specimens with black spotted patterns, each mounted with small identification labels in a glass-topped wooden display case.

They do have an existing rig for doing this, the open-source scAnt device, which generates high-resolution 3D images of the moths and butterflies in the collection, but it takes several hours to do so:

Photo of an open source scAnt 3D scanner setup on a laboratory workbench, featuring a black motorised scanning apparatus with a white circular platform, surrounded by various cables and electronic components.

As good as the scAnt device is, when you have millions of specimens a crude 3D model that's quick to generate will probably be good enough to let people see what's there, which can then be scanned in higher detail as needed. To this end, the student group working on this project have been trying to make a 3D-printable jig for holding a phone at a set of orientations, so that the images can be stitched together using a Gaussian Splatting tool like Polycam (isn't "Gaussian Splatting" a great term?).

Photo of a black 3D-printed dodecagonal (12-sided) jig with multiple circular holes and rectangular openings, designed as a prototype for positioning insects during mobile phone scanning, sitting on a laboratory workbench with various equipment and papers visible in the background.

Anil brought along lab alumnus Alex Ho, now a professional photographer, and myself (as, I guess, a quasi-professional 3D-print expert) to give feedback, and we had a great couple of hours chatting with the students about the practicalities of what they were proposing.

Often my job feels like I could be doing software for any organisation - I write software for a living now just as much as I did before I re-joined the University - but it's afternoons like this where you really appreciate what a unique environment we actually get to work in.

This week

  • Generate some new LIFE results with the habitat maps based on the GAEZ/HYDE data so we can review them
  • Think about the talk I'll need to give in a month for PROPL
  • Continue to push at the validation implementation I started a couple of weeks ago before I got pulled onto these habitat maps

Tags: topogmesh, 3D-printing, life, yirgacheffe