Observing anonymity

The Observatory of Anonymity is a fascinating tool from Imperial College London that demonstrates just how much of a smokescreen anonymised datasets can be. The tool asks you to choose various demographic attributes, showing how each choice narrows the pool of people you could be - and so increases the chance of being identified.

Australia isn’t included, but pretending I lived in London, my choices took me from a 1 in 70 million chance all the way down to 1 in 326. Apparently that still counts as anonymous, but it’s a far smaller pool - and I’m the definition of a generic person. Any slightly less average answer makes the match far more precise.
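The compounding effect is easy to sketch: under a naive independence assumption, each attribute multiplies the pool by that attribute’s frequency. The frequencies below are invented for illustration - the real Observatory uses a far more sophisticated statistical model:

```python
# Naive sketch of how demographic attributes shrink the "anonymity set".
# Frequencies are invented for illustration, not from the real model.
population = 70_000_000

attributes = {
    "gender": 1 / 2,            # roughly half the population
    "birth year": 1 / 80,       # one of ~80 plausible birth years
    "postcode district": 1 / 3000,
    "marital status": 1 / 2,
}

anonymity_set = float(population)
for name, frequency in attributes.items():
    anonymity_set *= frequency
    print(f"after {name}: ~{anonymity_set:,.0f} people share these attributes")
```

Four unremarkable answers and the pool is already down to a few dozen people; once it approaches 1, those attributes pinpoint a single person.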

As Cory Doctorow observes, the problem becomes particularly acute when two datasets are available, which allows cross-checking and a vastly increased ability to target individuals, despite the supposed anonymisation.

Pizza Toast & Coffee

Craig Mod, walker, photographer, philosopher, and now videographer, has produced a lovely short film on kissaten - old-school Japanese cafés - and their tradition of pizza toast & coffee.

Many years ago I toured a tiny section of Japan with my mum, and one morning we were desperate for a change from the breakfasts of fish and miso. We wandered the streets and daringly (for us - we had no language) ventured into a small cafe that promised toast.

It was wonderful - not pizza toast like Craig’s, but an equally thick wedge of white, paper-light bread, toasted to perfection and smothered in butter. It was also the first cup of coffee I’d ever enjoyed, largely because it was served with lashings of cream, making it more like a bowl of molten dark chocolate than coffee. Now I drink double-shot ristretto piccolos, which I guess is progress?

Craig is a bit of a wonder. He writes beautifully on his experiences walking and living in Japan, the craft of bookmaking, and the challenges and joys of running a membership program that funds all this creativity. His attention to detail on all his projects is inspiring, as evidenced in the pizza video - the sound, editing, and typeface choices are all perfect.

Worth following on any platform he turns his hand to.

Posting to Micro.blog

After seeing Colin Devroe respond to my post about his photo process, I realised that there was no good way for people to interact with this site. Not that I’m expecting much, but it helps to have a way to contact an author for corrections if nothing else.

Colin managed to dig out my dormant Twitter handle for this site (which I’d more or less forgotten about) to let me know there was a problem with the RSS feed - very kind of him to point it out, and hopefully now fixed.
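Since a malformed feed kicked all this off, a quick way to catch the next breakage is to check the feed still parses as XML and contains items. A minimal sketch using Python’s standard library - the sample feed content here is invented, not the real nicemachine feed:

```python
import xml.etree.ElementTree as ET

def feed_item_titles(feed_xml: str) -> list[str]:
    """Parse RSS XML and return item titles; raises ParseError if malformed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title", "") for item in root.iter("item")]

# Invented sample feed, standing in for a real download of the site's RSS.
sample = """<rss version="2.0"><channel>
  <title>Example blog</title>
  <item><title>Posting to Micro.blog</title></item>
</channel></rss>"""

print(feed_item_titles(sample))  # ['Posting to Micro.blog']
```

An empty list (or a ParseError) would be the signal that the feed needs attention before any syndication relies on it.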

As a result of this interaction, I started looking around for a way to automatically syndicate posts to Twitter again (which used to be easy with WordPress). nicemachine is published using GitHub and Netlify, so it looked like this was going to be complicated.

Micro.blog to the rescue! I thought I should finally push posts to my empty Micro.blog account too, which turned out to be incredibly easy - just add an RSS feed to your account and it’s done.

And as a nice bonus, the Micro.blog setup solved the Twitter problem as well - it will cross-post anything coming in on the feed to Twitter. I’ll be interested to see whether the Twitter link points to the original post or to the Micro.blog copy - hopefully the former.

In any case, posts from here should now be turning up on both platforms. Hello if you found this in either!

Tyler Hall’s Photo Management

On the topic of photo management, developer Tyler Hall is working1 on Iris, a macOS photo management tool (‘the culmination of over a decade’s worth of thinking and experimenting’) with a great philosophy of power and privacy:

Iris is for people who care about their photos and videos and believe they’re worth safeguarding in a private, future-proof format that will outlive their grandchildren.

Iris is for family historians and archivists. As well as those who want a better, more robust, and dare I say nerdy way to manage an ever-growing photo and video library.

Iris is pragmatic and does not impose a certain workflow on you. Just point the app at your photo library on disk, and Iris will take it from there.

All sounds perfect for someone running their site on Hugo :-) Speaking of, the ability to export to a static site is a unique(?) and promising feature:

Iris can export a beautiful static website of your library, specific albums, or a custom search query. Then, just upload that to your own website.

Tyler’s blog is another marvel of well-thought-out posts that explain his (sometimes heated!) software frustrations and how he’s gone about solving them. He’s created a wealth of simple but elegant macOS utilities, often free, to enhance his workflows - the excellent-looking TextBuddy being the most recent example.

He’s posted about his efforts to take control of his photo data in the past, and the quality of his existing work gives me high hopes for Iris. I don’t even own a Mac (waiting on the near-mythical 14" Mx MacBook) and I’m interested.

  1. Or at least I hope he’s still working on it - there have been no updates for a few months. He does call it his ‘white whale’, so I guess we should expect slow but steady progress! ↩︎

Colin Devroe’s Photo Management

Colin Devroe has posted his updated photo management workflow, a process that has taken him considerable effort to develop:

In 2018 I decided to set out on a sort of mission; I wanted to create a photo library that would be relatively future proof. Should I decide to use a different app or storage platform, I’d want to be able to do so easily rather than painfully.

Colin uses macOS, but the principles in the article apply to any platform.

His philosophy on keeping the photo metadata with the image file is exactly what I would want to do - protecting the metadata from a corrupt database or unexpected software changes, and allowing portability. I take the same approach to my ripped FLAC music library, which is gathering dust these days but still a nice backup for the days you want something the streaming services don’t have or have suddenly dropped.

I’d like to see another post on lessons learnt and why not to do certain things. I’m also curious about the step where Colin fast-filters the keepers - is that just using Finder to flick through the images? There’s probably no need for anything more complex, which is a refreshing thought.

Also interesting that despite working on the workflow and tool selection for years, there’s still a missing piece: search on mobile.

I appreciate the effort people take to post things like this. Every post adds to the pool of knowledge that helps others avoid mistakes that are easy to make but difficult to undo, like Colin’s own findings when depending on Apple Photos as the main tool.

Dr Drang is one of the best examples of this kind of writing, using his posts to show the inner workings of his processes and why he chose certain paths over others - and as a reference site for his own memory.

Quite Expensive Monitors

A full year after posting about the incoming 32" 4K high-refresh monitors, it looks like they may finally be here. Or almost.

TFTCentral have listed a slew of likely candidates releasing in the first half of this year, with many models being confirmed at CES and given vaguely legitimate release dates.

They come in two variants: an HDR600 panel, and a spectacular-sounding Mini LED backlit panel with HDR1400. The former will be quite expensive (I’d guess ~$2k Australian), while the latter will be more like double that. All the usual suspects - Acer, ASUS, Philips, Viewsonic - have announced models (LG are conspicuous by their absence), so hopefully there’s some pricing competition.

I almost caved in and purchased one of the LG 38" 1600p widescreen models that have been very well reviewed, but I’ve never liked curved screen monitors so managed to resist.

After all this time waiting, I think the ‘cheap’ model will more than do the job. Then it’s just a matter of finding a mythical 3080 or 6800 GPU to drive it. Maybe in another year.

Very expensive monitors

I’ve been waiting to replace an ancient Dell 24" monitor for a few years now. My goal has been to find a 32" 4K display, preferably with a refresh rate higher than 60Hz.

Unfortunately nothing like that has been available, with the monitor industry seeming to decide they’d top out the high refresh 4K displays at 27" - a resolution much better served by a 1440p panel (and there are a lot of excellent reasonably priced monitors to choose from in that space).

That changed today with the announcements from Acer and ASUS of the ‘unicorn’ monitor that checks just about every box: 32", 4K, 144Hz, IPS, NVIDIA G-sync Ultimate, Mini-LED backlight with 1152 local dimming zones, and VESA DisplayHDR 1400. None of your faux HDR400 here.
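Some back-of-envelope arithmetic shows why that spec list is bleeding edge: the raw pixel stream alone outruns a DisplayPort 1.4 cable (roughly 25.9 Gbit/s of payload, if I have the figure right), which is why panels like this lean on Display Stream Compression. The 30 bits per pixel below assumes 10-bit-per-channel HDR colour:

```python
# Back-of-envelope: raw pixel throughput of a 32" 4K 144Hz HDR panel.
# 30 bits/pixel = 10 bits per channel x 3 channels (HDR assumption).
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 144, 30

gbit_per_s = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{gbit_per_s:.1f} Gbit/s before blanking overhead")  # ~35.8 Gbit/s
```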

Perfect! Until you notice the price: US$3600. That’s over $5000 Australian. Ouch.

It’s inexplicable how it could cost (or be worth) that much, especially with LG shipping G-Sync-compatible variable refresh rates on all their 2019 OLED panels - at around half the price for twice the size. The only disadvantage is that OLED currently comes only in 55"+ sizes - not very practical for desktop use.

I do wonder if the release of Apple’s Pro Display XDR at AU$8500 has emboldened other manufacturers to ask for big dollars for their cutting-edge models. Hopefully these sell in low numbers to force some sanity to return to pricing.

Don Melton’s video transcoding scripts

Don Melton has updated his very smart video encoding scripts to make them far more automated, and significantly faster too:

…this package automatically selects a platform-specific hardware video encoder rather than relying on a slower software encoder.

Using an encoder built into a CPU or video card means that even Blu-ray Disc-sized media can be transcoded 5 to 10 times faster than its original playback speed, depending on which hardware is available.

Slightly surprisingly (given his Apple background) he recommends Windows as the platform of choice, largely due to NVIDIA’s superior NVENC encoder.

Encoding a 30GB Blu-ray rip of Princess Mononoke took about 15 minutes, which from memory is about half the time taken by his older software-driven scripts. The resulting file was 6GB, which is far more manageable. I haven’t tried a lower bitrate for portable devices yet.
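Don’s quoted ‘5 to 10 times faster than playback’ figure is easy to sanity-check. The roughly 134-minute runtime for Princess Mononoke is my assumption here; the 15-minute encode time is from my own run:

```python
# How many times faster than real-time the encode ran.
# The ~134 min runtime is an assumed figure for illustration.
def playback_speed_multiple(runtime_min: float, encode_min: float) -> float:
    return runtime_min / encode_min

speed = playback_speed_multiple(runtime_min=134, encode_min=15)
print(f"~{speed:.1f}x playback speed")  # ~8.9x, within Don's 5-10x range
```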

(Tip: The scripts auto-detect any installed hardware encoders and choose the best one, but I had to force it using the --hevc flag. Turns out it was because I hadn’t updated my NVIDIA drivers to the latest required version, which the script log shows when you use the specific flags. Otherwise it simply falls back to software encoding, which I hadn’t noticed.)

I still struggle slightly with the logic of doing this given the price of storage these days, though it does make sense for Plex streaming efficiency and portability. But it also means you’re watching a lossy source, which seems counterintuitive when we’re all buying 4K OLED screens precisely because of their image quality.

However I suspect it’s like high-quality lossy audio - blind testing can’t differentiate there, so hopefully the same applies here. I haven’t spent time doing an A/B comparison, but I trust these scripts are already well tested given their popularity. In any case I think I’ll keep the full-fidelity rips somewhere. The ripping process using MakeMKV is pretty slow for Blu-ray, so it’s not something you would want to repeat.