Very expensive monitors

I’ve been waiting to replace an ancient Dell 24" monitor for a few years now. My goal has been to find a 32" 4K display, preferably with a refresh rate higher than 60Hz.

Unfortunately nothing like that has been available, with the monitor industry seemingly deciding to top out high refresh 4K displays at 27" - a size much better served by a 1440p panel (and there are plenty of excellent, reasonably priced monitors to choose from in that space).

That changed today with Acer and ASUS announcing the ‘unicorn’ monitor that checks just about every box: 32", 4K, 144Hz, IPS, NVIDIA G-Sync Ultimate, Mini-LED backlight with 1152 local dimming zones, and VESA DisplayHDR 1400. None of your faux HDR400 here.

Perfect! Until you notice the price: US$3600. That’s over $5000 Australian. Ouch.

It’s inexplicable how it could cost (or be worth) that much, especially with LG adding G-Sync Compatible variable refresh rate support to all their 2019 OLED panels - at around half the price for twice the screen size. The only disadvantage is that OLED currently only comes in 55"+ sizes - not very practical for desktop use.

I do wonder if the release of Apple’s Pro Display XDR at AU$8500 has emboldened other manufacturers to ask for big dollars for their cutting-edge models. Hopefully these sell in low numbers and force some sanity to return to pricing.




Don Melton’s video transcoding scripts

Don Melton has updated his very smart video encoding scripts to make them far more automated, and significantly faster too:

…this package automatically selects a platform-specific hardware video encoder rather than relying on a slower software encoder.

Using an encoder built into a CPU or video card means that even Blu-ray Disc-sized media can be transcoded 5 to 10 times faster than its original playback speed, depending on which hardware is available.

Slightly surprisingly (given his Apple background) he recommends Windows as the platform of choice, largely due to Nvidia’s superior NVENC encoder.

Encoding a 30GB Blu-ray rip of Princess Mononoke took about 15 minutes, which from memory is about half the time his older software-driven scripts took. The resulting file was 6GB, which is far more manageable. I haven’t tried a lower bitrate for portable devices yet.

(Tip: The scripts auto-detect any installed hardware encoders and choose the best one, but I had to force it using the --hevc flag. It turned out I hadn’t updated my Nvidia drivers to the latest required version, which the script log only reveals when you use the explicit flags. Otherwise it simply falls back to software encoding, which I hadn’t noticed.)
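
If you have a pile of rips to get through, a tiny wrapper script can queue them up. Here’s a hypothetical Python sketch of that idea - it isn’t part of Don’s package, the folder names are placeholders, and it assumes other-transcode is installed and on your PATH:

```python
# Hypothetical batch wrapper for Don Melton's other-transcode tool.
# 'rips' and 'transcoded' are placeholder directory names.
import pathlib
import subprocess

RIPS_DIR = pathlib.Path("rips")        # placeholder: where MakeMKV output lives
OUT_DIR = pathlib.Path("transcoded")   # placeholder: where finished files go
OUT_DIR.mkdir(exist_ok=True)

for rip in sorted(RIPS_DIR.glob("*.mkv")):
    if (OUT_DIR / rip.name).exists():
        continue  # skip anything already transcoded
    # other-transcode writes its output to the current working directory,
    # so run it from OUT_DIR; check=True stops the batch on any failure.
    subprocess.run(
        ["other-transcode", "--hevc", str(rip.resolve())],
        cwd=OUT_DIR,
        check=True,
    )
```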

I still struggle slightly with the logic of doing this given the price of storage these days, though it does make sense for Plex streaming efficiency and portability. But it also means you’re watching a lossy source, which seems counterintuitive when we’re all buying 4K OLED screens precisely for their image quality.

However I suspect it’s like high-quality lossy audio - blind testing can’t tell the difference there, so hopefully the same applies here. I haven’t spent time on an A/B comparison, but I trust these scripts are already pretty well tested given their popularity. In any case I think I’ll keep the full-fidelity rips somewhere; ripping a Blu-ray with MakeMKV is pretty slow, so it’s not a process you’d want to repeat.




Attention eaters

Great Craig Mod essay on ‘becoming readers’, and the fierce competition books face:

The main adversary of book publishing is: Anything that eats attention. Publishing has always been a game of competing for attention. Any number of media inventions have threatened to finally eviscerate the book market: radio, movies, television, et cetera. But smartphones tip the scales unlike any previous object. They do so by placing into our pockets a perfect, always-at-hand vector for lopsided user contracts, arriving in the form of apps and websites.

Fascinating how he lumps Netflix in with apps like Instagram and Twitter - all delivering an endless barrage of guff to hold your attention:

Browsing Netflix is an endless sensation of falling forward into ever more content. Previews auto-play. As soon as one episode in a series ends, the next begins before credits finish rolling. If there’s no other episodes in the series, random trailers begin to play. The very design of Netflix itself is constructed to reduce your ability to a) think about what you want to do, and b) step away from the service. It’s designed to be a boundless slurry of content poured directly into your eyeballs. In a way, it’s training us to never step back or even consider, say, reading a book or going for a walk.




Hugo

Much like the missing 2012, this blog has been on hiatus for… wow, several years. I thought it was only twelve months, but it’s more like thirty.

As with that 2012 post, I’m motivated this time by a move to a new blog setup. Last time it was Tumblr to self-hosted WordPress; this time it’s WordPress to Hugo.

WordPress has become frustrating to use on another of my blogs, especially since the release of the Gutenberg editor. WordPress is certainly trying something new with the block editor, but it overcomplicates what should be the simple business of a text-based blog. I tried it briefly but quickly found a plug-in that restored the basic text editor.

Even with the old editor restored, the publishing interface is slow and fairly cumbersome - it feels like a bloated content management system rather than a blog, with constant micro-delays that add up to a reluctance to log on and post. That may be down to my cheap hosting, but I’d prefer something quick and simple. Anything that gets in the way of quickly posting something is a problem.

The final straw was finding that all the markdown content I’d created seemed to have reverted to HTML during the Gutenberg upgrade. This was super annoying, so I started looking around for a simpler - and faster - method.

Hugo seemed to be a good option (as long as you don’t need comments), and after some experimentation I found a great GitHub-hosted utility by palaniraja that extracts markdown files from WordPress (and Blogger) database backups. It was amazingly painless, and very satisfying to see a directory with a simple list of text documents, one per post. Getting those posts to appear on a local Hugo site was also relatively painless, and the result was a super fast site of plain old HTML files.

Next it was a matter of finding and tweaking a theme, which was a good way to start understanding how Hugo works. I settled on hello-friend-ng, modifying it to show the full text of each post, adding year and month views, and making various other small changes.

For directory sanity I also wanted the post files to include the publication date in the filename. The date info was in the front matter of the .md files, but automating its extraction to rename the files seemed too difficult - until I found this great post by Max Melcher which did exactly that via some Windows PowerShell scripts. It’s amazing how someone has almost always already done what you want; it’s just a matter of finding it. Often that means StackExchange, but there are also a million blogs out there of thoughtful people capturing their findings and processes. Thanks everyone.
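
The same idea is only a few lines of Python, for anyone who’d rather avoid PowerShell. A rough sketch, not Max’s script - it assumes the posts live in a Hugo-style content/posts directory and have a standard date field in their front matter:

```python
# Hypothetical sketch: prefix each post's filename with the date from
# its front matter, e.g. my-post.md -> 2019-08-01-my-post.md.
import pathlib
import re

POSTS_DIR = pathlib.Path("content/posts")  # assumed Hugo content directory
# Matches a YAML 'date: 2019-08-01...' or TOML 'date = "2019-08-01..."' line.
DATE_RE = re.compile(r'^date\s*[:=]\s*"?(\d{4}-\d{2}-\d{2})', re.MULTILINE)

for post in POSTS_DIR.glob("*.md"):
    match = DATE_RE.search(post.read_text(encoding="utf-8"))
    if not match:
        print(f"no date found in {post.name}, skipping")
        continue
    date = match.group(1)
    if post.name.startswith(date):
        continue  # already renamed on a previous run
    post.rename(post.with_name(f"{date}-{post.name}"))
    print(f"{post.name} -> {date}-{post.name}")
```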

I guess it’s kind of weird to want a simpler method and end up with a system where I have to build the site locally and then FTP the entire thing to a host just to add a single post. Hardly simple, and I do have some reservations about whether that will prove too frustrating. Plenty of people have obviously had similar thoughts, and there do seem to be methods for automated build and deploy using tools like Git and hosts like Netlify. Ideally I could write a post on any device, copy just that post somewhere, and have the site build and publish itself. I think that’s what the Netlify tooling offers, but I haven’t tried it yet. That’s next.
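
For now the manual workflow amounts to something like this hypothetical Python sketch - the host, credentials and paths are placeholders, and it assumes the hugo binary is on your PATH:

```python
# Hypothetical deploy sketch: build with Hugo, then FTP the generated
# 'public' directory to the web host. All credentials are placeholders.
import ftplib
import pathlib
import subprocess

PUBLIC_DIR = pathlib.Path("public")   # Hugo's default output directory
FTP_HOST = "ftp.example.com"          # placeholder host
FTP_USER = "user"                     # placeholder credentials
FTP_PASS = "password"
REMOTE_ROOT = "/public_html"          # placeholder remote web root

def deploy() -> None:
    # Build the site; assumes the 'hugo' binary is on the PATH.
    subprocess.run(["hugo"], check=True)

    with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
        # Sorting ensures parent directories are created before their contents.
        for path in sorted(PUBLIC_DIR.rglob("*")):
            remote = f"{REMOTE_ROOT}/{path.relative_to(PUBLIC_DIR).as_posix()}"
            if path.is_dir():
                try:
                    ftp.mkd(remote)
                except ftplib.error_perm:
                    pass  # directory probably already exists
            else:
                with open(path, "rb") as fh:
                    ftp.storbinary(f"STOR {remote}", fh)
                print(f"uploaded {remote}")

if __name__ == "__main__":
    deploy()
```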

In the meantime, this site is the end result of all of the above. Time to start posting again!




Social media ‘travel mode’

Maciej Cegłowski argues that, given phones are somehow subject to the same invasive access as suitcases, social media companies should build a ‘travel mode’ that severely limits access and data while activated:

Both Facebook and Google make lofty claims about user safety, but they’ve done little to show they take the darkening political climate around the world seriously. A ‘trip mode’ would be a chance for them to demonstrate their commitment to user safety beyond press releases and anodyne letters of support.

Not sure that it would really help though - he suggests it would be irrevocable once set (to thwart border agents simply asking you to turn it back on), but that would seem to create a whole other set of problems (what if you have to cancel your trip at short notice with travel mode already set?).

John Gruber is right, we should be fighting the entire premise:

“Travel mode” would be better than nothing, but no technical solution is a substitution for proper civil liberties. Our phones and devices should be protected against unwarranted search and seizure, period.

It will be fascinating to see what impact the election has had on US tourism at the end of all this. A planned exploration of some US National Parks is definitely off the agenda for me now, and I’m sure for many others.




Phone security at border checks

A flurry of articles recently about how secure your phone is at border checks, particularly in the US (for obvious reasons).

First a US-born NASA scientist was detained returning home and told to unlock his phone - which he eventually did, not being sure of his rights. It turns out no-one is really sure what rights you have - specifically, whether you are obliged to unlock a locked device. One thing we can say is that you have far fewer options to say ‘no’ if you’re not a citizen of the country you’re entering.

It’s staggering to think that courts and laws now allow a border agent to demand you unlock a personal device, without any warrant or reasonable suspicion, and that all the data on that device is fair game for them to copy and use as they see fit. How did we get here?

Accordingly, here’s a very thorough guide to securing your data at border crossings. Some of it seems over the top - mailing yourself a SIM card - but given the steady erosion of privacy rights it’s probably all exactly right.




Metadata access denied

Ex-Sydney Morning Herald journalist Ben Grubb has had a long-running legal battle with Australian telecommunications company Telstra over access to his own metadata. Australia’s ‘security’ laws decree that all ISPs must retain two years of ‘metadata’ - a very poorly defined and broad concept - for ‘national security’ reasons.

Grubb set about trying to find out exactly what was in that cache of data, and has been through several court cases to establish what he is allowed to access. Unfortunately he has been stopped at the last hurdle, with the Australian Federal Court ruling that he cannot in fact have access to his own data.

Whilst the ruling seems to have been made on points of law rather than blanket ‘citizens can’t access their data’ grounds, it’s still incredibly disappointing. When a local council or debt collection agency can access the data but the person who generated it can’t, it breaks any trust that the data will be protected. It’s especially concerning now that fears the data would end up being used for totally non-security-related matters appear to be coming true.