Earth Notes: On Website Technicals (2021-01)

Updated 2024-09-15.
Tech updates: 2022 data, min.js, hosting, soft params, profile opt, hot, storage, unLooped, looong fsck, uptime, dark tweaks, INTIFA2.
New year, new lockdown: no major works planned for this month, but tweaks and brainwaves happen, including small UX improvements, and system greybeard moments...

2021-01-23: Dark Mode Tweaks

I have been making a few tweaks to improve site dark-mode UX.

  • I have made the site green a little less glaring, toning #cfc down to #cec; the latter seems fine for normal light-mode use too.
  • Instead of setting text to #ccc and dimming images to 0.8, I have now made things slightly brighter (for less jarring contrast with non-dark-mode pages and UI elements that I do not control) using @media (prefers-color-scheme: dark) { body { background-color: #000; color: #eee; } img{ filter: brightness(.9); } }.

A downside is that #cec and #eee are not in the 'safe' 216-colour cube of olden days, but that simply may not matter.

Adding INTIFA2 Interconnector

Now that power seems to be flowing, I have added the INTIFA2 second French interconnector to the live carbon intensity page.

(I have called it INTIFAD, changing "2" to "D"eux, temporarily, since the code does not seem to allow digits currently — this would be a first!)

2021-01-22: Uptime

The new RPi3 is proving stable so far, up 155 days as I write.

2021-01-21: Backup Disc Fsck

I went to do the usual daily backup yesterday, and the backup mount did not become available. Disk Utility could talk to the disc and was happy with the partition table, but failed immediately trying to check ("first aid") the HFS partition that Time Machine uses.

I tried a couple of times to unplug and re-plug the USB cable, even shifting to a different port, but no dice.

I fished the disc out to where I could see and hear it, and it was making the sort of rattling noises that I would expect from fsck. And when I looked in Activity Monitor, there was indeed an fsck_hfs running. (So whoops: I had been unplugging the drive, several times, while fsck was trying to automagically repair it!)

So much for "just works" macOS letting me know that something was going on, so that I did not make it worse. A notification along the lines of "Please leave your disc plugged in while I try to fix it: this may take a while" would have been helpful. Having to rely on *nix-greybeard-fu (the sound of fsck!) to deduce what was happening is bad...
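For anyone in the same boat, a quick way to confirm that an automatic repair really is in progress, before yanking the cable again, is a sketch along these lines (the fsck_hfs log path is the macOS one quoted later in this entry):

```shell
# Quick check: is an fsck_hfs running, and what has it logged lately?
# A sketch; on other systems substitute the local fsck process/log names.
fsck_status() {
  pgrep -lf fsck_hfs || echo "no fsck_hfs running"
  tail -n 5 /var/log/fsck_hfs.log 2>/dev/null || true
}
```

If this shows a live fsck_hfs, leave the drive alone and let it grind on.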

[Audio (62s): "20210121 fsck of 2TB WD My Passport USB portable hard disc".]

I had the drive plugged in to allow an extended fix-up run in the evening and it had not finished by bedtime. So I let it run overnight ... and all day ... and into the evening again.

As I write, it's been going for not far off 21h, I estimate. The single-threaded process, talking to a slow-ish external disc, has racked up 4h20m of CPU! It's still rattling away, with occasional quieter CPU-heavy (~50%) patches.

 Process Name   % CPU   CPU Time     Threads   Idle Wake Ups   % GPU   GPU Time   PID     User   Memory
 fsck_hfs       11.4    4:20:51.24   1         0               0.0     0.00       10122   root   1.01 GB

Here are the good and bad bits of /var/log/fsck_hfs.log:

/dev/rdisk2s2: fsck_hfs started at Fri Jan 15 08:18:14 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Fri Jan 15 08:18:14 2021


/dev/rdisk2s2: fsck_hfs started at Fri Jan 15 08:26:43 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Fri Jan 15 08:26:43 2021


/dev/rdisk2s2: fsck_hfs started at Fri Jan 15 13:42:23 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Fri Jan 15 13:42:23 2021


/dev/rdisk2s2: fsck_hfs started at Fri Jan 15 13:42:23 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Fri Jan 15 13:42:23 2021


/dev/rdisk2s2: fsck_hfs started at Sat Jan 16 09:30:03 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sat Jan 16 09:30:03 2021


/dev/rdisk2s2: fsck_hfs started at Sat Jan 16 09:30:04 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sat Jan 16 09:30:04 2021


/dev/rdisk2s2: fsck_hfs started at Sat Jan 16 10:23:22 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sat Jan 16 10:23:22 2021


/dev/rdisk2s2: fsck_hfs started at Sat Jan 16 14:32:48 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sat Jan 16 14:32:48 2021


/dev/rdisk2s2: fsck_hfs started at Sat Jan 16 14:32:48 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sat Jan 16 14:32:48 2021


/dev/rdisk2s2: fsck_hfs started at Sun Jan 17 10:21:26 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Sun Jan 17 10:21:26 2021

...

/dev/rdisk2s2: fsck_hfs started at Tue Jan 19 21:32:48 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk2s2: fsck_hfs completed at Tue Jan 19 21:32:48 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:49:30 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 08:49:30 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:49:31 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2: ** The volume BAK2020-- could not be verified completely.
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 08:54:02 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:54:13 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 08:54:13 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:54:13 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2: ** The volume BAK2020-- could not be verified completely.
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 08:55:15 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:58:30 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 08:58:30 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 08:58:30 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.

/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 09:00:33 2021
/dev/rdisk2s2: /dev/rdisk2s2: Can't open /dev/rdisk2s2: Resource busy
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 09:00:33 2021

/dev/rdisk2s2: ** The volume BAK2020-- could not be verified completely.
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 09:00:52 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 09:01:37 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 09:01:37 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 09:01:37 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2:    Keys out of order
/dev/rdisk2s2: (4, 1016906)
/dev/rdisk2s2: Records 30 and 31 (0-based); offsets 0x0176 and 0x01BE
/dev/rdisk2s2:    Keys out of order
/dev/rdisk2s2: (4, 1016906)
/dev/rdisk2s2: Records 32 and 33 (0-based); offsets 0x01CA and 0x01D6
/dev/rdisk2s2:    Invalid sibling link
/dev/rdisk2s2: (4, 503245)
/dev/rdisk2s2: ** Rebuilding catalog B-tree.
/dev/rdisk2s2: ** Rechecking volume.
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2: ** The volume BAK2020-- could not be verified completely.
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 11:24:06 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 11:24:27 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 11:24:27 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 11:24:28 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2:    Invalid index key
/dev/rdisk2s2: (3, 17)
/dev/rdisk2s2: ** Rebuilding extents overflow B-tree.
/dev/rdisk2s2: ** Rechecking volume.
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2: ** Checking multi-linked files.

/dev/rdisk3s2: fsck_hfs started at Wed Jan 20 12:54:31 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Wed Jan 20 12:54:31 2021


/dev/rdisk3s2: fsck_hfs started at Wed Jan 20 12:56:33 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Wed Jan 20 12:56:33 2021

/dev/rdisk2s2:    Incorrect number of file hard links
/dev/rdisk2s2: ** Checking catalog hierarchy.
/dev/rdisk2s2:    Invalid catalog record type
/dev/rdisk2s2: (4, 32766)
/dev/rdisk2s2: ** The volume BAK2020-- could not be verified completely.
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 13:01:29 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 21:01:06 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2 (NO WRITE)
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM DIRTY
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 21:01:07 2021


/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 21:01:07 2021
/dev/rdisk2s2: /dev/rdisk2s2: ** /dev/rdisk2s2
/dev/rdisk2s2:    Executing fsck_hfs (version hfs-556.60.1).
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.

/dev/rdisk2s2: fsck_hfs started at Wed Jan 20 21:03:54 2021
/dev/rdisk2s2: /dev/rdisk2s2: Can't open /dev/rdisk2s2: Resource busy
/dev/rdisk2s2: fsck_hfs completed at Wed Jan 20 21:03:54 2021

/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2: ** Checking multi-linked files.
/dev/rdisk2s2:    Incorrect number of file hard links
/dev/rdisk2s2: ** Checking catalog hierarchy.
/dev/rdisk2s2: ** Checking extended attributes file.
/dev/rdisk2s2:    Invalid sibling link
/dev/rdisk2s2: (8, 443612)
/dev/rdisk2s2: ** Rebuilding extended attributes B-tree.
/dev/rdisk2s2: ** Rechecking volume.
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2:    Incorrect number of thread records
/dev/rdisk2s2: (4, 268945)
/dev/rdisk2s2: ** Checking multi-linked files.
/dev/rdisk2s2:    Incorrect number of file hard links
/dev/rdisk2s2: ** Checking catalog hierarchy.
/dev/rdisk2s2: ** Checking extended attributes file.
/dev/rdisk2s2:    Incorrect number of extended attributes
/dev/rdisk2s2:    (It should be 8044690 instead of 8044665)
/dev/rdisk2s2:    Incorrect number of Access Control Lists
/dev/rdisk2s2:    (It should be 8044436 instead of 8044416)
/dev/rdisk2s2: ** Checking multi-linked directories.

/dev/rdisk3s2: fsck_hfs started at Thu Jan 21 14:20:26 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Thu Jan 21 14:20:26 2021

/dev/rdisk2s2: ** Checking volume bitmap.
/dev/rdisk2s2: ** Checking volume information.
/dev/rdisk2s2:    Invalid volume file count
/dev/rdisk2s2:    (It should be 18226665 instead of 17919886)
/dev/rdisk2s2:    Invalid volume directory count
/dev/rdisk2s2:    (It should be 2549776 instead of 2535894)
/dev/rdisk2s2:    Invalid volume free block count
/dev/rdisk2s2:    (It should be 263705349 instead of 267713086)
/dev/rdisk2s2: ** Repairing volume.

/dev/rdisk3s2: fsck_hfs started at Thu Jan 21 14:22:26 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Thu Jan 21 14:22:26 2021

/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 36967225)
/dev/rdisk2s2:    (It should be 0 instead of 37034803)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 36967227)
/dev/rdisk2s2:    (It should be 0 instead of 37034804)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 36967229)
/dev/rdisk2s2:    (It should be 0 instead of 37034805)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 36967231)
/dev/rdisk2s2:    (It should be 0 instead of 37034806)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 36967262)
/dev/rdisk2s2:    (It should be 37234467 instead of 37144934)
/dev/rdisk2s2:    Indirect node 36966482 needs link count adjustment
/dev/rdisk2s2:    (It should be 6 instead of 8)
/dev/rdisk2s2:    Indirect node 36966689 needs link count adjustment
/dev/rdisk2s2:    (It should be 8 instead of 9)
/dev/rdisk2s2:    Invalid first link in hard link chain (id = 36966836)
/dev/rdisk2s2:    (It should be 37034409 instead of 37144508)
/dev/rdisk2s2:    Indirect node 36966836 needs link count adjustment
/dev/rdisk2s2:    (It should be 1 instead of 2)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 37144850)
/dev/rdisk2s2:    (It should be 37234357 instead of 37144851)
/dev/rdisk2s2:    Indirect node 36967164 needs link count adjustment
/dev/rdisk2s2:    (It should be 2 instead of 3)
/dev/rdisk2s2:    Previous ID in a hard link chain is incorrect (id = 37144929)
/dev/rdisk2s2:    (It should be 37234460 instead of 37144930)
/dev/rdisk2s2:    Indirect node 36967252 needs link count adjustment
/dev/rdisk2s2:    (It should be 6 instead of 7)

Which is as far as it currently gets, with the last update ~2h30m ago, as I type...

I think this may be the longest fsck that I have ever encountered, possibly because it's HFS and because it's 2TB. But maybe in the bad old days at uni or in the City I had worse, and have mentally bleached it...

(Will be continuing straight through 2021-01-21T21:21:21Z at this rate, ie the 21st second of 21st minute of 21st hour of 21st day of 21st year of 21st century...)

/dev/rdisk2s2:    Indirect node 36967252 needs link count adjustment
/dev/rdisk2s2:    (It should be 6 instead of 7)

/dev/rdisk3s2: fsck_hfs started at Thu Jan 21 19:24:59 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Thu Jan 21 19:24:59 2021


/dev/rdisk3s2: fsck_hfs started at Thu Jan 21 19:26:59 2021
/dev/rdisk3s2: /dev/rdisk3s2: ** /dev/rdisk3s2 (NO WRITE)
/dev/rdisk3s2:    Executing fsck_hfs (version hfs-556.60.1).
QUICKCHECK ONLY; FILESYSTEM CLEAN
/dev/rdisk3s2: fsck_hfs completed at Thu Jan 21 19:26:59 2021

/dev/rdisk2s2: ** Rechecking volume.
/dev/rdisk2s2: ** Checking Journaled HFS Plus volume.
/dev/rdisk2s2: ** Detected a case-sensitive volume.
/dev/rdisk2s2:    The volume name is BAK2020--
/dev/rdisk2s2: ** Checking extents overflow file.
/dev/rdisk2s2: ** Checking catalog file.
/dev/rdisk2s2: ** Checking multi-linked files.
/dev/rdisk2s2: ** Checking catalog hierarchy.
/dev/rdisk2s2: ** Checking extended attributes file.
/dev/rdisk2s2: ** Checking multi-linked directories.
/dev/rdisk2s2: ** Checking volume bitmap.
/dev/rdisk2s2: ** Checking volume information.
/dev/rdisk2s2: ** The volume BAK2020-- was repaired successfully.
/dev/rdisk2s2: fsck_hfs completed at Fri Jan 22 02:47:18 2021

... and the drive is mounted!

2021-01-18: Loop Closed

[Screenshot (2021-01-18): legacy system closed, now showing "System Admin Login Only".]

The Loop system is finally really closed. I have removed gas, electricity and network gizmos. This also means that there is not actually a dump load at the moment... I could make the router the dump, or remove the 12V adaptor entirely to save a little vampire load...

(I had been quietly continuing to capture the Loop daily electricity flow data, to review alongside the Enphase data, but even that was not available/updated this morning for my before-breakfast task...)

2021-01-17: Work Storage

(See previous work storage note and next.)

The work storage mechanism is definitely deferring work. I could see some files due a rebuild since ~the 5th, and now that we have some sunshine they are getting it. After a full VHIGH rebuild the oldest desktop page is timestamped the 13th, and there is one from the 14th. The vast majority have been rebuilt today.

[Screenshot (2021-01-17): URL inspection fails temporarily, with a scary red pop-up.]

GSC funny turn

GSC and Googlebot seemed to take a funny turn during the evening: weird repeat fetches by the latter and this pop-up on the former!

I filed a report including a fragment of my logs for a not-very-exciting page updated just once, with Gary Illyes and the Google team, for their interest, in case it reveals a bug...

www.earth.org.uk:80 66.249.69.92 - - [17/Jan/2021:05:04:49 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:06:02:34 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:06:54:37 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:06:54:38 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:07:47:57 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.92 - - [17/Jan/2021:08:52:29 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:09:57:43 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:10:53:42 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.92 - - [17/Jan/2021:12:01:33 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:12:56:07 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:14:01:36 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20741 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:15:01:37 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 200 20700 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:15:39:26 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:16:01:41 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:16:41:53 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.88 - - [17/Jan/2021:16:56:06 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
www.earth.org.uk:80 66.249.69.90 - - [17/Jan/2021:17:56:19 +0000] "GET /electricity-storage-whole-household-2018.html HTTP/1.1" 304 260 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

2021-01-11: Distribution of Page Hits

I took a look at page hits, apparently by humans not bots, and not by me. The snapshot I looked at has a roughly 9-day window. Nearly 50% of hits are amongst the top 10 pages. Over 50% of hits are amongst the top 15 pages. Most pages get no hits at all in this time window.

# Pages   % hits   Comment
     10       49   The top-ten pages get nearly half the visits/hits.
     15       56
     20       63
    140      100   All main pages with at least one hit.
    298      100   Includes some pages not counted in the stats above: ls *.html | wc -l
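The tallying itself is simple; a minimal sketch, assuming an Apache-style access log with the request path in field 7 (i.e. no leading vhost column), might be:

```shell
# Share of all hits taken by the top N pages, from an access log.
# Assumes the request path is field 7 (standard combined log format);
# add one to the field number for logs with a leading vhost column.
top_share() {
  n="$1"; log="$2"
  awk '{print $7}' "$log" | sort | uniq -c | sort -rn |
    awk -v n="$n" '{t += $1; if (NR <= n) top += $1}
                   END {printf "%.0f%%\n", 100 * top / t}'
}
```

So top_share 10 access.log would report the top-ten share, before any filtering of bots or my own visits.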

2021-01-10: Profile-driven Optimisation

Given that I have logs and know which pages and images are the most downloaded, I could make extra effort to (re)compress them, and updated versions of them, harder. For example: use an extra -m option with zopflipng, more iterations in zopfli, or notch the 'quality' slider down one place for JPEG images.
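A sketch of what that extra effort might look like, driven from an assumed popularity list (the top-assets.txt file name is invented; the zopfli/zopflipng options are the extra-effort ones mentioned above):

```shell
# Spend extra CPU only on the most-fetched assets.
# "top-assets.txt" is a hypothetical one-path-per-line popularity list.
recompress_popular() {
  while IFS= read -r f; do
    case "$f" in
      *.png)  zopflipng -m -y "$f" "$f" ;;          # -m: extra iterations
      *.html) zopfli --i1000 -c "$f" > "$f.gz" ;;   # many more iterations
    esac
  done < "$1"
}
```

The JPEG 'quality' notch would be a lossy re-encode rather than a lossless pass, so it is not shown here.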

It may also be possible to selectively omit some less-important content from popular pages, and spend more time choosing which related pages to link to.

It may also be sensible to treat warnings as errors on key pages and images, to encourage tuning them for optimal behaviour. I have set the desktop page build to do this initially, for top-ranked pages. This uses the last-archived popularity data if live data is not available, eg when working off-line.

This would reduce bandwidth for the site and for its clients, and possibly speed up page rendering and improve user experience a little, focussing CPU effort where it may gain the best results.

Not that EOU is exactly swamped with requests, but still...

Delete Facebook

Given recent events I am even less keen than I was on Facebook and its tentacles. I have removed the WhatsApp social media button from the AMP site (when its pages are next rebuilt). In due course I will create a slimmed-down Share42 set of buttons, minus Facebook, for the lite and desktop sites.

I do not think that I was getting any significant traffic via those buttons in any case.

2021-01-09: Site Soft Parameters

I have added a set of 'soft parameters' to the site build process that do not change the visible logical content of pages. They can be changed easily without forcing page rebuilds.

These parameters include such values as how many days after the last edit of a page to inject ads into it if it does not otherwise qualify.

This also allows turning on run-time debugging in some key scripts.
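For illustration, such a file could be no more than a sourceable set of shell assignments; all the names and values here are invented, not the site's actual parameters:

```shell
# Hypothetical soft-parameters file, sourced by the build scripts.
# Changing these does not alter page content, so no rebuild is forced.
ADS_INJECT_AFTER_DAYS=90   # inject ads into pages not edited for this long
SOFT_DEBUG=false           # turn on run-time debugging in key scripts
```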

2021-01-06: Hosting

Today's equation is: 1 megabit / second = 0.3285 terabytes / month

Thus EOU's outgoing FTTC connection could (if not limited by the RPi etc) nominally serve more than 5TB/month, which was my bandwidth budget for another site hosted in the US.

I am testing out a VPS with 250Mbps unmetered bandwidth, ie ~82TB/month.

This host would accommodate a DNS secondary, and a gallery.hd.org mirror.
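The arithmetic behind today's equation, and the ~82TB/month figure, can be checked with a one-liner (using a 365/12-day average month):

```shell
# Convert a sustained rate in Mbit/s to TB/month (average-month basis):
# Mbit/s * 1e6/8 bytes/s * 86400 s/day * (365/12) days, then into TB.
mbps_to_tb_month() {
  awk -v m="$1" 'BEGIN {printf "%.4f\n", m * 1e6 / 8 * 86400 * 365 / 12 / 1e12}'
}
```

mbps_to_tb_month 1 gives 0.3285, and mbps_to_tb_month 250 gives a little over 82.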

2021-01-05: Lockdown Reboot

Under the new England lockdown all four of us are at home (nearly) all day, the two kids on remote learning, and the adults WFH. Everyone will be at home for at least about seven weeks.

Today as I went out for my exercise walk a little after 1pm, the Internet connection dropped out, booting the three in the house off-line.

The Vigor2862 router is a bit flaky, so I have set up a regular weekly reboot of it. I have also set up a calendar reminder to check that the older RPi server has come back on line afterwards, since it often does not in these circumstances.
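The calendar reminder could eventually be replaced by something mechanical; a hypothetical check, run from cron a little after the router's reboot slot (the host name is a placeholder, not the real one):

```shell
# Did the old RPi come back after the weekly router reboot?
# "old-rpi.local" is an invented placeholder host name.
check_back_up() {
  if ping -c 3 -q "$1" >/dev/null 2>&1; then
    echo "$1 is back up"
  else
    echo "$1 did NOT come back: go and power-cycle it"
  fi
}
```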

2021-01-04: JavaScript Minimisation

I had a small brainwave to (marginally) reduce the weight of the first page for each new visitor. The share42 JavaScript is compact, but not minified. So I ran it through codebeautify.org/minify-js, re-inserted a slightly-trimmed version of the copyright line, made the name slightly shorter too, and generated the pre-compressed versions, to go from:

2787 share42.js
1180 share42.jsgz
 900 share42.jsbr

to:

2630 min.js
1114 min.jsgz
 852 min.jsbr

A whole 48 bytes (~5%) lopped off the brotli-compressed version, in fact a little over 5% saved from all versions!
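The pre-compression step itself is mechanical; a sketch using the zopfli and brotli CLIs, following the .jsgz/.jsbr naming above:

```shell
# Build the pre-compressed variants for content negotiation to serve.
# min.js -> min.jsgz / min.jsbr, per the naming convention above.
precompress() {
  f="$1"
  zopfli -c "$f" > "${f}gz"     # gzip-compatible, maximum-effort deflate
  brotli -Z -c "$f" > "${f}br"  # brotli at best quality (-Z = -q 11)
}
```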

Those are then svn cp-ed to the m-dot area, with slightly different names (not shorter in that case).

I should have done this ages ago!

2021-01-01: Year-end and Month-end Data Munging

At the turn of the month and the turn of the year there's quite a lot of data collection and analysis to be done. I am part-way through as I write.

Given the very grey dull day yesterday and today (I encountered a little gentle sleet while on my lockdown exercise walk) I think that few if any pages will get published until tomorrow's forced make all. Given that, and the "work storage" scheme, I have guessed the datePublished to be 2021-01-02T14:00Z, even though I am writing it 24 hours ahead of that!

I still have not created canonical versions of some of the data files from the switch to the new RPi server around August, including (old server versions):

data/powermng/202008-old.log.gz
data/OpenTRV/pubarchive/localtemp/202008-old.log.gz
data/OpenTRV/pubarchive/remote/202008-old.json.gz
data/16WWHiRes/Enphase/202008-old.log.gz
data/16WWHiRes/Enphase/202008-old.daily.production.json.gz
data/SunnyBeam/202008-old.gz

I am dealing with the SunnyBeam and main Enphase logs (data/16WWHiRes/Enphase/202008-XXX.log.gz, merging old and new with sort -u) while making annual xz logs.
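The sort -u merge is a one-liner, given that each log line starts with its timestamp (file names here just follow the -old/-new pattern above):

```shell
# Union of old- and new-server log lines, de-duplicated; because every
# line begins with its timestamp, lexical sort also restores time order.
merge_logs() {
  gzip -dc "$1" "$2" | sort -u
}
# e.g. merge_logs 202008-old.log.gz 202008-new.log.gz | xz -9 > 202008.log.xz
```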

The powermng logs were merged by splicing them at the point that the Morningstar controller was moved from old to new RPi. This is evidenced by the AL -1 indicating when the Morningstar is not connected. The splice happens at these two lines:

2020/08/21T12:30:06Z AL 1334 B1 14057 B2 -1 P 7170 BV 13857 ST VH D V A1P 18058 B1T 23 UC 100
2020/08/21T12:40:06Z AL 921 B1 13575 B2 -1 P 7643 BV 13390 ST H D h A1P 12532 B1T 23 UC 100

For the main OpenTRV sensor logs data/OpenTRV/pubarchive/remote/202008-XXX.json.gz, the old and the new are simply concatenated. The splice happens at these two lines:

[ "2020-08-21T20:08:41Z", "", {"@":"FA97A8A7B7D2D3B6","+":3,"O":2,"H|%":56,"vac|h":0} ]
[ "2020-08-21T21:12:25Z", "", {"@":"96F0CED3B4E690E8","+":14,"tS|C":0,"vC|%":21,"gE":0} ]

For the older (local-only) OpenTRV sensor logs data/OpenTRV/pubarchive/localtemp/202008-XXX.log.gz, the old and the new are again simply concatenated. The splice happens at these two lines:

2020/08/21 20:06:51Z 26.5 =F@26C8;X0;T14 7 W255 0 F255 0 W255 0 F255 0;S 14 20 c;C5
2020/08/21 21:12:49Z 26.5625 =F@26C9;X0;T15 17 W255 0 F255 0 W255 0 F255 0;S 14 20 c;C5

The new server's log for 202008 is copied to use as the canonical version for the once-per-day snapshot of all Enphase values, on the basis that a reliable merge is hard given the file format. Old and new should contain the same amount of information.

% svn cp 202008-new.daily.production.json.gz 202008.daily.production.json.gz

This completes the pending reconciliation work from September, I think!

The various -old and -new files should not be removed, to allow for other reconciliations and uses if desired.