Website Optimization Measures, Part XIX
Published on Aug 22, 2023, filed under development, design, optimization.
Web design is a process, iterating seems more valuable than redoing, and Jens likes to share what random things he's doing on his websites (hi). These are some maintenance activities from the last five months.
Improving handling of global dependencies. A happy user of Depfu (and an increasingly satisfied user of Dependabot), I keep dependencies up-to-date for all projects. However, for the (fortunately few) global packages, I couldn't say the same. As a first improvement, I've set up npm-check, including an alias, for occasional manual checks and updates.
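A minimal sketch of such a setup, assuming npm-check's documented -g (global) and -u (interactive update) flags; the alias name is made up:

# Check and interactively update globally installed packages
npm install -g npm-check
alias npm-check-g="npm-check -gu"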
Augmenting public link checks with local ones. I check for link rot on a regular basis, but the setup is manual, and it relies on the somewhat slow and limited W3C link checker. With Frontend Dogma growing and growing (now hosting more than 4,000 do-follow links), it was time to look into additional tests. As a first improvement, I'm test-driving markdown-link-check, likewise set up with an alias and run manually, and likely to be automated in the future.
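For illustration, an alias along these lines could drive such local checks (the alias name and file pattern are placeholders; markdown-link-check takes Markdown files as arguments, extracts their links, and reports dead ones):

# Run markdown-link-check over all Markdown files in the project
npm install -g markdown-link-check
alias check-links="find . -name '*.md' -exec markdown-link-check {} \;"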
Tabbing through my websites. As a sighted user, it's common to only hover over, point at, and click on things. Even with a focus on accessibility (with the team, I'm working on accessibility at Miro), this happens, at least to me. Occasionally, then, I tab around. Most of the time, given that accessibility is top of mind, things are fine. Here, too. (What are you trying to say, Jens?) But: there was an odd, unnecessary issue with link outlines in meiert.com headings. Try to spot it:
h1,
h1 a {
  margin: 15px 0 0 20px;
}

h1 a {
  margin: -15px 0 0 -20px;
}
The issue had historical reasons (the code was 18 years old) and was trivial to fix, even leading to less code. The best kind of issue, I suppose.
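A sketch of what the fix may boil down to, assuming the compound selector and the negative link margins were the culprit: spacing on the heading only, so the link and its focus outline stay put with the text.

/* Offset the heading alone; no counter-margins on the link */
h1 {
  margin: 15px 0 0 20px;
}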
Optimizing color contrast. The original design of Frontend Dogma used orange (#f08713) as the brand color. That's a prominent color, but not one with great contrast on a bright background (roughly 2.6:1 against white, below the 4.5:1 WCAG AA minimum for normal text). After a friend flagged this given how Frontend Dogma flunked the WebAIM Million (199 times the same issue), I took the occasion to revisit contrast. Ultimately, I landed on a reddish color with better contrast. A good improvement.

Reviewing UI elements hidden on mobile. Just as with print, it's easy and may be prudent to hide elements on mobile, typically in order to use space better. But just as with print, one may need to challenge and review what one is hiding. In this particular case, I had once hidden the link to the meiert.com error program to conserve footer space. However, with a growing share of mobile visitors, this may have meant fewer opportunities to even learn that the program was there. Which led to reviewing and challenging and adding back the respective link on mobile.
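For illustration, hiding (and un-hiding) such an element typically comes down to a media query like the following; the selector and breakpoint are invented, and removing the rule restores the link on mobile:

/* Hypothetical: hide the error program link in small viewports */
@media (max-width: 600px) {
  .error-program-link {
    display: none;
  }
}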
Revisiting multi-language tag handling. On this site, I use a limited number of tags for English posts, and a similar but not identical set of tags for German posts. While the language-specific pages don't have to be identical, it makes sense for them to be similar. When it came to tag placement on post pages, though, they were quite different. Therefore, I've made them a little more similar again, by featuring tags in the "meta" section below the headings of all posts.
Re-compressing images. I have a strong habit of compressing and optimizing images (often using ImageOptim), and I automate lossless compression through Imagemin Guard. However, as I may forget things, as tooling improves, and as my taste or computing power for more aggressive optimizations grows, I re-compress image assets from time to time. That's what I did again recently, aggressively (but losslessly) re-compressing all assets. It didn't save much ("2.2% on average"), but still.
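As a sketch, and assuming Imagemin Guard's default of lossless, in-place compression, such a pass can be as simple as:

# Assumption: compresses the project's images in place, losslessly
npx imagemin-guard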
Getting set up for web sign-in through IndieAuth. I set meiert.com up with IndieLogin.com, to look into contributions to indieweb.org, i.e., for testing.
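IndieLogin.com verifies site ownership through rel="me" links pointing to profiles it can authenticate against; a sketch with placeholder URLs:

<!-- Hypothetical rel="me" links; replace with real profiles -->
<link rel="me" href="https://github.com/example">
<link rel="me" href="mailto:hello@example.com">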
Preventing AI (OpenAI) crawling. A part of a longer story: I dislike website content being shown and lifted without attribution, as with some Google search features, and I dislike it being used to train AI/ML models in a similar manner. For that reason, I blocked some of my sites from being crawled, using the two user agents named by OpenAI, GPTBot and ChatGPT-User, and double-checking against robots.txt rules. A day later, I learned about Common Crawl's CCBot, which I decided to block as well.
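In robots.txt terms, the three blocks amount to records like these:

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /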
This is a part of an open article series. Check out some of the other optimization posts!
About Me
I'm Jens (long: Jens Oliver Meiert), and I'm a web developer, manager, and author. I've been working as a technical lead and engineering manager for companies you've never heard of and companies you use every day, I'm an occasional contributor to web standards (like HTML, CSS, WCAG), and I write and review books for O'Reilly and Frontend Dogma.
I love trying things, not only in web development and engineering management, but also in other areas like philosophy. Here on meiert.com I share some of my experiences and views. (I value you being critical, interpreting charitably, and giving feedback.)