Website Optimization Measures, Part XIX
Published on August 22, 2023, filed under Development and Design.
Web design is a process, iterating seems more valuable than redoing, and Jens likes to share the random things he’s doing on his websites (hi). These are some maintenance activities from the last five months.
- Improving handling of global dependencies. A happy user of Depfu (and a more and more satisfied user of Dependabot), I keep dependencies up-to-date for all projects. However, for the (fortunately few) global packages, I couldn’t say the same. As a first improvement, I’ve set up npm-check, including an alias, to try out occasional manual checks and updates.
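The alias itself isn’t spelled out here; as a sketch (the alias name is made up), checking and interactively updating global packages with npm-check could look like this:

# Made-up alias; npm-check -gu inspects globally installed packages and offers interactive updates.
alias npmcg="npm-check -gu"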
- Augmenting public link checks with local ones. I check for link rot on a regular basis; however, the setup is manual, and it relies on the somewhat slow and limited W3C link checker. With Frontend Dogma growing and growing (now hosting more than 4,000 do-follow links), it was time to look into additional tests. As a first improvement, I’m test-driving markdown-link-check, likewise set up with an alias and run manually, and likely to be automated in the future.
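Again, the exact alias isn’t shown; a sketch following markdown-link-check’s documented usage (alias name and file layout assumed) could be:

# Made-up alias; checks every Markdown file in the current tree, reporting broken links only.
alias checklinks="find . -name '*.md' -print0 | xargs -0 -n1 markdown-link-check -q"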
- Tabbing through my websites. As a sighted user, I commonly only hover over, point at, and click on things. Even with a focus on accessibility (with the team, I’m working on accessibility at Miro), this happens, at least to me. Occasionally, then, I tab around. Most of the time, given that accessibility is top of mind, things are fine. Here, too. (What are you trying to say, Jens?) But there was an odd, unnecessary issue with link outlines in meiert.com headings. Try to spot it:
h1,
h1 a {
  margin: 15px 0 0 20px;
}

h1 a {
  margin: -15px 0 0 -20px;
}
The issue had historical reasons (the code was 18 years old) and was trivial to fix, even leading to less code. The best kind of issue, I suppose.
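For illustration, the negative margins on the link only cancel out what the first rule adds; one plausible simplification (a sketch under the assumption that the offset was only ever needed on the heading itself, not necessarily the exact change I made) is to drop the link rules entirely:

h1 {
  margin: 15px 0 0 20px;
}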
- Optimizing color contrast. The original design of Frontend Dogma used orange (#f08713) as the brand color. That’s a prominent color, but not one with great contrast (on a bright background). After a friend flagged this, noting how Frontend Dogma flunked the WebAIM Million (199 times the same issue), I used the occasion to revisit contrast. Ultimately, I landed on a reddish color with better contrast. A good improvement.
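For reference, by the WCAG formula, #f08713 on a white background comes out at roughly 2.6:1, well below the 4.5:1 minimum for normal text. A few lines of JavaScript are enough to check such values (a throwaway sketch, not part of any toolchain here):

// WCAG 2.x contrast ratio between two hex colors (run in a browser console or Node).
const luminance = (hex) => {
  const [r, g, b] = [0, 2, 4]
    .map((i) => parseInt(hex.slice(i + 1, i + 3), 16) / 255)
    .map((c) => (c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4));
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
};
const contrast = (a, b) => {
  const [l1, l2] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (l1 + 0.05) / (l2 + 0.05);
};
console.log(contrast('#f08713', '#ffffff').toFixed(2)); // ≈ 2.57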
- Reviewing UI elements hidden on mobile. Just as with print, it’s easy and may be prudent to hide elements on mobile, typically in order to use space better. But just as with print, one may need to challenge and review what one is hiding. In this particular case, I had once hidden the meiert.com error program to conserve footer space. However, with a growing share of mobile visitors, this may have meant fewer opportunities to even learn that the program was there. That led to reviewing and challenging the decision, and to adding the respective link back on mobile.
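The underlying pattern, shown here with a made-up selector and breakpoint rather than the actual meiert.com code, is as simple as this:

@media (max-width: 600px) {
  .footer-extras {
    display: none;
  }
}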
- Revisiting multi-language tag handling. On this site, I use a limited number of tags for English posts, and a similar but not identical set of tags for German posts. While the language-specific pages don’t have to be identical, it makes sense for them to be similar. When it came to tag placement on post pages, however, they were quite different. Therefore, I’ve made them a little more similar again, by featuring tags in the “meta” section below the headings of all posts.
- Re-compressing images. I have a strong habit of compressing and optimizing images (often using ImageOptim), and I automate lossless compression through Imagemin Guard. However, as I may forget things, as tooling improves, and as my taste or computing power for more aggressive optimizations grows, I re-compress image assets from time to time. That’s what I did again recently, aggressively (but losslessly) re-compressing all assets. It didn’t save much (“2.2% on average”), but still.
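On macOS, such a blanket pass can be as simple as handing a folder to ImageOptim (the path is illustrative, and ImageOptim needs to be installed):

open -a ImageOptim ./media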
- Getting set up for web sign-in through IndieAuth. I set meiert.com up with IndieLogin.com—to look into contributions to indieweb.org, i.e., for testing.
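For context, web sign-in through IndieLogin.com typically works by adding rel="me" links that point to a profile which links back to the site; a minimal sketch (provider and profile are placeholders, not the actual markup on meiert.com):

<!-- The linked profile must link back to this site for verification. -->
<link rel="me" href="https://github.com/example">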
- Preventing AI—OpenAI—crawling. Part of a longer story: I dislike website content being shown and lifted without attribution, as with some Google search features, and I dislike it being used in a similar manner to train AI/ML models. For that reason, I blocked some of my sites from being crawled, using the two user agents named by OpenAI, GPTBot and ChatGPT-User, and double-checking against robots.txt rules. A day later I learned about Common Crawl’s CCBot, which I decided to block as well.
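In robots.txt terms, blocking all three crawlers comes down to directives like these (a generic sketch; the actual files on my sites may look different):

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /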
This is a part of an open article series. Check out some of the other posts!
About Me
I’m Jens (long: Jens Oliver Meiert), and I’m a frontend engineering leader and tech author/publisher. I’ve worked as a technical lead for companies like Google and as an engineering manager for companies like Miro, I’m a contributor to several web standards, and I write and review books for O’Reilly and Frontend Dogma.
I love trying things, not only in web development (and engineering management), but also in other areas like philosophy. Here on meiert.com I share some of my experiences and views. (Please be critical, interpret charitably, and give feedback.)
Read More
Maybe of interest to you, too:
- Next: WebGlossary.info
- Previous: 200 Web-Based, Must-Try Web Design and Development Tools
- More under Development or Design
- More from 2023
- Most popular posts
Want a good look at web development? Try WebGlossary.info—and The Web Development Glossary 3K. With explanations and definitions for thousands of terms of web development, web design, and related fields, building on Wikipedia as well as MDN Web Docs. Available at Apple Books, Kobo, Google Play Books, and Leanpub.