Knowing When To Say No to Front-end Optimization Advice

Following Website Performance Tools to a T Leads to Headaches, False Positives, & Complexity

Abstract

I’ve used a handful of the common online performance checking tools for a few years now, like PageSpeed Insights, WebPageTest, GTmetrix, & Pingdom (corporate name prefixes omitted). These tools have helped me understand things I once overlooked, caught errors in my approach, & generally led to better websites. However, with enough experience, you will find that some of this advice is problematic. Some of the advice is outdated. Some of it is micro-optimizations you shouldn’t worry about. Some of it recommends changes that vastly add to the complexity of your build. Some of it recommends changes that diminish your users’ security or privacy. Some of it is just flat out wrong. In this post, I’d like to highlight the advice that I, with my 13 years of front-end experience, think is wrong in the year 2023.

Good Advice

Let’s start on a good note, however, with the advice from these tools that is notably good.

Don’t use third-party fonts

WebPageTest specifically included this line. It mainly points out the DNS cost of needing to connect to another server. While that can be mitigated with preconnect directives, leaking that info to actual third parties such as Google & Adobe is a bad experience for users. To save themselves from possible data leakage, many recommend using a uBlock Origin rule like *$font,third-party, which, if not clear, blocks all third-party fonts. Normally this should have no noticeable effect as a user, since fonts are just a visual enhancement & should rarely detract from the content when the fallback fonts provided by the user agent load instead. However, too many websites are coding like it’s 2009 & using icon fonts, which are generally important to the experience (& the fallback Unicode tofu is ugly). All of this can be avoided if your site just hosts everything first-party & is done with it.

Note

This could easily be a false-ish positive where the third party is on a domain you control, but even if it is static.your.domain while the main content is on your.domain, it is still considered third-party & should be avoided, since your users should wisely be blocking third-party fonts.
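A minimal sketch of what self-hosting a font first-party can look like (the font name & file path here are placeholders):

    <style>
      /* placeholder font & path, served from your own origin:
         no extra DNS lookup, no data leaked to a font vendor */
      @font-face {
        font-family: "Example Serif";
        src: url("/fonts/example-serif.woff2") format("woff2");
        font-display: swap; /* show the user agent's fallback font while (or if) this loads */
      }
      body {
        font-family: "Example Serif", serif;
      }
    </style>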

And more generally: Minimize third-party usage

The PageSpeed Insights link to Loading Third-Party JavaScript is a pretty good post about how & why sites need to cut down on all the ads, analytics, & social this-&-thats. It even dares to state the privacy & security concerns noted above. I agree: most stuff should be first-party hosted when possible, if not removed outright, but if you must keep a third party, use async, defer, preconnect, dns-prefetch, & sandbox, & get you a solid CSP. These have pretty severe implications & you should definitely push back against marketing wanting 30 tracking scripts & audit that EZ-analytics plugin for WordPress. What’s a bit maddening is getting so close here, only for all the tools to later recommend trusting your & your users’ data to cloud providers in the form of CDNs.
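If you do have to keep a third-party script or embed around, here is a sketch of what that locking-down can look like, assuming made-up vendor domains:

    <!-- hypothetical third-party widget: warm up the connection early,
         load the script without blocking parsing -->
    <link rel="preconnect" href="https://widgets.example-vendor.com">
    <script src="https://widgets.example-vendor.com/widget.js" async></script>

    <!-- lock an embed down with sandbox so it can't wander -->
    <iframe src="https://embeds.example-vendor.com/map" sandbox="allow-scripts" title="Map embed"></iframe>

    <!-- & a CSP (normally sent as an HTTP header) that only allows those origins -->
    <meta http-equiv="Content-Security-Policy"
          content="default-src 'self'; script-src 'self' https://widgets.example-vendor.com; frame-src https://embeds.example-vendor.com">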

Anything about validating ARIA

This is stuff that generally able-bodied folks like myself have trouble understanding & doing correctly, so any tooling aiming to help us with this is awesome.

Allows users to paste into input fields

Looking at you, $MY_LOCAL_BANK.

Grain of Salt

Things that are either not that important or that have trade-offs/nuances worth considering.

Avoid serving legacy JavaScript to modern browsers

This depends on your tooling & target. While many ECMAScript 2015 (& later) features have been supported for around a decade & can result in smaller code, there is something to be said for maintaining compatibility with grandma’s Chromebook that now won’t update, or NetSurf’s limited JS engine, since after compression the size difference is not that big. The linked article quotes “legacy JavaScript is typically around 20% larger & slower than equivalent modern code”, but if you are going from 100 KiB to 80 KiB uncompressed, you’re probably not making a noticeable speed gain & you have paid a compatibility cost. If you are building a mostly informational site of static content with a couple of enhancements in the form of small scripts, aiming for compatibility is good; if you are building a medium/large web application, targeting & optimizing for the latest browsers is a great idea.
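If you do want modern output without abandoning older engines entirely, the classic compromise is the module/nomodule pattern, sketched here with hypothetical bundle names:

    <!-- modern browsers load the ES2015+ build & ignore the nomodule one -->
    <script type="module" src="/js/app.modern.js"></script>
    <!-- older browsers don't understand type="module" & fall back to the transpiled build -->
    <script nomodule src="/js/app.legacy.js"></script>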

Serve images in next-gen formats

When it comes specifically to WebP & AVIF, I would be wary. Many users hate getting WebP sent to them for anything they tend to save/share, for quality & compatibility reasons. AVIF requires hardware encoding or else the cost of generating the images is quite high (decoding is also costly, but hardware decoding is not too uncommon).

The ‘mysteriously absent’ format from this list is JPEG XL, or JXL, which can transparently compress JPEGs by 20% & also has size:quality numbers close to, & sometimes exceeding, AVIF while not taking up as many resources. The caveat here is that only Apple is doing the right thing & pushing it to production, while Mozilla sits on its hands waiting on big daddy Google, which ripped the code out of its browser despite industry leaders a) supporting the format in their image software & b) expressly trying to work together to change Google’s mind (entities that normally don’t agree, like Meta + Adobe + Krita + GIMP). Google’s internal AVIF camp takes a political stance that its format is superior, it would seem. To push back against their flawed reasoning that not enough sites were trying to send JXL content (why would they try when the monopoly browser isn’t yet supporting it?), slap a *.jxl file in your <picture> & image-set() to support Apple & show Google it was wrong to pass on a good format (seriously, imagine if this were built into Android & all your photos were transparently 20% smaller on disk, saving you space). Regardless, you should still have a PNG or JPEG fallback in any case.
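A sketch of that markup, with placeholder file names & assuming you generate a JXL alongside the JPEG:

    <picture>
      <!-- browsers with JPEG XL support (e.g. Safari) take this source -->
      <source srcset="/img/photo.jxl" type="image/jxl">
      <!-- everyone else falls back to the plain JPEG -->
      <img src="/img/photo.jpg" alt="A description of the photo" width="1200" height="800">
    </picture>

    <style>
      /* the same idea for CSS backgrounds */
      .hero {
        background-image: image-set(
          url("/img/hero.jxl") type("image/jxl"),
          url("/img/hero.jpg") type("image/jpeg")
        );
      }
    </style>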

Compress components with gzip

Compression is good, no debate here, but counter to the compatibility message in the other sections, you arguably only need to support Brotli or Zstd for your compression, with almost every client having moved beyond gzip to something better optimized; meanwhile these checks are not docking sites for skipping DEFLATE, which is even older. It’s not much work to add gzip, but it’s also not strictly ‘free’. I would be happier if these checks just verified that some compression was on & gave bonus points for supporting Brotli & Zstd (like how Mozilla Observatory gives bonus points).

Tap targets are not sized appropriately

Some of this seems inappropriate to measure. I had a 1rem font with 1.5 line spacing in a list & yet was told my links were too small. Bruh, these are like the defaults & I shouldn’t hurt my totally normal typography for this goal.

Uses HTTPS

I’m starting to come around to the idea that our websites should still host on port 80 as well. I understand that encryption is what keeps data safe, prevents snoopers from reading it & ISPs from injecting adware into sites, & that without TLS you (sadly) don’t get to leverage what HTTP/2 brings to the table, but we gotta talk about the mess that is the certificate authority system. CAs are starting to have more power than they should, & places like the EU want to force browsers to accept their own certificates, setting up their own MitM attacks for who knows what reason. Back in the days of Geocities & before, making/hosting a webpage was dead-ass simple too, but these certificate barriers have made it a lot harder for folks to just serve some content, pushing them to social media where they don’t control their data.

What I’m saying is that there are real trade-offs we don’t talk about, yet the tools ding you for choosing to skip TLS. Also adding to my mixed feelings: users now think no HTTPS means a site isn’t secure, which is true but more nuanced than the kneejerk reaction (in one instance I saw someone say they wouldn’t try darcs since the site, until just recently, was only hosted on port 80, & I similarly had the same reaction since LibreWolf gave me the big, bad warning screen over it). Even with Let’s Encrypt + ACME democratizing certificates & making them easier than ever (& free) to get, it’s just as easy for bad actors to now be considered ‘secure’ by the padlock, which might confuse some users; & even still I see expired certs due to misconfigured cron jobs making sites look broken, because even with all the work put into ACME, it’s still not that easy.

Generally Bad Advice

There are reasons behind these suggestions, but ultimately I think the majority of cases should ignore them.

CDN Usage

CDNs are just one tool to handle scale, but there are many problems. There are the obvious issues pointed out in Public CDNs Are Useless and Dangerous (which I seem to have linked a lot): privacy/security, not being as reliable as folks think, caching not working like it did a decade ago, etc. On the reliability note, if you host your assets on the same server as your HTML, then when that server is down you have an issue; but when these public CDNs go down, & all of them have, part of your site is broken.

When it comes to detecting CDN usage, many of these tools only check whether you are using one of the big, corporate services out there, which means handing user info to those big, corporate entities. These tools are generally not testing whether you set up your own CDN, nor are they able to assess whether your site is big enough to warrant the complexity of putting your assets anywhere other than your main server. Everyone might believe they’ll one day need to scale up enough to warrant a CDN, but that can be something your project tackles later in its lifecycle, when demand is close to saturating your main server. As a result of these recommendations, I see tons of sites prematurely slap Cloudflare or another CDN in front of their site, which serves me & others CAPTCHAs, automatically turns good images into WebP sludge, &, when those public CDNs inevitably have an outage, leaves the site missing all of its assets.

Paying any mind to PWA when you aren’t an app

Know your content: if it ain’t an application, you shouldn’t need to see any of these numbers. PageSpeed Insights lets you ignore the PWA stuff if you choose, but defaults, as the others do, to showing you all of this information about service workers & whatnot that is just going to slow you down. This isn’t your problem, but the tools should do a better job explaining that you can skip it in many situations.

Put JavaScript at bottom

For compatibility reasons you might choose to do this, but defer support goes back over a decade, & even where defer is not supported, the script can still execute (you could wrap main() or whatever in a DOMContentLoaded event listener if you need to be concerned with execution timing on older browsers). The big benefit of choosing defer is that, with the script in the head, the user agent gets a hint to start downloading & parsing this content early while saving execution until the document has been parsed (right before DOMContentLoaded fires). The effect is the same, but you get a processing head start that the bottom of the page won’t get you, as the browser will only see at the end that it needs to halt what it’s doing to fetch/parse/execute the script; so performance is being left on the table in practical cases.
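A sketch of the pattern, with a hypothetical script name:

    <head>
      <!-- fetched early, executed in order once the document has been parsed -->
      <script src="/js/enhancements.js" defer></script>
    </head>

And if a browser that ignores defer is genuinely in scope, the script itself can guard its entry point:

    // enhancements.js
    document.addEventListener("DOMContentLoaded", function () {
      main(); // hypothetical entry point
    });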

Suggesting AMP (Accelerated Mobile Pages)

Luckily this is dying down, but don’t let Google own the content of the internet & create its own spec.

Anything about worrying about render-blocking CSS suggesting you inline

The ‘speed up’ gained here involves a bit of wasteful data (duplicated from your CSS) & either ruins your CSP by allowing unsafe-inline or adds complexity to the server so it can stamp a nonce on the inline block. Then, figuring out what CSS is required for that “above the fold” content involves complex tooling running headless browsers, parsing/executing CSS + HTML + JS to find what styles are needed, in a way that some treat like a black box of magic but is ultimately incredibly wasteful, & hardly anyone knows how it works. Luckily, this opinion has been showing up more & Lighthouse is now less likely to ding you for this. A better solution is to keep your CSS simple, take the 10 microsecond penalty, & know that your styles are now cached just fine for subsequent page views, so it’s not the end of the world. Side story: we had automated testing around Lighthouse & this was the one check we absolutely turned off.
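For contrast, a sketch of what the two approaches ask of you (the nonce & file name are made up):

    <!-- critical-CSS inlining: machine-extracted rules duplicated from your stylesheet,
         plus a CSP nonce (or 'unsafe-inline') to permit the inline block -->
    <style nonce="r4nd0m">/* "above the fold" rules extracted by headless-browser tooling */</style>

    <!-- versus the simple, cache-friendly alternative -->
    <link rel="stylesheet" href="/css/site.css">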

My Quick Tips

Make sure it looks good in a TUI browser

w3m & elinks are my favorites, but any TUI browser should do. Checking in such a browser tends to highlight big flaws in your HTML (like putting alt text on a decorative image, not treating icons with care, duplicating content, etc.). It’s an imperfect analog, but you can make some assumptions about how bots & screen readers will approach your content by giving it a look. This will definitely prompt you to…

Include a non-generic <noscript>

This is so helpful for the many readers who allowlist their scripts or use TUI browsers, as well as for the robots doing SEO. “This app requires JavaScript” is not good enough. What app? Why does it require JavaScript? My rule of thumb, if this is for something like an SPA, is to include the name of the application as well as a short description of what the app is & does. Luckily, these tools will ‘yell at you’ for skipping your meta description, & often that same text can be good enough here. If it’s for something on a mostly static page, ideally you’d do the whole song & dance of having an image fallback, but you probably don’t have the time/resources/audience to bother with that; you can still leave a <noscript> giving a general description of what is missing, à la “This chart requires you to enable/allow JavaScript”. It’s this latter case that folks seem to get all wound up about when you say to add a <noscript>, since they think it needs to be a big futz, but in the case of JavaScript allowlisters, we just want to know we are missing out on a feature & appreciate you giving us the choice of whether to enable scripts.
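Two quick sketches, with made-up names & copy:

    <!-- for a full SPA -->
    <noscript>
      <p>ExampleBoard is a kanban-style task tracker for small teams.
      The board itself requires JavaScript to be enabled/allowed.</p>
    </noscript>

    <!-- for an enhancement on an otherwise static page -->
    <noscript><p>This chart requires you to enable/allow JavaScript.</p></noscript>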

Try to keep your tools simple, understandable

Missing out on a 5% optimization is worth it in the long term, from a maintenance standpoint, if you can actually understand how your tools work. Don’t feel bad if you aren’t hopping on a Webpack, Snowpack, Rollup, or Parcel bundler when your content is simple enough to handle with some basic scripting.

The <abbr> tag is underrated

Have jargon & abbreviations? Mark them up.
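A tiny sketch:

    <p>Set a strict <abbr title="Content Security Policy">CSP</abbr> &amp; don't fight
    the <abbr title="user agent">UA</abbr>'s defaults.</p>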

Avoid Markdown for writing blog/documentation content

Markdown has a wonky spec & does not have enough features to cover good markup; it will lead you into bad markup choices by leaving out callouts, definition lists, attributes for <img>s, <summary> + <details>, <figure>, attribution in <blockquote>s, etc. You will certainly be better served by reStructuredText or AsciiDoc, which have better community cohesion than the Wild West that is Markdown forks (they will call them “flavors” to distract you from realizing you may have no control over the spec because it is closed, and/or that it will be incompatible with other forks).
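As a concrete illustration, here is the kind of markup plain Markdown can’t express without dropping to raw HTML (the content here is made up):

    <details>
      <summary>Why not just use a public CDN?</summary>
      <p>See the CDN section above.</p>
    </details>

    <figure>
      <img src="/img/waterfall.png" alt="Request waterfall for the home page" width="800" height="400">
      <figcaption>The kind of diagram that deserves a real caption.</figcaption>
    </figure>

    <blockquote cite="https://example.com/some-post">
      <p>A quoted passage whose attribution stays in the markup.</p>
    </blockquote>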

Summary

All advice from tools & folks likely has good intentions, & you can probably assume it is good advice or the makers wouldn’t bother to say it. But as with many things online, you should consider these suggestions from both a fact & an opinion perspective, & be willing to reject the bad ideas, or at least realize that these heuristics don’t apply to what you are building.