Jeremy Keith's Blog, page 11
May 28, 2024
Trust
In their rush to cram in “AI” “features”, it seems to me that many companies don’t actually understand why people use their products.
Google is acting as though its greatest asset is its search engine. Same with Bing.
Mozilla Developer Network is acting as though its greatest asset is its documentation. Same with Stack Overflow.
But their greatest asset is actually trust.
If I use a search engine I need to be able to trust that the filtering is good. If I look up documentation I need to trust that the information is good. I don’t expect perfection, but I also don’t expect to have to constantly be thinking “was this generated by a large language model, and if so, how can I know it’s not hallucinating?”
“But”, the apologists will respond, “the results are mostly correct! The documentation is mostly true!”
Sure, but as Terence puts it:
The intern who files most things perfectly but has, more than once, tipped an entire cup of coffee into the filing cabinet is going to be remembered as “that klutzy intern we had to fire.”
Trust is a precious commodity. It takes a long time to build trust. It takes a short time to destroy it.
I am honestly astonished that so many companies don’t seem to realise what they’re destroying.
May 25, 2024
InstAI
If you use Instagram, there may be a message buried in your notifications. It begins:
We’re getting ready to expand our AI at Meta experiences to your region.
Fuck that. Here’s the important bit:
To help bring these experiences to you, we’ll now rely on the legal basis called legitimate interests for using your information to develop and improve AI at Meta. This means that you have the right to object to how your information is used for these purposes. If your objection is honoured, it will be applied going forwards.
Follow that link and fill in the form. For the field labelled “Please tell us how this processing impacts you” I wrote:
It’s fucking rude.
That did the trick. I got an email saying:
We’ve reviewed your request and will honor your objection.
Mind you, there’s still this:
We may still process information about you to develop and improve AI at Meta, even if you object or don’t use our products and services.
May 23, 2024
Speculation rules and fears
After I wrote positively about the speculation rules API I got an email from David Cizek with some legitimate concerns. He said:
I think that this kind of feature is not good, because someone else (web publisher) decides that I (my connection, browser, device) have to do work that very often is not needed. All that blurred by blackbox algorithm in the browser.
That’s fair. My hope is that the user will indeed get more say, whether that’s at the level of the browser or the operating system. I’m thinking of a prefers-reduced-data setting, much like prefers-color-scheme or prefers-reduced-motion.
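There is in fact a prefers-reduced-data media feature in the Media Queries Level 5 draft, though at the time of writing no browser ships it without a flag. A speculative sketch of how it could be used (the .hero class is just an example):

@media (prefers-reduced-data: reduce) {
    .hero {
        /* skip the heavy background image for users
           who have asked to save data */
        background-image: none;
    }
}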
But this issue isn’t something new with speculation rules. We’ve already got service workers, which allow the site author to unilaterally declare that a bunch of pages should be downloaded.
I’m doing that for Resilient Web Design—when you visit the home page, a service worker downloads the whole site. I can justify that decision to myself because the entire site is still smaller in size than one article from Wired or the New York Times. But still, is it right that I get to make that call?
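For comparison, this is roughly what that kind of unilateral downloading looks like in a service worker. A minimal sketch, not the actual code from Resilient Web Design; the cache name and URL list are illustrative:

// pre-cache a set of pages as soon as the service worker installs
addEventListener('install', (event) => {
    event.waitUntil(
        caches.open('site-cache')
        .then((cache) => cache.addAll([
            '/',
            '/chapter1/',
            '/styles.css'
        ]))
    );
});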
So I’m very much in favour of browsers acting as true user agents—doing what’s best for the user, even in situations where that conflicts with the wishes of a site owner.
Going back to speculation rules, David asked:
Do we really need this kind of (easily turned to evil) enhancement in the current state of (web) affairs?
That question could be asked of many web technologies.
There’s always going to be a tension with any powerful browser feature. The more power it provides, the more it can be abused. Animations, service workers, speculation rules—these are all things that can be used to improve websites or they can be abused to do things the user never asked for.
Or take the elephant in the room: JavaScript.
Right now, a site owner can link to a JavaScript file that’s tens of megabytes in size, and the browser has no alternative but to download it. I’d love it if users could specify a limit. I’d love it even more if browsers shipped with a default limit, especially if that limit is related to the device and network.
I don’t think speculation rules will be abused nearly as much as client-side JavaScript is already abused.
May 22, 2024
Fluid
I really like the newly-launched website for this year’s XOXO festival. I like that the design is pretty much the same for really small screens, really large screens, and everything in between because everything just scales. It’s simultaneously a flyer, a poster, and a billboard.
Trys has written about the websites he’s noticed using fluid type and spacing: There it is again, that fluid feeling.
I know what he means. I get a similar feeling when I’m on a site that adjusts fluidly to any browser window—it feels very “webby”.
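That fluid scaling typically comes from the CSS clamp() function, which interpolates between two bounds based on the viewport. A small sketch; the specific values are just an example:

html {
    /* scale smoothly from 1rem up to 1.5rem as the viewport widens */
    font-size: clamp(1rem, 0.75rem + 1vw, 1.5rem);
}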
I’ve had this feeling before.
When responsive design was on the rise, it was a real treat to come across a responsive site. After a while, it stopped being remarkable. Now if I come across a site that isn’t responsive, it feels broken.
And now it’s a treat to come across a site that uses fluid type. But how long will it be until it feels unremarkable? How long will it be until a website that doesn’t use fluid type feels broken?
May 21, 2024
Speculation rules
There’s a new addition to the latest version of Chrome called speculation rules. This already existed before with a different syntax, but the new version makes more sense to me.
Notice that I called this an addition, not a standard. This is not a web standard, though it may become one in the future. Or it may not. It may wither on the vine and disappear (like most things that come from Google).
The gist of it is that you give the browser one or more URLs that the user is likely to navigate to. The browser can then pre-fetch or even pre-render those links, making that navigation really snappy. It’s a replacement for the abandoned link rel="prerender".
Because this is a unilateral feature, I’m not keen on shipping the code to all browsers. The old version of the API required a script element with a type value of “speculationrules”. That doesn’t do any harm to browsers that don’t support it—it’s a progressive enhancement. But unlike other progressive enhancements, this isn’t something that will just start working in those other browsers one day. I mean, it might. But until this API is an actual web standard, there’s no guarantee.
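For reference, the inline version looks like this (these are the same rules I describe below):

<script type="speculationrules">
{
    "prerender": [{
        "where": {
            "href_matches": "/*"
        },
        "eagerness": "moderate"
    }]
}
</script>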
That’s why I was pleased to see that the new version of the API allows you to use an external JSON file with your list of rules.
I say “rules”, but they’re really more like guidelines. The browser will make its own evaluation based on bandwidth, battery life, and other factors. This feature is more like srcset than source: you give the browser some options, but ultimately you can’t force it to do anything.
I’ve implemented this over on The Session. There’s a JSON file called speculationrules.json with the simplest of suggestions:
{ "prerender": [{ "where": { "href_matches": "/*" }, "eagerness": "moderate" }]}The eagerness value of ���moderate��� says that any link can be pre-rendered if the user hovers over it for 200 milliseconds (the nuclear option would be to use a value of ���immediate���).
I still need to point to that JSON file from my HTML. Usually this would be done with something like a link element, but for this particular API, I can send a response header instead:
Speculation-Rules: ���/speculationrules.json"I like that. The response header is being sent to every browser, regardless of whether they support speculation rules or not, but at least it���s just a few bytes. Those other browsers will ignore the header���they won���t download the JSON file.
Here���s the PHP I added to send that header:
header('Speculation-Rules: "/speculationrules.json"');There���s one extra thing I had to do. The JSON file needs to be served with mime-type of ���application/speculationrules+json���. Here���s how I set that up in the .conf file for The Session on Apache:
<Files "speculationrules.json">
    Header set Content-type application/speculationrules+json
</Files>

A bit of a faff, that.
You can see it in action on The Session. Open up Chrome or Edge (same same but different), fire up the dev tools and keep the network tab open while you navigate around the site. Notice how hovering over a link will trigger a new network request. Clicking on that link will get you that page lickety-split.
Mind you, in the case of The Session, the navigations were already really fast—performance is a feature—so it’s hard to gauge how much of a practical difference it makes in this case, but it still seems like a no-brainer to me: taking a few minutes to add this to your site is worth doing.
Oh, there’s one more thing to be aware of when you’re implementing speculation rules. You have the option of excluding URLs from being pre-fetched or pre-rendered. You might need to do this if you’ve got links for adding items to shopping carts, or logging the user out. But my advice would instead be: stop using GET requests for those actions!
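If you do need those exclusions, the API lets you combine conditions. A sketch; the /logout path here is hypothetical:

{
    "prerender": [{
        "where": {
            "and": [
                { "href_matches": "/*" },
                { "not": { "href_matches": "/logout" } }
            ]
        },
        "eagerness": "moderate"
    }]
}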
Most of the examples given for unsafe speculative loading conditions are textbook cases of when not to use links. Links are for navigating. They’re idempotent. For everything else, we’ve got forms.
May 17, 2024
Labels
I love libraries. I think they’re one of humanity’s greatest inventions.
My local library here in Brighton is terrific. It’s well-stocked, it’s got a welcoming atmosphere, and it’s in a great location.
But it has an information architecture problem.
Like most libraries, it’s using the Dewey Decimal system. It’s not a great system, but every classification system is going to have flaws—wherever you draw boundaries, there will be disagreement.
The Dewey Decimal class of 900 is for history and geography. Within that class, those 100 numbers (900 to 999) are further subdivided into groups of 10. For example, everything from 940 to 949 is for the history of Europe.
Dewey Decimal number 941 is for the history of the British Isles. The term “British Isles” is a geographical designation. It’s not a good geographical designation, but technically it’s not a political term. So it’s actually pretty smart to use a geographical rather than a political term for categorisation: geology moves a lot slower than politics.
But the Brighton Library is using the wrong label for their shelves. Everything under 941 is labelled “British History.”
The island of Ireland is part of the British Isles.
The Republic of Ireland is most definitely not part of Britain.
Seeing books about the history of Ireland, including post-colonial history, on a shelf labelled “British History” is …not good. Frankly, it’s offensive.
(I mentioned this situation to an English friend of mine, who said “Well, Ireland was once part of the British Empire”, to which I responded that all the books in the library about India should also be filed under “British History” by that logic.)
Just to be clear, I’m not saying there’s a problem with the library using the Dewey Decimal system. I’m saying they’re technically not using the system. They’ve deviated from the system’s labels by treating “History of the British Isles” and “British History” as synonymous.
I spoke to the library manager. They told me to write an email. I’ve written an email. We’ll see what happens.
You might think I’m being overly pedantic. That’s fair. But the fact this is happening in a library in England adds to the problem. It’s not just technically incorrect, it’s culturally clueless.
Mind you, I have noticed that quite a few English people have a somewhat fuzzy idea about the Republic of Ireland. Like, they understand it’s a different country, but they think it’s a different country in the way that Scotland is a different country, or Wales is a different country. They don’t seem to grasp that Ireland is a different country like France is a different country or Germany is a different country.
It would be charming if not for, y’know, those centuries of subjugation, exploitation, and forced starvation.
British history.
May 15, 2024
Baseline progressive enhancement
Support for view transitions for regular websites (as opposed to single-page apps) will ship in Chrome 126. As someone who’s a big fan—to put it mildly—I am very happy about this!
Hopefully Firefox and Safari won’t be too far behind. But it’s still worth adding view transitions to your website even if not every browser supports them. They’re the perfect example of a progressive enhancement.
The browsers that don’t yet support view transitions won’t be harmed in any way if you give them the CSS for view transitions. They’ll just ignore it. For users of those browsers, nothing changes.
Then when those browsers do ship support for view transitions, your website automatically gets an upgrade for those users. Code you’ve already written starts working from one day to the next.
Don’t wait, is what I’m saying.
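Opting in to cross-document view transitions is a single CSS at-rule. This is the syntax shipping in Chrome 126; browsers that don’t understand it just skip the whole block:

@view-transition {
    navigation: auto;
}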
I really like the Baseline initiative as a way to track browser support. It’s great to see it in use on MDN and Can I Use. It’s very handy having a glanceable indication of which browser features are newly available and which are widely available.
But…
Not all browser features work the same way. For features that work as progressive enhancements you don’t need to wait for them to be widely available.
Service workers. Preference queries. View transitions.
If a browser doesn’t support one of those features, that’s fine. Your website won’t break in that browser.
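That guarantee comes from feature detection. A minimal sketch of the classic guard pattern, with an assumed path for the service worker script:

// only register the service worker if the browser supports it;
// browsers that don't will skip this block entirely
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/serviceworker.js');
}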
Now that’s not true of all browser features, particularly some JavaScript APIs. If a feature is critical for your site to function then you definitely want to wait until it’s widely supported.
Baseline won’t tell you the difference between those two different kinds of features.
I don’t want Baseline to get too complicated. Like I said, I really like how it’s nice and glanceable right now. But it would be nice if there were some indication that a newly-available feature is a progressive enhancement.
For now it’s up to us to make that distinction. So don’t fall into the trap of thinking that just because a feature isn’t listed as widely-available you can’t use it yet.
Really you want to ask two questions:
1. How widely available is this feature?
2. Can this feature be used as a progressive enhancement?

If Baseline tells you that the answer to the first question is “newly-available”, move on to the second question. If the answer to that is “no, it can’t be used as a progressive enhancement”, don’t ship that feature in production just yet.
But if the answer to that second question is “hell yeah, it’s a progressive enhancement!” then go for it, regardless of the answer to the first question.
Y’know, there’s a real irony in a common misunderstanding around progressive enhancement: some people seem to think it’s about not being able to use advanced browser features. In reality it’s the opposite. Progressive enhancement allows you to use advanced browser features even before they’re widely supported.
Responsibility
My colleague Chris has written a terrific post over on the Clearleft blog: Is the planet the missing member of your project team?
Rather than hand-wringing and finger-wagging, it gets down to some practical steps that you—we—can take on every project.
Chris finishes by asking:
Let me know how you design with the environment in mind. What practical advice would you suggest?
Well, here’s something that I keep coming up against…
Chris shows that the environment can be part of project management, specifically the RACI methodology:
We list who is responsible, accountable, consulted, and informed within the project. It’s a simple exercise but the clarity is useful for identifying what expertise and input we should seek from the named individuals.
Having the planet be a proactive partner in your project ensures its needs are considered.
Whenever responsibilities are being assigned there are some things that inevitably fall through the cracks. One I’ve seen over and over again is responsibility for third-party scripts.
On the face of it this seems like another responsibility for developers. We���re talking about code here, right?
But in my experience it is never the developers adding “beacons” and other third-party embedded scripts.
Chris rightly points out:
Development decisions, visual design choices, content approach, and product strategy all contribute to the environmental impact of your website.
But what about sales and marketing? Often they’re the ones who’ll drop in a third-party script to track user journeys. That’s understandable. That’s kind of their job.
Dropping in one line of JavaScript seems like a victimless crime. It’s just one small script, right? But JavaScript can import more JavaScript. Tools like Request Map Generator can show just how much destruction third-party JavaScript can wreak:
You pop in a URL, it fetches the page and maps out all the subsequent requests in a nifty interactive diagram of circles, showing how many requests third-party scripts are themselves generating. I’ve found it to be a very effective way of showing the impact of third-party scripts to people who aren’t interested in looking at waterfall diagrams.
Just to be clear, the people adding third-party scripts to websites usually aren’t doing so maliciously. They often don’t realise the negative effect the scripts will have on performance and the environment.
As is so often the case, this isn’t a technical problem. At root it’s about understanding people’s needs (like “I need a way to see what pages are converting!”) and finding a way to meet those needs without negatively impacting the planet. A good open-minded discussion can go a long way.
So I echo Chris’s call to think about environmental impacts from the very start of a project. Establish early on who will have the ability to add third-party scripts to the site. Do all of those people understand the responsibility that gives them?
I saw this lack of foresight in action on a project recently. The front-end development was going really well and the site was going to be exceptionally performant: green Lighthouse scores across the board. But when the site went live it had tracking scripts. That meant that users needed to consent to being tracked. That meant adding another third-party script to generate a consent banner. It completely tanked the Lighthouse scores.
I’m sure the people who added the tracking scripts and consent banners thought they had no choice. But there are alternatives. There are ways to get the data you need without the intrusive surveillance and performance-wrecking JavaScript.
The problem is that it’s not the norm. “Everyone else is doing it” was the justification for Flash intros two decades ago and it’s the justification for enshittification via third-party scripts now.
It doesn���t have to be this way.
May 14, 2024
Germanity
I haven’t had this much FOMO since the total solar eclipse across North America last month. Beyond Tellerrand is happening right now in Düsseldorf. Marc always puts on an excellent event.
I can take great comfort in knowing that it’s not too long until an equally excellent event: UX London is happening next month! Three days of design excellence. And if you still haven’t got your ticket, now’s the time to snap one up. There’s a flash sale happening this week. Use the code FLASH20 to get 20% off any ticket. It’s going to be great!
Still, I wish I could’ve made it to Düsseldorf for Beyond Tellerrand.
Ironically, I’ve been in Germany for the past few days. I was down in my old stomping ground of Freiburg in the heart of the Black Forest.
It was kind of like travelling back in time for Jessica and me. We were there to celebrate with our dear friends Birgit and Schorsch who were celebrating 30 years of getting together. When Jessica and I ran the numbers we realised that it was also 30 years since we got together.
It was kind of weird though. There were people there I literally hadn’t seen in three decades. On more than one occasion I’d be looking blankly at someone and they’d be looking blankly back at me until someone said our names and we’d both experience instantaneous recognition and time dilation.
But a good time was had by all. There was a party with live bands, beer, and currywurst. Best of all though, people stuck around for a few days to just hang out and experience the delights of the Schwarzwald together. I’m not saying I can’t still party on …but I very much enjoyed the trip up into the hills the next day, and the leisurely wine-tasting in a nearby village the day after that.
And boy, did we eat well. Plenty of pretzels, sausages, and Black Forest cake of course, but Freiburg also has a fantastic market every single morning with the most amazing produce from the local region. Right now it’s the time for strawberries, asparagus, and bountiful lettuces.
Jessica and I finished the trip with a break from all the socialising. While everyone else was watching the Eurovision Song Contest we slipped away for a splendid meal at Restaurant Jacobi. It was the perfect way to wrap up a wonderful few days.
May 5, 2024
Securing client-side JavaScript
I mentioned that I overhauled the JavaScript on The Session recently. That wasn’t just so that I could mess about with HTML web components. I’d been meaning to consolidate some scripts for a while.
Some of the pages on the site had inline scripts. These were usually one-off bits of functionality. But their presence meant that my content security policy wasn’t as tight as it could’ve been.
Being a community website, The Session accepts input from its users. Literally. I do everything I can to sanitise that input. It would be ideal if I could make sure that any JavaScript that slipped by wouldn’t execute. But as long as I had my own inline scripts, my content security policy had to allow them to be executed with script-src 'unsafe-inline'.
That’s why I wanted to refactor the JavaScript on my site and move everything to external JavaScript files.
In the end I got close, but there are still one or two pages with internal scripts. But that’s okay. I found a way to have my content security policy cake and eat it.
In my content security policy header I can specify that inline scripts are allowed, but only if they have a one-time token specified.
This one-time token is called a nonce. No, really. Stop sniggering. Naming things is hard. And occasionally unintentionally hilarious.
On the server, every time a page is requested it gets sent back with a header like this:
content-security-policy: script-src 'self' 'nonce-Cbb4kxOXIChJ45yXBeaq/w=='

That gobbledegook string is generated randomly every time. I’m using PHP to do this:

base64_encode(openssl_random_pseudo_bytes(16))

Then in the HTML I use the same string in any inline scripts on the page:
<script nonce="Cbb4kxOXIChJ45yXBeaq/w==">
…
</script>

Yes, HTML officially has an attribute called nonce.
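Putting the pieces together, a minimal PHP sketch; the variable name and the trivial inline script are mine, for illustration:

<?php
// generate a fresh one-time token for this response
$nonce = base64_encode(openssl_random_pseudo_bytes(16));
// send it in the content security policy header…
header("Content-Security-Policy: script-src 'self' 'nonce-" . $nonce . "'");
?>
<script nonce="<?php echo $nonce; ?>">
// …and reuse the same token on any inline script
console.log('this inline script is allowed to run');
</script>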
It’s working a treat. The security headers for The Session are looking good. I have some more stuff in my content security policy—check out the details if you’re interested.
I initially thought I’d have to make an exception for the custom offline page on The Session. After all, that’s only going to be accessed when there is no server involved so I wouldn’t be able to generate a one-time token. And I definitely needed an inline script on that page in order to generate a list of previously-visited pages stored in a cache.
But then I realised that everything would be okay. When the offline page is cached, its headers are cached too. So the one-time token in the content security policy header still matches the one-time token used in the page.
Most pages on The Session don’t have any inline scripts. For a while, every page had an inline script in the head of the document like this:

document.documentElement.classList.add('hasJS');

This is something I’ve been doing for years: using JavaScript to add a class to the HTML. Then I can use the presence or absence of that class to show or hide elements that require JavaScript. I have another class called requiresJS that I put on any elements that need JavaScript to work (like buttons for copying to the clipboard, for example).
Then in my CSS I’d write:

:not(.hasJS) .requiresJS {
    display: none;
}

If the hasJS class isn’t set, hide any elements with the requiresJS class.
I decided to switch over to using a scripting media query:
@media (scripting: none) {
    .requiresJS {
        display: none;
    }
}

This isn’t bulletproof by any means. It doesn’t account for browser extensions that disable JavaScript and it won’t get executed at all in older browsers. But I’m okay with that. I’ve put the destructive action in the more modern CSS:
I feel that the more risky action (hiding content) should belong to the more complex selector.
This means that there are situations where elements that require JavaScript will be visible, even if JavaScript isn’t available. But I’d rather that than the other way around: if those elements were hidden from browsers that could execute JavaScript, that would be worse.