Tags: parsing


Friday, March 1st, 2024

ongoing by Tim Bray · Money Bubble

What we’re seeing is FOMO-driven dumb money thrown at technology by people who have no hope of understanding it. Just because everybody else is and because the GPTs and image generators have cool demos.

What’s going to happen, I’m pretty sure, is that AI/ML will, inevitably, disappoint; in the financial sense I mean, probably doing some useful things, maybe even a lot, but not generating the kind of profit explosions that you’d need to justify the bubble. So it’ll pop, and my bet is it takes a bunch of the finance world with it.

This is mostly about the intersection of finance, hype, and technology, but Tim mentions something that I’ve also been saying:

I’m super impressed by something nobody else seems to talk about: Prompt parsing.

Maybe it’s because I spent my formative years playing text-only adventure games, but I am way more impressed by the way generative tools do natural language parsing than I am by their output.

Tuesday, February 27th, 2024

JavaScript Bloat in 2024 @ tonsky.me

This really is a disgusting, exclusionary state of affairs.

I hate to be judgy, but I honestly wonder how the people behind some of these decisions can call themselves web developers.

Tuesday, July 4th, 2023

The Cost Of JavaScript - 2023 - YouTube

A great talk from Addy on just how damaging client-side JavaScript can be to the user experience …and what you can do about it.


Wednesday, August 19th, 2020

radEventListener: a Tale of Client-side Framework Performance | CSS-Tricks

Excellent research by Jeremy Wagner comparing the performance impact of React, Preact, and vanilla JavaScript. The results are simultaneously shocking and entirely unsurprising.

Thursday, February 6th, 2020

Hydration

As you may have noticed, I’m a fan of progressive enhancement.

It’s not cool. It’s often at odds with “modern” web development, so I end up looking like an old man yelling at a cloud to get off my lawn. Or something.

At its heart though, progressive enhancement seems fairly uncontroversial and inoffensive to me. It’s an approach. A mindset. Here’s how I describe it in Resilient Web Design:

  1. Identify core functionality.
  2. Make that functionality available using the simplest possible technology.
  3. Enhance!

Progressive enhancement makes use of the principle of least power:

Choose the least powerful language suitable for a given purpose.

That’s step two of the three-step process. But the third step is vital.
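
To make those three steps concrete, here’s a minimal sketch applied to a search form (the markup and endpoint are my own illustration, not from the book):

```html
<!-- Steps 1 and 2: the core functionality is search, and a plain old
     form delivers it in every browser, no JavaScript required. -->
<form action="/search" method="get">
  <input type="search" name="q">
  <button type="submit">Search</button>
</form>

<script>
// Step 3: enhance! If the browser supports fetch, intercept the
// submission and get results without a full page reload.
const form = document.querySelector('form');
if (form && 'fetch' in window) {
  form.addEventListener('submit', async (event) => {
    event.preventDefault();
    const query = new URLSearchParams(new FormData(form));
    const response = await fetch(form.action + '?' + query);
    document.body.insertAdjacentHTML('beforeend', await response.text());
  });
}
</script>
```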

I think a lot of the hostility towards progressive enhancement comes from a misunderstanding of that three-step process, perhaps thinking that it stops at step two. I’m sure that some have interpreted progressive enhancement as preventing developers from using the latest and greatest technology. Nothing could be further from the truth!

Taking a layered approach to building on the web gives you permission to try cutting‐edge JavaScript APIs, regardless of how many or how few browsers currently implement them.

The most common misunderstanding of progressive enhancement is that it’s inherently about JavaScript. That’s not true. You can apply progressive enhancement at every step of front-end development: HTML, CSS, and JavaScript.

But because of JavaScript’s strict error-handling model (at least compared to HTML and CSS), it’s in the JavaScript layer that the lack of a progressive enhancement mindset is most often felt.
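
A quick illustration of that difference (the element and property names here are deliberately made up):

```html
<!-- HTML: an unknown element is ignored as an element,
     but its contents still render. -->
<made-up-element>This text still appears.</made-up-element>

<style>
  /* CSS: an unrecognised declaration is skipped;
     the rest of the rule still applies. */
  p {
    colour: red;  /* ignored */
    color: blue;  /* applied */
  }
</style>

<script>
  // JavaScript: a single error halts the whole script.
  thisFunctionDoesNotExist(); // throws a ReferenceError…
  document.body.classList.add('enhanced'); // …so this never runs
</script>
```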

That’s why I was saddened by the rise of frameworks and mindsets that assume the availability of JavaScript. Single page apps generally follow this assumption. Everything is delivered via JavaScript: content, markup, styles, and behaviour.

This leads to a terrible situation for performance. The user is left staring at a blank screen, waiting for something—anything!—to appear. Browsers are optimised to stream HTML as soon as they can. Delivering your content via JavaScript rather than HTML means you’re not taking advantage of that optimisation. Your users suffer.

But I was very heartened when I saw the pendulum start to swing back the other way a bit…

Let’s say you’re using a JavaScript framework like React. But the reason you’re using it isn’t because you’re doing anything particularly complex in the browser involving state management. You might be using React because you really like the way it encourages modularity and componentisation.

A few years ago, making a single page app was pretty much the only way you could use React. For you as a developer to experience the benefits of modularity and componentisation, users had to pay the price in the payload (and fragility) of client-side JavaScript.

That’s no longer the case. Now that we can run JavaScript on the server, it’s possible to build in a modular, componentised way and still use progressive enhancement.

When I first heard about Gatsby and Next.js, I thought that was the selling point. Run React on the server; send pre-generated HTML down the wire to the user; then enhance with client-side JavaScript.

But that’s not exactly how it works. The pre-generated HTML isn’t functional. It still needs a bucketload of JavaScript before it can do anything. The actual process is: Run React on the server; send pre-generated HTML down the wire to the user; then send everything again but this time in JavaScript, bundled with the entire React library.
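
In outline, the flow looks something like this. (A simplified sketch using the React APIs of the time; the component and data names are hypothetical.)

```jsx
// server.jsx — render the component tree to an HTML string
import { renderToString } from 'react-dom/server';
import App from './App';

const data = { articles: [] }; // whatever the page needs
const html = renderToString(<App {...data} />);
// `html` goes down the wire… but none of its event handlers exist yet.

// client.jsx — then everything is shipped again as JavaScript,
// bundled with the React library itself
import { hydrate } from 'react-dom';
import App from './App';

// The same data is typically embedded in the page as JSON
// and read back out here.
hydrate(<App {...window.__DATA__} />, document.getElementById('root'));
```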

This leads to a situation for users that’s almost worse than before. Instead of staring at a blank screen, now they get HTML lickety-split—excellent! But if they try to interact with what’s on screen, they’ll find that nothing is working yet. Even worse, once the JavaScript is delivered, and is being parsed, they probably can’t even scroll—their device is too busy interpreting all that JavaScript. Your users suffer.

All your content is sent twice. First HTML is sent from the server. These days this is called “server-side rendering”, even though for decades the technical term was “serving a web page” (I’m pretty sure the rendering part happens in a browser). Then a JavaScript library—plus all your bespoke JavaScript—is loaded. Then all your content is loaded again as JSON.

So you’ve got a facade of an interface that you can’t actually interact with until a deluge of JavaScript has been loaded, parsed and executed. The term used for this stage of the process is “hydration”, which makes it sound more like a relaxing treatment from Gwyneth Paltrow than the horrible user experience it is.

The idea is that subsequent navigations—which will happen with Ajax—should be snappy. But the price has already been paid by then. The initial loading experience is jagged and frustrating.

Don’t get me wrong: server-side rendering is great …if what you’re sending from the server is functional. It’s the combination of hollow HTML sent from the server, followed by a huge browser-freezing dump of JavaScript that is an anti-pattern.

This use of server-side rendering followed by hydration feels like progressive enhancement, because it separates out the delivery of markup and scripts. But it’s missing the mindset.

The layered approach of progressive enhancement echoes the separation of concerns in the front-end stack: HTML, CSS, and JavaScript—each layer expressing more power. But while these concepts are related, they’re not interchangeable. Separating out the layers of your tech stack isn’t necessarily progressive enhancement. If you have some HTML that relies on JavaScript to be useful, then there’s no benefit in separating that HTML into a separate payload. The HTML that you initially send down the wire needs to be functional (at least at a basic level) before the JavaScript arrives.

I was a little disappointed to see Kyle Simpson—who I admire greatly—conflate separation of concerns with progressive enhancement in his talk from JSCamp 2019:

This content is here. I can see it, and it’s even styled. But I can’t click on the damn button because nothing has loaded in the JavaScript layer yet.

Anybody experienced that where you’ve been on a web page and it’s not really fully functional yet? I can see something but I can’t actually make any usage of it yet.

These are all things that cropped out of our thought process that said: “Let’s build the web in layers. Let’s deliver it progressively in layers. Because that’s morally right. We call this progressive enhancement. And let’s not worry too much about all these potential user experience flaws that may happen.”

That’s a spot-on description of server-side rendering and hydration, but it’s a gross mischaracterisation of progressive enhancement.

That button that requires JavaScript to work? That should’ve been generated with JavaScript. (For example, if you’re building a complex web app, consider sending a read-only view down the wire in HTML—then add any interactive interface elements with JavaScript in the browser.)
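
Here’s a minimal sketch of that pattern: the server sends a complete, read-only view, and the edit control only appears once the script that powers it has actually arrived, so there’s never a dead button on screen. (The markup is illustrative.)

```html
<!-- Sent from the server: a read-only view that’s useful on its own. -->
<article id="post">
  <h1>A blog post</h1>
  <p>Readable, linkable, scrollable. No JavaScript required.</p>
</article>

<script>
// The interactive control is generated by the same script that makes
// it work, so it can never appear before it’s functional.
const post = document.getElementById('post');
const button = document.createElement('button');
button.textContent = 'Edit';
button.addEventListener('click', () => {
  post.contentEditable = 'true';
});
post.append(button);
</script>
```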

If people are equating progressive enhancement with thoughtless server-side rendering and hydration, then I can see why they’d be hostile towards it.

Users would be better served with unprogressive non-enhancement:

You take some structured content, which follows the vertical flow of the document in a way that everyone understands.

Which people traverse easily by either dragging their scroll bar with their mouse, or operating the keyboard using the up and down keys, or using the spacebar.

Or if they’re using a touch device, simply flicking backwards and forwards in that easy way that we’ve all become used to. What you do is you take that, and you fucking well leave it alone.

Alas, that’s not what tools like Gatsby offer. The latest post on their blog is called Why Gatsby is better with JavaScript:

But what about sites or pages where there is no client-side interactivity? Even for those pages, Gatsby offers performance benefits by including JavaScript.

I beg to differ.

(By the way, that same blog post also initially tried to equate the performance hit of client-side JavaScript with the performance hit of images. Andy explains why that’s disingenuous.)

Hope is on the horizon for React in the form of partial hydration. I sincerely hope that it will become the default way of balancing server-side rendering with just-in-time client-side interaction.
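
The gist of partial hydration, sketched by hand (this is just the general idea, not whatever API React eventually ships): only the components that genuinely need interactivity get hydrated, and everything else stays as plain, already-functional server-rendered HTML.

```jsx
// A hand-rolled sketch of the idea. The server marks interactive
// "islands" with data attributes; the client hydrates only those.
import { hydrate } from 'react-dom';
import Comments from './Comments'; // hypothetical interactive component

const registry = { Comments };

document.querySelectorAll('[data-hydrate]').forEach((island) => {
  const Component = registry[island.dataset.hydrate];
  const props = JSON.parse(island.dataset.props || '{}');
  hydrate(<Component {...props} />, island);
});
```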

The situation we have now is the worst of both worlds: server-side rendering followed by a tsunami of hydration. It has a whiff of progressive enhancement to it (because there’s a cosmetic separation of concerns) but it has none of the user benefits.

Wednesday, January 22nd, 2020

28c3: The Science of Insecurity - YouTube

I understand less than half of this great talk by Meredith L. Patterson, but it ticks all my boxes: Leibniz, Turing, Borges, and Postel’s Law.

(via Tim Berners-Lee)


Friday, September 13th, 2019

5G Will Definitely Make the Web Slower, Maybe | Filament Group, Inc.

The Jevons Paradox in action:

Faster networks should fix our performance problems, but so far, they have had an interesting if unintentional impact on the web. This is because historically, faster network speed has enabled developers to deliver more code to users—in particular, more JavaScript code.

And because it’s JavaScript we’re talking about:

Even if folks are on a new fast network, they’re very likely choking on the code we’re sending, rendering the potential speed improvements of 5G moot.

The longer I spend in this field, the more convinced I am that web performance is not a technical problem; it’s a people problem.

Saturday, August 24th, 2019

“Never-Slow Mode” (a.k.a. “Slightly-Fast Mode”) Explained

I would very much like this to become a reality.

Never-Slow Mode (“NSM”) is a mode that sites can opt-into via HTTP header. For these sites, the browser imposes per-interaction resource limits, giving users a better user experience, potentially at the cost of extra developer work. We believe users are happier and more engaged on fast sites, and NSM attempts to make it easier for sites to guarantee speed to users. In addition to user experience benefits, sites might want to opt in because browsers could provide UI to users to indicate they are in “fast mode” (a TLS lock icon but for speed).

Tuesday, April 2nd, 2019

Idiosyncrasies of the HTML parser - The HTML Parser Book

This might just be the most nerdily specific book I’ve read and enjoyed. Even if you’re not planning to build a web browser any time soon, it’s kind of fascinating to see how HTML is parsed—and how much of an achievement the HTML spec is, for specifying consistent error-handling, if nothing else.

The last few chapters are still in progress, but you can read the whole thing online or buy an ePub version.
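
For a flavour of that consistent error-handling: give any spec-compliant parser mis-nested tags, and they all recover in exactly the same way.

```html
<!-- Input: mis-nested formatting elements -->
<p><b>one <i>two</b> three</i></p>

<!-- Every conforming parser builds the same tree, equivalent to: -->
<!-- <p><b>one <i>two</i></b><i> three</i></p> -->
```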

Saturday, February 23rd, 2019

AddyOsmani.com - JavaScript Loading Priorities in Chrome

A table showing how browsers prioritise a) the loading of JavaScript and b) the execution of JavaScript.

Saturday, December 29th, 2018

The 100 Year Web (In Praise of XML)

I don’t agree with Steven Pemberton on a lot of things—I’m not a fan of many of the Semantic Web technologies he likes, and I think that the Robustness Principle is well-suited to the web—but I always pay attention to what he has to say. I certainly share his concern that migrating everything to JavaScript is not good for interoperability:

This is why there are so few new elements in HTML5: they haven’t done any design, and instead said “if you need anything, you can always do it in Javascript”.

And they all have.

And they are all different.

Read this talk transcript, and even if you don’t agree with everything in it today, you may end up coming back to it in the future. He’s playing the long game:

The web is the way now that we distribute information. We will need the web pages we create now to be readable in 100 years time, just as we can still read 100-year-old books.

Requiring a webpage to depend on a particular 100-year-old implementation of Javascript is not exactly evidence of future-thinking.

Friday, November 23rd, 2018

Archiving web sites [LWN.net]

As it turns out, some sites are much harder to archive than others. This article goes through the process of archiving traditional web sites and shows how it falls short when confronted with the latest fashions in the single-page applications that are bloating the modern web.

Thursday, August 2nd, 2018

The Cost Of JavaScript In 2018 – Addy Osmani – Medium

Addy takes a deep, deep dive into JavaScript performance on mobile, and publishes his findings on Ev’s blog, including this cra-yay-zy suggestion:

Maybe server-side-rendered HTML would actually be faster. Consider limiting the use of client-side frameworks to pages that absolutely require them.

Monday, January 1st, 2018

Why Web Developers Need to Care about Interactivity — Philip Walton

Just to be clear, this isn’t about interaction design, it’s about how browsers can become unresponsive to interaction when they’re trying to parse the truckloads of JavaScript web developers throw at them.

Top tip: lay off the JavaScript. HTML is interactive instantly.
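
Case in point: all of this responds to the user the instant the HTML arrives, with no script in sight.

```html
<details>
  <summary>Show more</summary>
  <p>Expanding and collapsing, handled entirely by the browser.</p>
</details>

<a href="/next">Links navigate immediately.</a>

<form action="/search" method="get">
  <input type="search" name="q" required>
  <button type="submit">Search</button>
</form>
```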


Monday, November 27th, 2017

Exploring the Linguistics Behind Regular Expressions

Before reading this article, I didn’t understand regular expressions. But now, having read this article, I don’t understand regular expressions and I don’t understand linguistics. Progress!

Tuesday, October 31st, 2017

Netflix functions without client-side React, and it’s a good thing - JakeArchibald.com

A great bucketload of common sense from Jake:

Rather than copying bad examples from the history of native apps, where everything is delivered in one big lump, we should be doing a little with a little, then getting a little more and doing a little more, repeating until complete. Think about the things users are going to do when they first arrive, and deliver that. Especially consider those most-likely to arrive with empty caches.

And here’s a good way of thinking about that:

I’m a fan of progressive enhancement as it puts you in this mindset. Continually do as much as you can with what you’ve got.

All too often, saying “use the right tool for the job” is interpreted as “don’t use that tool!” but as Jake reminds us, the sign of a really good tool is its ability to adapt instead of demanding rigid usage:

Netflix uses React on the client and server, but they identified that the client-side portion wasn’t needed for the first interaction, so they leaned on what the browser can already do, and deferred client-side React. The story isn’t that they’re abandoning React, it’s that they’re able to defer it on the client until it’s needed. React folks should be championing this as a feature.

Sunday, October 29th, 2017

Can You Afford It?: Real-world Web Performance Budgets – Infrequently Noted

Alex looks at the mindset and approaches you need to adopt to make a performant site. There’s some great advice in here for setting performance budgets for JavaScript.

JavaScript is the single most expensive part of any page in ways that are a function of both network capacity and device speed. For developers and decision makers with fast phones on fast networks this is a double-whammy of hidden costs.

Friday, October 27th, 2017

Transpiled for-of Loops are Bad for the Client - daverupert.com

This story is just a personal reminder for me to repeatedly question what our tools spit out. I don’t want to be the neophobe in the room but I sometimes wonder if we’re living in a collective delusion that the current toolchain is great when it’s really just morbidly complex. More JavaScript to fix JavaScript concerns the hell out of me.

Yes! Even if you’re not interested in the details of Dave’s story of JavaScript optimisation, be sure to read his conclusion.

I am responsible for the code that goes into the machine, I do not want to shirk the responsibility of what comes out. Blind faith in tools to fix our problems is a risky choice. Maybe “risky” is the wrong word, but it certainly seems that we move the cost of our compromises to the client and we, speaking from personal experience, rarely inspect the results.

Thursday, September 14th, 2017

Deploying ES2015+ Code in Production Today — Philip Walton

The reality is transpiling and including polyfills is quickly becoming the new norm. What’s unfortunate is this means billions of users are getting trillions of bytes sent over the wire unnecessarily to browsers that would have been perfectly capable of running the untranspiled code natively.

Phil has a solution: serve up your modern JavaScript using script type="module" and put your transpiled fallback in script nomodule.

Most developers think of <script type="module"> as a way to load ES modules (and of course this is true), but <script type="module"> also has a more immediate and practical use-case—loading regular JavaScript files with ES2015+ features and knowing the browser can handle it!
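
The pattern itself is pleasingly small (filenames are placeholders):

```html
<!-- Modern browsers load this, and skip the nomodule script below. -->
<script type="module" src="main.mjs"></script>

<!-- Older browsers skip the unknown type="module", and run this as a
     normal script because they don’t recognise the nomodule attribute. -->
<script nomodule src="main.es5.js"></script>
```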