Journal tags: protocol

No code

When I wrote about democratising dev, I made brief mention of the growing “no code” movement:

Personally, I would love it if the process of making websites could be democratised more. I’ve often said that my nightmare scenario for the World Wide Web would be for its fate to lie in the hands of an elite priesthood of programmers with computer science degrees. So I’m all in favour of no-code tools …in theory.

But I didn’t describe what no-code is, as I understand it.

I’m taking the term at face value to mean a mechanism for creating a website—preferably on a domain you control—without having to write anything in HTML, CSS, JavaScript, or any back-end programming language.

By that definition, something like WordPress.com (as opposed to WordPress itself) is a no-code tool:

Create any kind of website. No code, no manuals, no limits.

I’d also put Squarespace in the same category:

Start with a flexible template, then customize to fit your style and professional needs with our website builder.

And its competitor, Wix:

Discover the platform that gives you the freedom to create, design, manage and develop your web presence exactly the way you want.

Webflow provides the same kind of service, but with a heavy emphasis on marketing websites:

Your website should be a marketing asset, not an engineering challenge.

Bubble is trying to cover a broader base:

Bubble lets you create interactive, multi-user apps for desktop and mobile web browsers, including all the features you need to build a site like Facebook or Airbnb.

Whereas Carrd opts for a minimalist one-page approach:

Simple, free, fully responsive one-page sites for pretty much anything.

All of those tools emphasise that you don’t need to know how to code in order to have a professional-looking website. But there’s a parallel universe of more niche no-code tools where the emphasis is on creativity and self-expression instead of slickness and professionalism.

neocities.org:

Create your own free website. Unlimited creativity, zero ads.

mmm.page:

Make a website in 5 minutes. Messy encouraged.

hotglue.me:

unique tool for web publishing & internet samizdat

I’m kind of fascinated by these two different approaches: professional vs. expressionist.

I’ve seen people grapple with this question when they decide to have their own website. Should it be a showcase of your achievements, almost like a portfolio? Or should it be a glorious mess of imagery and poetry to reflect your creativity? Could it be both? (Is that even doable? Or desirable?)

Robin Sloan recently published his ideas—and specs—for a new internet protocol called Spring ’83:

Spring ’83 is a protocol for the transmission and display of something I am calling a “board”, which is an HTML fragment, limited to 2217 bytes, unable to execute JavaScript or load external resources, but otherwise unrestricted. Boards invite publishers to use all the richness of modern HTML and CSS. Plain text and blue links are also enthusiastically supported.
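To make that concrete, here’s a rough sketch of what a board could look like. This is my own invention, not an example from the spec: just a self-contained fragment of HTML with inline styles, no script, no external resources, and comfortably under the 2217-byte limit.

<div style="background: papayawhip; padding: 1em; font-family: Georgia, serif;">
<h1>Notes from my desk</h1>
<p>Thinking about protocols this week.</p>
<p><a href="https://example.com/">A plain blue link</a></p>
</div>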

It’s not a no-code tool (you need to publish in HTML), although someone could easily provide a no-code tool to sit on top of the protocol. Conceptually though, it feels like it’s in a similar space to the chaotic good of neocities.org, mmm.page, and hotglue.me with maybe a bit of tilde.town thrown in.

It feels like something might be in the air. With Spring ’83, the Block protocol, and other experiments, people are creating some interesting small pieces that could potentially be loosely joined. No code required.

Democratising dev

I met up with a supersmart programmer friend of mine a little while back. He was describing some work he was doing with React. He was joining up React components. There wasn’t really any problem-solving or debugging—the individual components had already been thoroughly tested. He said it felt more like construction than programming.

My immediate thought was “that should be automated.”

Or at the very least, there should be some way for just about anyone to join those pieces together rather than it requiring a supersmart programmer’s time. After all, isn’t that the promise of design systems and components—freeing us up to tackle the meaty problems instead of spending time on the plumbing?

I thought about that conversation when I was listening to Laurie’s excellent talk in Berlin last month.

When I chatted to Laurie before the talk, he was very nervous about the conclusion that he had reached and was going to share: that the time is right for web development to be automated. He figured it would be an unpopular message. Heck, even he didn’t like it.

But I reminded him that it’s as old as the web itself. I’ve seen videos from very early World Wide Web conferences where Tim Berners-Lee was railing against the idea that anyone would write HTML by hand. The whole point of his WorldWideWeb app was that anyone could create and edit web pages as easily as word processing documents. It’s almost an accident of history that HTML happened to be just easy enough—but also just powerful enough—for many people to learn and use.

Anyway, I thoroughly enjoyed Laurie’s talk. (Except for a weird bit where he dunks on people moaning about “the fundamentals”. I think it’s supposed to be punching up, but I’m not sure that’s how it came across. As Chris points out, fundamentals matter …at least when it comes to concepts like accessibility and performance. I think Laurie was trying to dunk on people moaning about fundamental technologies like languages and frameworks. Perhaps the message got muddled in the delivery.)

I guess Laurie was kind of talking about this whole “no code” thing that’s quite hot right now. Personally, I would love it if the process of making websites could be democratised more. I’ve often said that my nightmare scenario for the World Wide Web would be for its fate to lie in the hands of an elite priesthood of programmers with computer science degrees. So I’m all in favour of no-code tools …in theory.

The problem is that unless they work 100%, and always produce good, accessible, performant code, then they’re going to be another example of the law of leaky abstractions. If a no-code tool can get someone 90% of the way to what they want, that seems pretty good. But if that person then has to spend an inordinate amount of time on the remaining 10%, then all the good work of the no-code tool is somewhat wasted.

Funnily enough, the person who coined that law, Joel Spolsky, spoke right after Laurie in Berlin. The two talks made for a good double bill.

(I would link to Joel’s talk but for some reason the conference is marking the YouTube videos as unlisted. If you manage to track down a URL for the video of Joel’s talk, let me know and I’ll update this post.)

In a way, Joel was making the same point as Laurie: why is it still so hard to do something on the web that feels like it should be easily repeatable?

He used the example of putting an event online. Right now, the most convenient way to do it is to use a third-party centralised silo like Facebook. It works, but now the business model of Facebook comes along for the ride. Your event is now something to be tracked and monetised by advertisers.

You could try doing it yourself, but this is where you’ll run into the frustrations shared by Joel and Laurie. It’s still too damn hard and complicated (even though we’ve had years and years of putting events online). Despite what web developers tell themselves, making stuff for the web shouldn’t be that complicated. As Trys put it:

We kid ourselves into thinking we’re building groundbreakingly complex systems that require bleeding-edge tools, but in reality, much of what we build is a way to render two things: a list, and a single item. Here are some users, here is a user. Here are your contacts, here are your messages with that contact. There ain’t much more to it than that.

And yet here we are. You can either have the convenience of putting something on a silo like Facebook, or you can have the freedom of doing it yourself, indie web style. But you can’t have both, it seems.

This is a criticism often levelled at the indie web. The barrier to entry to having your own website is too high. It’s a valid criticism. To have your own website, you need to have some working knowledge of web hosting and at least some web technologies (like HTML).

Don’t get me wrong. I love having my own website. Like, I really love it. But I’m also well aware that it doesn’t scale. It’s unreasonable to expect someone to learn new skills just to make a web page about, say, an event they want to publicise.

That’s kind of the backstory to the project that Joel wanted to talk about: the block protocol. (Note: it has absolutely nothing to do with blockchain—it’s just an unfortunate naming collision.)

The idea behind the project is to create a kind of crowdsourced pattern library—user interfaces for creating common structures like events, photos, tables, and lists. These patterns already exist in today’s silos and content management systems, but everyone is reinventing the wheel independently. The goal of this project is to make these patterns interoperable, and therefore portable.

At first I thought that would be a classic /927 situation, but I’m pleased to see that the focus of the project is not on formats (we’ve been there and done that with microformats, RDF, schema.org, yada yada). The patterns might end up being web components or they might not. But the focus is on the interface. I think that’s a good approach.
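Just to illustrate the interface-first idea (and this is pure speculation on my part, not the block protocol’s actual markup), you could imagine dropping a ready-made event pattern into a page as a custom element:

<!-- hypothetical sketch; the real project may look nothing like this -->
<event-block name="Summer Meetup" start="2022-08-01T19:00" location="Brighton"></event-block>

The author deals with one understandable interface; the interoperable structure underneath is somebody else’s plumbing.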

That approach chimes nicely with one of the principles of the indie web:

UX and design is more important than protocols, formats, data models, schema etc. We focus on UX first, and then as we figure that out we build/develop/subset the absolutely simplest, easiest, and most minimal protocols and formats sufficient to support that UX, and nothing more. AKA UX before plumbing.

That said, I don’t think this project is a cure-all. Interoperable (portable) chunks of structured content would be great, but that’s just one part of the challenge of scaling the indie web. You also need to have somewhere to put those blocks.

Convenience isn’t the only thing you get from using a silo like Facebook, Twitter, Instagram, or Medium. You also get “free” hosting …until you don’t (see GeoCities, MySpace, and many, many more).

Wouldn’t it be great if everyone had a place on the web that they could truly call their own? Today you need to have an unnecessary degree of technical understanding to publish something at a URL you control.

I’d love to see that challenge getting tackled.

Browser defaults

I’ve been thinking about some of the default behaviours that are built into web browsers.

First off, there’s the decision that a browser makes if you enter a web address without a protocol. Let’s say you type in example.com without specifying whether you’re looking for http://example.com or https://example.com.

Browsers default to HTTP rather than HTTPS. Given that HTTP is older than HTTPS, that makes sense. But given that there’s been such a push for TLS on the web, and the huge increase in sites served over HTTPS, I wonder if it’s time to reconsider that default?

Most websites that are served over HTTPS have an automatic redirect from HTTP to HTTPS (enforced with HSTS). There’s an ever so slight performance hit from that, at least for the very first visit. If, when no protocol is specified, browsers were to attempt to reach the HTTPS port first, we’d get a little bit of a speed improvement.
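That first visit looks something like this, a simplified sketch with example.com standing in for any HTTPS-only site:

GET / HTTP/1.1
Host: example.com

HTTP/1.1 301 Moved Permanently
Location: https://example.com/

Only after following the redirect does the browser see the HSTS header in the HTTPS response (Strict-Transport-Security: max-age=31536000), which tells it to skip the insecure hop on every subsequent visit.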

But would that break any existing behaviour? I don’t know. I guess there would be a bit of a performance hit in the other direction. That is, the browser would try HTTPS first, and when that doesn’t exist, go for HTTP. Sites served only over HTTP would suffer that little bit of lag.

Whatever the default behaviour, some sites are going to pay that performance penalty. Right now it’s being paid by sites that are served over HTTPS.

Here’s another browser default that Rob mentioned recently: the viewport meta tag:

I thought I might be able to get away with omitting <meta name="viewport">. Apparently not! Maybe someday.

This all goes back to the default behaviour of Mobile Safari when the iPhone was first released. Most sites wouldn’t display correctly if one pixel were treated as one pixel. That’s because most sites were built with the assumption that they would be viewed on monitors rather than phones. Only weirdos like me were building sites without that assumption.

So the default behaviour in Mobile Safari is to assume a page width of 1024 pixels, and then shrink that down to fit on the screen …unless the developer overrides that behaviour with a viewport meta tag. That default behaviour was adopted by other mobile browsers. I think it’s a universal default.
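That override is the one line of boilerplate at the top of just about every responsive site:

<meta name="viewport" content="width=device-width, initial-scale=1">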

But the web has changed since the iPhone was released in 2007. Responsive design has swept the web. What would happen if mobile browsers were to assume width=device-width?

The viewport meta element always felt like a (proprietary) band-aid rather than a long-term solution—for one thing, it’s the kind of presentational information that belongs in CSS rather than HTML. It would be nice if we could bid it farewell.

A matter of protocol

The web is made of sugar, spice and all things nice. On closer inspection, this is what most URLs on the web are made of:

protocol://domain/path
  1. The protocol—e.g. http—followed by a colon and two slashes (for which Sir Tim apologises).
  2. The domain—e.g. adactio.com or huffduffer.com.
  3. The path—e.g. /journal/tags/nerdiness or /js/global.js.

(I’m leaving out the whole messy business of port numbers—which can be appended to the domain with a colon—because just about everything on the web is served over the default port 80.)
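Just for illustration, an explicit port slots in like so, with example.com and 8080 as stand-ins:

http://example.com:8080/path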

Most URLs on the web are either written in full as absolute URLs:

<a href="http://adactio.com/journal/tags/nerdiness">
<script src="https://huffduffer.com/js/global.js"></script>

Or else they’re written out relative to the domain, like this:

<a href="/journal/tags/nerdiness">
<script src="/js/global.js"></script>

It turns out that URLs can not only be written relative to the linking document’s domain, but they can also be written relative to the linking document’s protocol:

<a href="//adactio.com/journal/tags/nerdiness">
<script src="//huffduffer.com/js/global.js"></script>

If the linking document is being served over HTTP, then those URLs will point to http://adactio.com/journal/tags/nerdiness and http://huffduffer.com/js/global.js, but if the linking document is being served over HTTP Secure, the URLs resolve to https://adactio.com/journal/tags/nerdiness and https://huffduffer.com/js/global.js.

Writing the src attribute relative to the linking document’s protocol is something that Remy is already doing with his HTML5 shim:

<!--[if lt IE 9]>
<script src="//html5shim.googlecode.com/svn/trunk/html5.js"></script>
<![endif]-->

If you have a site that is served over both HTTP and HTTPS, and you’re linking to a Google-hosted JavaScript library—something I highly recommend—then you should probably get in the habit of writing protocol-relative URLs:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js">
</script>

This is something that HTML5 Boilerplate does by default. HTML5 Boilerplate really is a great collection of fantastically useful tips and tricks …all wrapped in a terrible, terrible name.