
title: “HTTPS considered harmful”, yes, but isn’t HTTP too?
url: https://medium.com/@MattiSG/https-considered-harmful-yes-but-isnt-http-too-1ee1f4a36358
hash_url: 5c4908deae

Since David doesn’t allow public responses, I’ll try and follow his way of publishing letters in personal spaces.

Dear David,

Following up on our conversation regarding your point that HTTPS is harmful:

> Encouraging everybody to switch to HTTPS promotes a strong dependency on a third-party mafia, increases load time, makes your content inaccessible if you have any trouble renewing your certificate, prevents migrating easily from one hosting platform to another, and forces you to deal with a lot more security issues if you are hosting yourself. Even worse, once you switch there is no harmless turning back! That’s not the Web I’m aiming for.

To which I had already replied with a technical answer.

There is now a new set of arguments you’re putting forward. Here is my reply:

> When you turn an oligopoly into a monopoly, it cannot be a mafia anymore, heh.

Let’s Encrypt not being a mafia does not mean it’s not a SPOF (a single point of failure); the centralised model is bad. But the question is not “is HTTPS perfect?”, it is “is HTTPS better than HTTP?”. Maybe we do not mean the same thing by “mafia”.

> “0-RTT will reduce initial load time.” One day, maybe. But for now it’s quite limited, to say the least.

We’ll see. TLS 1.2 did get a nice push forward thanks to Let’s Encrypt. With more cloud providers, Docker images, shared Ansible configurations, and default Nginx setups, upgrading server setups goes faster than it used to.
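If you want to check that adoption for yourself, here is a minimal sketch (using Python’s standard `ssl` module; `example.org` is only a placeholder host) that reports which TLS version a given server negotiates:

```python
import socket
import ssl


def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Open a TLS connection and return the protocol version the server negotiated."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls_sock:
            return tls_sock.version()  # e.g. "TLSv1.2" or "TLSv1.3"


if __name__ == "__main__":
    # Placeholder host; point it at your own server to check your setup.
    print(negotiated_tls_version("example.org"))
```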

> “HTTP/2 is good for performance.” […] HTTPS highly impacts my time to first byte, though.

Now we get into the details of what performance means. Of course, if you look at TTFB, HTTP + TLS 1.2 is slower than plain HTTP; no argument here. Have you measured the difference in time to first meaningful paint, though? More often than not, that entails loading some CSS and images.
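For what it’s worth, a rough comparison is easy to run. This is only a sketch against the placeholder host `example.org`, with a single request, no DNS warm-up and no connection reuse, so it is nowhere near a rigorous benchmark:

```python
import http.client
import time


def time_to_first_byte(host: str, https: bool) -> float:
    """Time from connecting (TLS handshake included, when applicable) to the first body byte."""
    conn_cls = http.client.HTTPSConnection if https else http.client.HTTPConnection
    conn = conn_cls(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", "/")          # connects, then sends the request
    response = conn.getresponse()
    response.read(1)                  # first byte of the response body
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed


if __name__ == "__main__":
    print(f"HTTP : {time_to_first_byte('example.org', https=False):.3f}s")
    print(f"HTTPS: {time_to_first_byte('example.org', https=True):.3f}s")
```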

HTTP/2 multiplexing allows me to keep CSS files split by components with no additional request cost, which then allows me to leverage caching at a very granular level, greatly speeding up navigation.
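As an illustration of that pattern, here is a minimal sketch using the `httpx` library with HTTP/2 support (`pip install "httpx[http2]"`); the host and stylesheet paths are made up for the example:

```python
import asyncio

import httpx

# Hypothetical per-component stylesheets on a hypothetical host.
CSS_FILES = ["/css/header.css", "/css/nav.css", "/css/article.css", "/css/footer.css"]


async def fetch_stylesheets() -> None:
    # One HTTP/2 connection; the requests below are multiplexed on it as
    # concurrent streams instead of opening one connection per file.
    async with httpx.AsyncClient(base_url="https://example.org", http2=True) as client:
        responses = await asyncio.gather(*(client.get(path) for path in CSS_FILES))
        for path, response in zip(CSS_FILES, responses):
            print(path, response.status_code, response.http_version)


asyncio.run(fetch_stylesheets())
```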

> “You have the guarantee your content is not altered.” Except if done once downloaded.

I don’t understand your point. Views being rendered on the client-side from controlled code is one thing, intermediaries injecting trackers or serving ads within your code is another.

> “Don’t use HSTS!” I don’t get the point of providing content over HTTPS if you do not force it somehow.

“Somehow”, yes. A 301 or 302 redirect is not the same as HSTS. That was a reply to your impression that having a certificate “makes your content inaccessible if you have any trouble renewing your certificate”. Refusing to serve over HTTP while remaining able to do so if needed is not the same as explicitly forbidding recovery in case you cannot renew. One should use HSTS only if one has the resources to maintain that infrastructure properly, with recovery keys.
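To make the distinction concrete, here is a minimal sketch (using the `requests` library against the placeholder host `example.org`) that checks whether a site merely redirects to HTTPS or also commits browsers with an HSTS header:

```python
import requests

HOST = "example.org"  # placeholder; substitute the site you want to inspect

# 1. A 301/302 redirect from plain HTTP can be dropped at any time if you
#    ever need to fall back to serving over HTTP.
plain = requests.get(f"http://{HOST}/", allow_redirects=False, timeout=10)
print("HTTP status:", plain.status_code, "->", plain.headers.get("Location"))

# 2. An HSTS header pins browsers to HTTPS for max-age seconds, so there is
#    no easy way back if certificate renewal ever fails.
secure = requests.get(f"https://{HOST}/", timeout=10)
print("HSTS header:", secure.headers.get("Strict-Transport-Security", "not set"))
```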


I don’t dispute that HTTPS is overkill for many uses, especially for websites that provide read-only, low-importance information, and I do agree with your underlying expectations of simplicity and performance. My replies are only there because you used several technical arguments that I consider slightly exaggerated. You mention “not in my case”; it would then be worth describing that case more precisely in the article (though I think I see the kind of small, server-rendered, simple website you’re talking about, and that I also tend to ship).

I’d argue that there’s a flip side to the ethical arguments too, though, and those are the ones that personally convinced me to take on the burden of the added complexity.

You’re right: “That’s not the Web I’m aiming for.” But neither is the Web where corporations inject ad trackers into traffic and government agencies massively tap HTTP. And in this game, as a professional developer, I prefer to take on the complexity and work harder to optimise my content for performance in order to protect my users, and the users of the Web in general. For sure, certificate issuers are now honeypots, and we need to keep pushing for decentralised trust. But the Internet exchange points already were honeypots, and it has become far too easy for intermediaries to do anything they want with cleartext traffic for us to refuse to take care of our users’ privacy. I am glad Wikipedia is served over HTTPS, not because I’m afraid my ISP will change the birth date of Napoleon, but because I don’t want it to know which articles I’m reading at what time. The profiling power of aggregated metadata is too strong for us not to make it as hard as we can for spies to leverage it against our fellow netizens.