title: The Content Management System of my Dreams (part 2) - The trouble with dynamic publishing
url: https://www.padawan.info/en/2023/02/the-content-management-system-of-my-dreams-part-2-the-trouble-with-dynamic-publishing.html
hash_url: f8b7c3246c
PHP, Personal Home Page, popularized dynamic publishing. It gave rise to blog engines such as b2, b2evolution and Dotclear. Speaking of history, Movable Type was the number one blog engine, loved by geeks and non-geeks alike, and by the few developers who would not burst into flames in front of Perl like sun-bathing vampires. When Six Apart made a brusque U-turn, killing the open-source version of MT to pursue questionable commercial ventures (because it was already well known that it is easier to make money out of other people’s content than by selling a CMS), a certain Matt Mullenweg saw the community’s uproar, seized the opportunity by forking b2/cafelog, and rebranded it as WordPress. The crowd of bloggers took the bait and followed in droves. In one fell swoop, Movable Type faded into history and WordPress took over the world.
Web developers got so enamored with instantaneous “dynamic publishing” that the thought of having to click one more button and wait more than a few seconds to see a code change became unbearable. It would waste their time, and by some twisted reasoning they decided that static publishing was old school and dynamic publishing was soooo in.
If it’s good for them, it must be good for everyone else, right?
This is why most “dynamic sites” will make a hundred calls to a database just to display one page to one visitor, even if its content never changes. Think about it this way: for each page view, you are assembling an entire scene, summoning actors, the sea and some clouds for style, instead of just showing a print of it.
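To make the waste concrete, here is a minimal sketch in Python (the table names and queries are made up; this is not any particular CMS): the exact same rendering code is paid for on every view in one model, and exactly once in the other.

```python
import sqlite3

# Dynamic model: every single page view re-runs the same queries,
# even though the result is identical for every visitor.
def render_home(db: sqlite3.Connection) -> str:
    posts = db.execute(
        "SELECT title, body FROM posts ORDER BY published_at DESC LIMIT 10"
    ).fetchall()
    # ...plus, on a real site, more queries for categories, tags,
    # widgets, settings, and so on, repeated for every visitor.
    return "\n".join(
        f"<article><h2>{title}</h2>{body}</article>" for title, body in posts
    )

# Static model: run the exact same code once, at publish time, and let
# the web server hand out the resulting file for every subsequent view.
def publish_home(db: sqlite3.Connection) -> None:
    with open("public/index.html", "w") as f:
        f.write(render_home(db))
```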
“Ah! But you are doing it wrong! Just put a cache in front of it. Problem solved!”
And there you have two problems. If you need an external cache in front of your CMS, you are doing it wrong.
Caching is one of the most difficult problems in IT. Cache invalidation certainly is. If the cache is added in front of the CMS, rather than being managed by the CMS itself (like MT does), you now have two different systems that need more development and maintenance work to handle changes. Your site just became more costly and more difficult to manage.
I have a concrete example of this point. A site generates its home page dynamically. Because of performance issues (it can take more than 10 seconds to generate the page), an external cache has been added in front of the site. Now the site has a cache invalidation issue that prevents a small promotion block from updating when a new item is added in the CMS. The adopted solution consists of throwing an ugly truckload of JavaScript at the visitor’s browser and having it call an API to fetch the last two promotions. Yes, this is a “dynamic” site that requires two requests and three computations, across two servers and the browser, for each view (the latter at least when the web server cache works; I won’t even talk about caching the API server’s response). I reckon that this site is wasting at least 10 times the energy and money that would be reasonable. And of course absolutely nothing can be done on it, beyond content editing, without a highly specialized developer and a complex deployment process.
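For the record, here is what the MT-style fix could look like, sketched in Python with a hypothetical hook and schema (HOME_TEMPLATE stands in for whatever templating the real CMS uses): regenerate the page when an editor saves a promotion, and there is no cache to invalidate and no API for the browser to call.

```python
# Hypothetical CMS hook and schema, for illustration only.
HOME_TEMPLATE = "<html><body>...<!--PROMOS-->...</body></html>"

def latest_promotions(db, count=2):
    return db.execute(
        "SELECT title, url FROM promotions ORDER BY created_at DESC LIMIT ?",
        (count,),
    ).fetchall()

def rebuild_home(db) -> None:
    items = "".join(
        f'<li><a href="{url}">{title}</a></li>'
        for title, url in latest_promotions(db)
    )
    with open("public/index.html", "w") as f:
        f.write(HOME_TEMPLATE.replace("<!--PROMOS-->", f"<ul>{items}</ul>"))

# Called by the CMS whenever an editor saves a promotion. The CMS knows
# exactly what changed and which pages depend on it, so there is nothing
# to invalidate: the static file is simply replaced.
def on_promotion_saved(db) -> None:
    rebuild_home(db)
```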
Another obvious self-inflicted problem is performance, which is almost always offset by throwing more horsepower in front of the slow carriage. On the Internet, nobody knows we are entertaining a whole menagerie to serve you this page.
At a time when we all must seriously think about our ecological impact and energy use, it is not OK to skip taking a step back and questioning every architectural decision that needlessly multiplies servers and computing cycles when we should know, and do, better. Laziness is half-jokingly said to be a virtue in development. It is not funny when it turns into self-centered decisions, like switching to a dynamic system because a developer or a content editor cannot be bothered to wait while a page rebuilds. A few seconds saved for one person turn into a fantastic waste of time, money and energy for everyone else.
I am serious about these aspects. There are concerning reports, most of them totally out of whack, pointing fingers at the Internet sector for its ecological waste. Beyond being suicidal (literally), it would be absolutely hypocritical to dismiss them by cherry-picking where those reports are wrong (that’s easy) while conveniently forgetting about the ecological impact of some of our decisions. We are responsible, and at some point in the near future we will be held responsible for our footprint in the climate disaster.
“But I need a dynamic site!”
Do you, really?
Think very hard about that word. What exactly is dynamic on your home page? Are you speaking about that Top News thingy? How often does it change? Are you doing this to satisfy yourself (some content editors have the same proclivity as developers to throw a tantrum because their new content does not appear instantaneously on the site)? Is this a business requirement or a real need of your users?
Of course you might need dynamic pages, for example on back-office sites, profile pages, or pages that are unique to one user. But an overwhelming majority of the pages floating in the cloud do not need to be recalculated for each view, even if they contain information that changes sometimes, and especially if that information is not unique to any visitor.
And if you do need some dynamic pages, nothing prevents you from publishing the rest as static files, especially the most visited pages of your site.
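A minimal sketch of that hybrid model, again in Python and with a hypothetical route, just to show the shape of it:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

class HybridHandler(SimpleHTTPRequestHandler):
    """Serve pre-built static files, except for truly per-user routes."""

    def do_GET(self):
        if self.path.startswith("/account/"):
            # The rare genuinely dynamic page, computed per user.
            body = b"<p>Rendered on demand, for this user only.</p>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            # Everything else is a file on disk: no database, no
            # application server, no cache to invalidate.
            super().do_GET()

if __name__ == "__main__":
    HTTPServer(("", 8000), HybridHandler).serve_forever()
```

In practice this split usually lives in the web server configuration (files served directly, only a couple of routes proxied to an application), but the principle is the same.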
Here is an easy checkpoint: if your pages show the same content to all visitors, then you do not need a dynamic site, regardless of how often that content changes.
There is an ironic turn at this point in history. Stay tuned…