JAMstack vs traditional monolithic workflow
In this article, we will dive into the JAMstack workflow with a side-by-side comparison against the traditional, monolithic workflow. We will also see that JAMstack and a SPA (single-page application) are not the same thing.
I introduced the phrase prebuilt markup in the introductory article. It is a very powerful concept and promises a lot. We will see how pre-building helps us achieve an important pillar of user experience: speed.
If you haven't read the previous article of the series yet, you can find it here: JAMstack for All: An Introduction. I recommend reading it for better clarity on the history and the what and why of the JAMstack story.
The ever-changing user experience and usage
As per a report from wearesocial.com published in January 2020, roughly 53% of all web page requests come from mobile phones, while computers account for 44% of the total. The report also shows that the mobile usage share has been increasing steadily over the years.
Like me, many mobile users are impatient. They do not like waiting long for a page to load, and they hate an initial white screen or a long-running loading indicator. Depending on economy, place of living, and income, the types of mobile devices and their computational power differ.
There are still many users out there with single-core mobile devices, and there are users with high-end mobiles and great hardware support. However, speed is a common need for both classes of users. And you know what? The users with high-end mobile devices are often the more impatient ones, as they are so used to a fast environment.
As application developers, we need to account for this important user expectation and design the experience accordingly. We need to make sure the page loads fast, and we must get users the required initial bytes of page information as soon as possible.
One of the defining characteristics of JAMstack is that the markup should be prebuilt. It means we as developers can spend a few extra seconds in the build phase of an application rather than expecting a customer to get frustrated by burning those extra seconds at run time. Our customers wouldn't care at all if we spent the extra time building the application.
A typical user mindset during an application load where a lot is happening in the browser (screenshot from Matt Biilmann's JAMstack at Scale: why pre-built markup is critical presentation):
A great amount of time goes into the initial load itself, and it easily leads your users to bounce off the application or website. There is also the problem of not meeting users' expectations about the design. What if users do not find it useful and we have to roll back to the old state? Production nightmares, aren't they?
With JAMstack, the heavy lifting is done once, at build time. That takes the processing time out of the request and eventually uses less computation at run time.
The sections below show the difference in the build vs load time between the server-rendered, SPA, and JAMstack applications.
Server-rendered application
In a server-rendered application, all the heavy lifting is done by the server. The browser requests a page, and the server computes and generates it. Once done, the server sends the page to the browser as part of the response. The browser downloads the page and renders it. This cycle repeats for every page load, and everything happens over the wire every time.
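To make that request cycle concrete, here is a minimal sketch in TypeScript (Node). The `getProducts` data source and the `renderPage` markup are hypothetical; the point is that the HTML is re-computed on the origin server for every single request:

```typescript
import * as http from "http";

// Hypothetical data source; in a real app this would be a database query.
function getProducts(): string[] {
  return ["Laptop", "Phone", "Tablet"];
}

// Re-rendered for every incoming request -- the "heavy lifting" at run time.
function renderPage(products: string[]): string {
  const items = products.map((p) => `<li>${p}</li>`).join("");
  return `<html><body><h1>Products</h1><ul>${items}</ul></body></html>`;
}

const server = http.createServer((_req, res) => {
  // Computation happens at request time, on the origin server.
  const html = renderPage(getProducts());
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(html);
});

// server.listen(3000); // uncomment to serve at http://localhost:3000
```

Every user, on every navigation, pays the cost of this computation plus the round trip to the origin server.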
Single Page Application (SPA)
A single page application largely solves the above problem. The server no longer handles the page computations, routing, and request-based serving. But the problem is that a lot is now happening on the client side: we are relying on the browser and the device's power for all the computations at run time.
JAMstack
With the JAMstack prebuilt mechanism, the content is already built. As the content is already built, there is no need for an origin server at all; the content can be served from a CDN. This solves both of the problems we have seen with server-rendered apps and SPAs.
There are several advantages to pre-building the content. For example, at build time you can:
- Pull data from remote services.
- Build C into WebAssembly.
- Lint your code for accessibility (Netlify has introduced a build-time plug-in system; we will see it in a future article of the series).
- Shape the data required by the UI components at build time.
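As a minimal sketch of the first advantage, pulling data from a remote service at build time: here `fetchPosts` is a hypothetical stand-in for a real remote API call (e.g. to a headless CMS), and the script writes plain static HTML that a CDN can serve as-is:

```typescript
import * as fs from "fs";
import * as path from "path";

// Stand-in for a remote service call (e.g. a headless CMS API);
// in a real build this would be an HTTP fetch.
async function fetchPosts(): Promise<{ title: string }[]> {
  return [{ title: "JAMstack for All" }, { title: "Prebuilt markup" }];
}

// Runs once, at build time -- not on every user request.
async function build(outDir: string): Promise<string> {
  const posts = await fetchPosts();
  const items = posts.map((p) => `<li>${p.title}</li>`).join("\n");
  const html = `<html><body><h1>Blog</h1><ul>\n${items}\n</ul></body></html>`;
  fs.mkdirSync(outDir, { recursive: true });
  const outFile = path.join(outDir, "index.html");
  // The generated file is plain static markup, ready to be pushed to a CDN.
  fs.writeFileSync(outFile, html);
  return outFile;
}

build("dist").then((file) => console.log(`Prebuilt ${file}`));
```

This is essentially what a static site generator does for you at scale: the data fetching and markup generation happen once per deploy, not once per visitor.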
How about the rollback issue we spoke about above? We will learn about it shortly.
Now that we know the benefits of pre-building the app, we need to talk a bit about the CDN, i.e., the content delivery network. There is actually not much advantage if prebuilt markup is served from an origin server; it would be almost the same as a server-rendered application.
Let us take an example. Assume that the application content is being served from the origin server abcd.com, located in some part of the USA. A user like me, from India, wants to access a page from abcd.com. Depending on my network speed, my hardware capabilities, the distance between my browser and the origin server, etc., rendering this page in my browser may be a poor experience.
How about having the page (or the content) hosted in my proximity, in a secure manner? This is where the CDN comes in.
- A CDN reduces the distance between users and website resources.
- A CDN reduces the amount of data to be transferred, using minification techniques.
- A CDN helps with cache invalidation, so users do not see stale data.
- A CDN is secure.
Traditional vs JAMstack workflow
I am sure we have a good grounding in prebuilt content and CDNs by now. With that, let us understand some critical differences between a traditional workflow and the JAMstack workflow.
In a traditional client-server workflow,
- Developers write code, test, and build the application.
- The application is shipped to a server (the origin server).
- Users request a resource from the origin server by specifying a URL.
- The origin server does the computations, produces the required HTML, and sends it to the user. When the user requests a new resource, the process continues.
In the JAMstack workflow,
- Developers write code and push it to a source repository like git.
- A build workflow kicks off, which starts the build to create the prebuilt content.
- The prebuilt content then gets deployed to a CDN.
- Users request the resources from the CDN (available in their proximity), and the prebuilt content is served. There is no need to reach out to an origin server.
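On Netlify, the build-and-deploy steps above are usually driven by a small configuration file. Here is a minimal sketch of a netlify.toml; the build command and publish directory are assumptions that vary from project to project:

```toml
# Netlify runs this command at build time and deploys
# everything in the publish directory to its CDN.
[build]
  command = "npm run build"   # e.g. a Gatsby or other SSG build
  publish = "public"          # the folder containing the prebuilt markup
```

Once this is in place, every push to the repository triggers the build and a fresh deploy to the CDN automatically.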
It is also easy to manage customer expectations with JAMstack. Reverting a fix or rolling back to a specific state of the application is difficult with the traditional approach: it requires planning a release, onboarding developers, testers, and DevOps, building the entire application again, and finally shipping it to the server.
With JAMstack, this is managed very easily. Here is an example from Netlify, where all my branch deploys are available for me to switch between and serve the application from with a single click. We will learn about this workflow in detail later in the series.
- Matt Biilmann - JAMstack at Scale: why pre-built markup is critical
- JAMstack best practices
- What is CDN
Great, so now we know all about JAMstack and its fundamentals. I hope it was useful to you. In the next article, we will see how Static Site Generators (SSG) and a Headless CMS are used together.
We will go through a step-by-step way to build a JAMstack application using gatsbyjs, tie it to the git workflow, and then deploy it with netlify. We will also learn to manage our app's content using a CMS. It is going to be fun building on all the concepts we have learned so far. Stay tuned!
If it was useful to you, please Like/Share so that it reaches others as well. To get an email notification on my latest posts, please subscribe to my blog by hitting the Subscribe button at the top of the page. You can also follow me on Twitter @tapasadhikary.