Dear Ueno: How do you make websites load so blazingly fast?

Davíð Arnarsson
Published in Ueno · Sep 10, 2019 · 6 min read


Dear Ueno is an advice column for people who for some weird reason think we know what we’re doing. Read more about all this, or check out our old advice.

From Dragos Lupascu on Twitter:

I don’t know how they do it, but absolutely every Ueno-made website is loading instantly. Blazing fast. Every website, every page. Amazing!

Davíð Arnarsson, developer at Ueno Reykjavík, is happy to take the compliment:

Hey Dragos, thanks for the compliment!

When it comes to speed, you could say we’re standing on the shoulders of giants. We use open-source tools that provide a lot of optimisations out of the box when you work, as we do, with a React-based stack.

But before we go further, let’s back up a bit, and split performance up into two categories: server-side and client-side.

Server-side performance

To measure server-side performance we like to use the “time-to-first-byte” (TTFB) metric. TTFB is the time the client (in our case, the browser) spends waiting for data from the server after sending the initial request.

[Image: the Google Chrome Dev Tools Network tab showing the TTFB of ueno.co]
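If you want to check this on your own site, the browser exposes the same timings through the Navigation Timing API. A minimal sketch you can paste into the browser console:

// Rough TTFB check using the Navigation Timing API.
const [nav] = performance.getEntriesByType('navigation');
// responseStart = when the first byte arrived, requestStart = when the request went out.
console.log('TTFB (ms):', nav.responseStart - nav.requestStart);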

For our projects we primarily use Gatsby and Next.js. Both are application frameworks based on React, but they differ in approach.

Gatsby is a static site generator: it ingests data from a data source (such as Prismic.io, a headless content management system we tend to use) and, using our React code, outputs static HTML, CSS and JavaScript. Because the output is static, no server-side processing is needed to render a request, so the TTFB tends to be very low. We deploy these files to an Amazon S3 bucket, for example, and set correct Cache-Control HTTP headers on them.
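To make that deploy step a little more concrete, here’s a rough sketch of uploading one built file with a long-lived Cache-Control header using the AWS SDK for JavaScript. The bucket name and file paths are placeholders, not our actual setup:

// deploy.js — a hypothetical single-file upload step
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({ region: 'eu-west-1' });

async function upload() {
  // Hashed asset filenames can be cached "forever"; HTML should revalidate more often.
  await s3.putObject({
    Bucket: 'my-static-site',
    Key: 'static/app.abc123.js',
    Body: fs.createReadStream('public/static/app.abc123.js'),
    ContentType: 'application/javascript',
    CacheControl: 'public, max-age=31536000, immutable',
  }).promise();
}

upload();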

Sometimes we need a running application server to implement some special logic, for example consuming some user input from a form, invoking a service with that data, and then rendering a result of some sort. In those cases we use Next.js, which renders our React application per request and enables us to do server-side logic before serving our application to the user.
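As a rough illustration of that shape, a Next.js page can run per-request logic on the server via getInitialProps. The endpoint and query parameter below are made up for the example:

import React from 'react';
import fetch from 'isomorphic-unfetch';

// Renders on the server for the initial request (and on the client for later navigations).
const Results = ({ results }) => (
  <ul>
    {results.map((item) => (
      <li key={item.id}>{item.title}</li>
    ))}
  </ul>
);

Results.getInitialProps = async ({ query }) => {
  // Hypothetical service call based on user input passed in the query string.
  const res = await fetch(`https://api.example.com/search?q=${encodeURIComponent(query.q || '')}`);
  const results = await res.json();
  return { results };
};

export default Results;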

Performing this logic can be time-consuming, so we do our best to cache the results. We use Redis as a cache store because of how easy it is to use within Heroku, our hosting environment. Heroku is a platform-as-a-service that offers a wide variety of add-ons and integrations, Redis being one of them. Heroku serves our apps via web workers and can, depending on response times (our TTFB), spin up more worker instances to share the load.

These worker instances are completely isolated from one another, meaning that if we were to use in-memory caching, each worker would need to build up a cache of its own. By storing cached values in Redis, the web workers share them: if worker A does some processing and commits a value to the cache, worker B can use that value without doing any processing of its own.
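A minimal sketch of that cache-aside pattern, assuming the ioredis client and a Heroku-provided REDIS_URL (the key name and TTL below are just illustrative):

const Redis = require('ioredis');

// REDIS_URL is provided by the Heroku Redis add-on.
const redis = new Redis(process.env.REDIS_URL);

// Cache-aside: every worker checks Redis first, so only one of them
// has to pay the cost of recomputing an expired value.
async function getCached(key, ttlSeconds, compute) {
  const cached = await redis.get(key);
  if (cached) {
    return JSON.parse(cached);
  }
  const fresh = await compute();
  await redis.set(key, JSON.stringify(fresh), 'EX', ttlSeconds);
  return fresh;
}

// Usage: the expensive work runs at most once every 5 minutes across all workers.
// const data = await getCached('homepage-data', 300, fetchHomepageData);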

We like sticking CloudFlare in front of our apps, because it gives us a global content delivery network (CDN) for our static resources without much hassle.

[From the archives: How do I pick and set up a CMS for my site?]

Client-side performance

Client-side performance is hard to measure. It’s all about how fast the user perceives the site to be. There are metrics we can use to help, of course, but more as guidance than as something concrete. Time to Interactive (TTI) is one such metric: it estimates how long it takes for the page to become reliably responsive to user input after it has rendered, based on network and main-thread activity, among other things. You can try it with Google Lighthouse, built into the Google Chrome Dev Tools under the Audits tab. Google also has an excellent section on performance in their Lighthouse documentation.

[Image: Lighthouse performance report for ueno.co — some room for improvement!]
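If you’d rather track this over time than run the audit by hand, Lighthouse can also be driven from Node. A rough sketch, with ueno.co standing in for whatever site you want to audit:

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function audit(url) {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
  });
  console.log('Performance score:', result.lhr.categories.performance.score * 100);
  console.log('TTI (ms):', result.lhr.audits['interactive'].numericValue);
  await chrome.kill();
}

audit('https://ueno.co');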

Need for speed

Our tools provide optimisations that help lower the TTI of our React apps. Both Gatsby and Next.js serve pre-rendered versions of our React apps, which reduces the time it takes to start the app in the browser and helps search engines index the content. Both frameworks also split the compiled JavaScript and CSS into separate per-page bundles, so the user only has to download the minimum required to display each page. Keeping the file size down certainly helps!
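The per-page splitting comes for free, but you can also split heavy components out of a page bundle yourself. In Next.js that might look something like this (HeavyChart is a hypothetical component):

import React from 'react';
import dynamic from 'next/dynamic';

// The heavy component is only downloaded when it actually renders,
// keeping it out of the initial page bundle.
const HeavyChart = dynamic(() => import('../components/HeavyChart'), {
  loading: () => <p>Loading chart…</p>,
});

export default () => (
  <main>
    <h1>Stats</h1>
    <HeavyChart />
  </main>
);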

Once the user has fetched the page and is happily scrolling down it, we can make another optimisation: prefetching pages. Both Gatsby and Next.js have built-in support for this. Their approaches differ slightly, but essentially they detect anchor tags that link to other pages on the site, and add this …

<link rel="prefetch" href="…" />

… to the <head> tag of the page, pointing to the JavaScript/CSS for those pages. The browser then fetches and caches these resources, so that they’re ready if the user wants to navigate to that page. Take a look at this example.
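Conceptually, you could build something similar yourself with an IntersectionObserver that watches links as they scroll into view. This simplified sketch is not what either framework does verbatim, but it captures the idea:

// Prefetch the pages behind in-viewport links (simplified sketch).
const observer = new IntersectionObserver((entries) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const link = document.createElement('link');
    link.rel = 'prefetch';
    link.href = entry.target.href; // in practice this would map to the page's JS/CSS bundles
    document.head.appendChild(link);
    observer.unobserve(entry.target);
  });
});

document.querySelectorAll('a[href^="/"]').forEach((a) => observer.observe(a));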

Image optimisation is important too, especially if your client is also uploading images to the site, for example into an image gallery. If a site is littered with large images, the user might not see anything but the site background while the images download, leading them to perceive the site as slow. To counter this we might bake (very) low-resolution versions of the images into the HTML markup with data URIs:

<img src="data:image/jpeg;base64,(insert tiny base64-encoded image here)" />

Instead of seeing absolutely nothing, the user will now see the low-resolution images first. Then, once the full-resolution images have been downloaded, we can do something nice like animating the new images in. Also, by defining the image container size in CSS before loading an image, we ensure that the page content doesn’t jump around when the image starts loading. Here is an example of image baking.
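Generating those tiny placeholders is something we can do at build time. As a sketch of the idea using the sharp image library (the file path is a placeholder):

const sharp = require('sharp');

// Produce a ~20px-wide, heavily compressed version of the image and
// encode it as a data URI we can inline straight into the markup.
async function tinyPlaceholder(file) {
  const buffer = await sharp(file)
    .resize(20)
    .jpeg({ quality: 40 })
    .toBuffer();
  return `data:image/jpeg;base64,${buffer.toString('base64')}`;
}

// tinyPlaceholder('images/hero.jpg').then(console.log);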

In some cases we extract the minimal CSS required to render the “above the fold” content into a separate <style> tag included in the <head> of the HTML document. This prevents a “flash of unstyled content”, where the markup is rendered without styles while the browser is still downloading and parsing the stylesheets.
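One way to automate that extraction is with a tool like the critical package, run against the built HTML. A sketch, where the paths and viewport size are assumptions:

const critical = require('critical');

// Inline the CSS needed for the above-the-fold content of the built page,
// and let the rest of the stylesheet load asynchronously.
critical.generate({
  base: 'public/',
  src: 'index.html',
  target: 'index.html',
  inline: true,
  width: 1300,
  height: 900,
});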

Ultimately, being mindful of the code you write yields the best results. For example, don’t do anything super computationally expensive in your application without due consideration for the users who won’t be viewing your site on the kind of powerful laptop we devs use. Remember mobile users; they matter too.

That’s more or less it. I hope this gave you an insight into what makes a Ueno site perform well!

— Davíð

Davíð Arnarsson is a developer at Ueno Reykjavík. He moves so fast that our photographers haven’t been able to take a picture of him yet, but he won’t be able to evade them forever. Stay tuned, and follow Davíð on GitHub.

Need advice? Send your question by email to hi@ueno.co or via Twitter.
Want to make fast websites? Because Ueno is hiring.
Want more email? Because Ueno has a newsletter.
Want these questions to stop? Because this was the last one.
