Performance

Parcel 2 is getting a 10x compiler speedup (thanks, Rust!)

The Parcel team is excited to release Parcel 2 beta 3! This release includes a ground-up rewrite of our JavaScript compiler in Rust, which improves overall build performance by up to 10x. In addition, this post will cover some other improvements we’ve made to Parcel since our last update, along with our roadmap to a stable Parcel 2 release.

A growing trend in the JS tooling world is to replace bits and pieces with Rust or Go where it makes sense and reap the performance benefits. Congrats to the Parcel team on epic results from this rewriting effort.


Speed is the killer feature

Brad Dickason:

… teams consistently overlook speed. Instead, they add more features (which ironically make things slower). Products bloat over time and performance goes downhill.

New features might help your users accomplish something extra in your product. Latency stops your users from doing the job they already hire your product for.

A slow UI acts like tiny papercuts. Every time we have to wait, we get impatient, frustrated, and lose our flow.


How the V8 team made JS calls faster with this clever trick

Victor Gomes details the elegant hack (in the best sense of the word) he and the V8 team came up with to significantly increase V8’s JavaScript function call performance (by up to 40% in some cases).

Until recently, V8 had a special machinery to deal with arguments size mismatch: the arguments adaptor frame. Unfortunately, argument adaption comes at a performance cost, but is commonly needed in modern front-end and middleware frameworks. It turns out that, with a clever trick, we can remove this extra frame, simplify the V8 codebase and get rid of almost the entire overhead.

A fascinating read and fantastic performance improvements for all to enjoy.
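The "arguments size mismatch" in question is ordinary JavaScript under- and over-application, which the engine must reconcile on every such call. A minimal sketch (hypothetical code, not from the post) of the calling patterns that used to trigger the adaptor frame:

```javascript
// JavaScript permits calling a function with more or fewer arguments
// than it declares; historically V8 reconciled the mismatch with an
// arguments adaptor frame on every such call.
function add(a, b) {
  // Default missing parameters so under-application still works.
  return (a ?? 0) + (b ?? 0);
}

console.log(add(1, 2));    // exact arity -> 3
console.log(add(1));       // under-application: b is undefined -> 1
console.log(add(1, 2, 9)); // over-application: extra argument ignored -> 3
```

Callback-heavy framework code hits the under- and over-application paths constantly, which is why removing the adaptor frame pays off so broadly.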


Why wasn't Ruby 3 faster?

Noah Gibbs tries to reason through why some folks are disappointed in Ruby 3’s lack of speed improvements:

I think some of the problem was misplaced expectations. People didn’t understand what “three times faster” was supposed to mean. I don’t think people thought it through, but I also don’t think it was communicated very clearly.

So: some people understood what was promised, and some people didn’t.

What was promised?

I think Noah hits on a lot of solid points here.

 Itamar Turner-Trauring

CI for performance: Reliable benchmarking in noisy environments

Benchmarking is often not done in CI because it’s so hard to get consistent results; there’s a lot of noise in cloud VMs, so you ideally want dedicated hardware. But, it turns out you can use a tool called Cachegrind to get consistent benchmark results across different computers, allowing you to run benchmarks in GitHub Actions, GitLab CI, etc. and still get consistent results.

CSS-Tricks

Comparing static site generator build times

Sean C Davis writing on CSS-Tricks:

A colleague of mine built a static site generator evaluation cheatsheet. It provides a really nice snapshot across numerous popular SSG choices. What’s missing is how they actually perform in action.

Sean set out to test six of the most popular SSGs on the market today. The results are somewhat expected (Hugo is super fast), but there are some surprises in there as well (Hugo scales poorly, but it doesn’t matter much because it’s so fast).


Zach Leatherman

Use Speedlify to continuously measure site performance

Zach Leatherman:

Instantaneous measurement is a good first step. But how do we ensure that the site maintains good performance and best practices when deploys are happening every day? How do we keep the web site fast? The second step is continuous measurement. This is where Speedlify comes in. It’s an Eleventy-generated web site published as an open source repository to help automate continuous performance measurements.

Demo here.



How the most popular Chrome extensions affect browser performance

I used to be the guy with dozens of Chrome extensions. These days I limit my use of both Google Chrome and browser plugins. Performance and reliability are features I value more than what most plugins have on offer.

That being said, if you have a lot of extensions and you’re curious which ones might be bogging down your machine’s resources, this is a great analysis of the top 1000.


Achiel van der Mandele (Cloudflare)

Cloudflare launches a new speed test

There’s a new speed test in town…

With many people being forced to work from home, there’s increased load on consumer ISPs. You may be asking yourself: how well is my ISP performing with even more traffic? Today we’re announcing the general availability of our new speed test, a way to gain meaningful insights into exactly how well your network is performing.

We’ve seen a massive shift from users accessing the Internet from busy office districts to spread out urban areas. Although there are a slew of speed testing tools out there, none of them give you precise insights into how they came to those measurements and how they map to real-world performance.


Christian Scott

Making Rust as fast as Go (fake news)

Is this more proof of Cunningham’s law? It says, “The best way to get the right answer on the Internet is not to ask a question; it’s to post the wrong answer.”

Update: as some keen HN commenters have pointed out, it looks like the Rust program is not actually equivalent to the Go program. The Go program parses the string once, while the Rust program parses it repeatedly inside every loop. It’s quite late in Sydney as I write this so I’m not up for a fix right now, but this post is probably Fake News.

Read Christian’s post then have fun in the comments and discussions on HN and Reddit analyzing his hypothesis, which includes a repo of code to backup his ideas.
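The pitfall generalizes to any language pair: if one version repeats work inside the hot loop that the other hoists out, the benchmark isn’t comparing languages at all. A hypothetical JavaScript sketch of the same class of bug:

```javascript
// Both functions compute the same answer, but `slow` re-parses the
// JSON on every iteration (like the Rust version as written), while
// `fast` parses once up front (like the Go version).
function slow(jsonLine, iters) {
  let total = 0;
  for (let i = 0; i < iters; i++) {
    total += JSON.parse(jsonLine).n; // repeated work inside the loop
  }
  return total;
}

function fast(jsonLine, iters) {
  const n = JSON.parse(jsonLine).n; // hoisted out of the loop
  let total = 0;
  for (let i = 0; i < iters; i++) total += n;
  return total;
}

console.log(slow('{"n": 2}', 3)); // 6
console.log(fast('{"n": 2}', 3)); // 6, with one parse instead of three
```

Identical outputs, very different work per iteration, which is exactly why the two benchmarks weren’t comparable.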

Addy Osmani

Automating web performance testing with Puppeteer 🎪

Addy Osmani has created an excellent resource for all developers interested in optimizing their web performance (which should be pretty much all of us).

You’ve probably heard of Puppeteer, which lets you control Chromium headlessly over the DevTools protocol. This repo shows you how to use Puppeteer to automate performance measurement, such as getting a performance trace for a page load:

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Drag and drop this JSON file to the DevTools Performance panel!
  await page.tracing.start({path: 'profile.json'});
  await page.goto('https://example.com'); // placeholder URL; the original link was elided
  await page.tracing.stop();
  await browser.close();
})();
Running this produces a trace file you can load into the DevTools Performance panel for inspection.



The cost of JavaScript frameworks

We all know our users pay a cost when we push our JS framework into their browsers. Now, thanks to Tim Kadlec doing the yeoman’s work of crunching the numbers, we can approximate just how much that cost really is.

There is no faster (pun intended) way to slow down a site than to use a bunch of JavaScript. The thing about JavaScript is you end up paying a performance tax no less than four times:

  1. The cost of downloading the file on the network
  2. The cost of parsing and compiling the uncompressed file once downloaded
  3. The cost of executing the JavaScript
  4. The memory cost

Thanks to HTTP Archive, we can figure that out.

I’m pretty happy with how sites using jQuery size up. Granted, it’s not really a UI framework like the others are, but you have to imagine that many of those sites also use jQuery UI and their overall cost still compares well to the more modern solutions.


The Axios API, as an 800 byte Fetch wrapper

For those searching for ways to shave a few kilobytes off their bundles, that’s less than 1/5th of Axios’ size. This is made possible by using the browser’s native Fetch API, which is supported in all modern browsers and polyfilled by most tools, including Next.js, Create React App, and Preact CLI.

Of course, you could always use Axios directly if/when you can justify the dependency.
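To make the idea concrete, here is a minimal sketch (not the actual library’s code) of wrapping fetch in an Axios-flavored API: resolve with a `{ data, status }` object, auto-serialize JSON, and reject on non-2xx responses, which bare fetch does not do.

```javascript
// An Axios-style wrapper over the native fetch API (a sketch, not the
// real library). Works in browsers and Node 18+.
async function request(url, { method = 'GET', data, headers = {} } = {}) {
  const res = await fetch(url, {
    method,
    headers: data !== undefined
      ? { 'content-type': 'application/json', ...headers }
      : headers,
    body: data !== undefined ? JSON.stringify(data) : undefined,
  });
  const text = await res.text();
  let parsed;
  try { parsed = JSON.parse(text); } catch { parsed = text; }
  const response = { data: parsed, status: res.status, headers: res.headers };
  // Axios rejects on HTTP error statuses; fetch alone does not.
  if (!res.ok) throw response;
  return response;
}

request.get = (url, config) => request(url, config);
request.post = (url, data, config) =>
  request(url, { ...config, method: 'POST', data });
```

Usage then mirrors Axios: `const { data } = await request.get('/api/user')`.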

Eric Meyer

It’s time to get static

Eric Meyer says…

If you are in charge of a web site that provides even slightly important information, or important services, it’s time to get static.

…too many sites are already crashing because their CMSes can’t keep up with the traffic surges. And too many sites are using dynamic frameworks that drain mobile batteries and shut out people with older browsers. That’s annoying and counter-productive in the best of times, but right now, it’s unacceptable.


An extremely fast JavaScript bundler and minifier

Why build another JavaScript build tool? The current build tools for the web are at least an order of magnitude slower than they should be. I’m hoping that this project serves as an “existence proof” that our JavaScript tooling can be much, much faster.

According to the author, esbuild is fast because:

  1. it’s written in Go
  2. much of the work is done in parallel
  3. it takes very few passes and avoids data transformations
  4. it was coded with performance in mind