When we shifted one of our applications from server-side rendering towards client-side rendering, performance became important. With pure server-side rendering, we had given little thought to how the user perceived our application.
Since then, we have moved towards a client-side approach to improve the user experience. The stack now uses ReactJS, Laravel and VueJS. The mixture of a server-side framework and two client-side libraries made it reasonable to think that the application had opportunities to improve.
When developing applications, many developers treat performance as a field of its own. It certainly is, and the majority of the time a developer spends on performance isn't necessarily spent writing the optimizations themselves. That said, some optimizations were already within reach: the libraries we use ship with production settings of their own.
To build our entire application, we use a combination of webpack, (PHP) Composer and gulp. We chose webpack because it is stable, good at optimizing code and at splitting source code into smaller pieces (chunks). For building global stylesheets, however, webpack wasn't as useful as gulp. On top of these two, part of our stack is written in Laravel, which needs to consume the generated chunks and CSS.
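As a rough sketch (not our exact gulpfile), a global stylesheet task in gulp can be as small as this; the paths and the gulp-sass/gulp-clean-css plugins are illustrative:
// gulpfile.js (sketch) - compile and minify a global stylesheet.
const gulp = require('gulp');
const sass = require('gulp-sass');
const cleanCSS = require('gulp-clean-css');

gulp.task('global-styles', () =>
  gulp
    .src('resources/sass/global.scss') // hypothetical entry point
    .pipe(sass().on('error', sass.logError))
    .pipe(cleanCSS()) // minify for production
    .pipe(gulp.dest('public/css'))
);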
With no prior work on performance, we did not have an estimated budget. When you start optimizing an application, an estimated bundle size and a goal to work towards are important. Not only is it motivational, it also gives an indication of how small the application needs to be in order to be fast. The U.S. Web Design System has a primer to get you started with performance budgets that make sense.
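One lightweight way to encode such a budget is webpack's own performance hints, which warn (or fail the build) when a bundle grows past a threshold; the 250 kB figures below are placeholders, not our actual budget:
// webpack.base.js (excerpt, sketch) - let the build complain when we blow the budget.
module.exports = {
  // ...
  performance: {
    hints: 'warning', // use 'error' to fail the build instead
    maxEntrypointSize: 250000, // bytes allowed for the assets loaded on initial load
    maxAssetSize: 250000 // bytes allowed for any single asset
  }
};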
Lighthouse helped the team find out where performance optimizations were needed. It gave us insight into where and how to improve, and has been a helpful tool for us. Another great resource is analyzing the generated JavaScript files, commonly referred to as bundles. Using webpack-bundle-analyzer we found duplicated code that we could abstract into shared chunks.
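For reference, wiring the analyzer into a webpack config looks roughly like this (we only enable it for local builds):
// webpack config (excerpt, sketch) - emit a bundle report for local analysis.
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  // ...
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: 'static', // write an HTML report instead of starting a server
      openAnalyzer: false
    })
  ]
};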
We also used WebPageTest to measure Speed Index and general website stats in order to get an estimate for our performance budget. Snapshots of metrics like First Meaningful Paint and Time To First Byte became valuable labels for our measurements.
Route-based loading made it easier for us to provide a user interface that was friendly with respect to performance. By showing a loading spinner while the route component was being fetched, we were able to reduce the amount of code shipped to the user. We made use of a React library, react-loadable.
import React from 'react';
import ReactDOM from 'react-dom';
import Loadable from 'react-loadable';
import FidgetSpinner from './spinner';
import Landing from './landing';
const mountNode = document.getElementById('mypages');
const LoadableComponent = Loadable({
loader: () => import('./auth'),
loading: FidgetSpinner
});
ReactDOM.render(<LoadableComponent />, mountNode);
Making a PHP framework play nicely with a JavaScript library wasn't as hard as one might think. The major blocker when combining the two was making sure that performance didn't suffer. A hybrid application, using both server-side and client-side rendering, has the potential to scale well if done right.
That might sound easy, but it wasn't. An issue we faced early on was using webpack with chunk splitting to cut loading times for the compiled JavaScript assets. Laravel ships with a build tool named laravel-mix, which did the same thing for us as webpack-manifest-plugin. The difference between the two is not that big, the exception being that Laravel sets sensible defaults and a near production-ready configuration. Laravel Mix is easy to adopt, but it does not scale well to advanced builds where you want full control over resource management. Both tools support a key-to-value mapping from a bundle name to the hashed filename of the new bundle, which is what we were trying to do.
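For reference, the manifest both tools emit is just a JSON file mapping a stable bundle name to the file that was actually written, which the PHP templates can then read when rendering script and link tags; the hash below is made up:
{
  "main.js": "/main.js",
  "vendor.js": "/vendor.js",
  "auth.js": "/9c4f2b7e8d1a3c5f0b6e.js"
}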
Webpack as a standalone build tool was preferable. For instance, Laravel Mix used an older version of webpack (v3). Webpack v4 brought major performance improvements, and that alone was reason enough for us to build a custom build step. Laravel Mix didn't support Service Workers either, so an offline-first website would have been harder to implement without direct access to a webpack configuration for setting plugins and modifying assets.
When auditing performance with Lighthouse, the tradeoff between server response time and Time To First Byte is essential. In our application we did not render the landing page on the client side, which meant that we were not following the PRPL pattern or performance best practices there.
That being said, switching to a hybrid application made room for us to gradually move towards a more client-side infrastructure while keeping performance within our budget. That is why performance budgets are good: they keep your team working towards a goal that is attainable. It may be that our technology stack, which carries some legacy, wasn't ideally suited to some optimizations. An application with a different stack might have had a better starting point than ours.
Loading fonts progressively, without blocking the rendering path, was another optimization we made. Previously, all of our fonts blocked the rendering path and there was no fallback font, which is bad user experience.
By using font-observer we appended a className to display the fonts once they had finished downloading. One major blocker for us was that our React components had local font-families declared, and there seems to be no obvious way of working around that issue without using some variation of a context manager. To learn more about font loading strategies, here's a reference.
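A minimal sketch of that pattern, assuming the fontfaceobserver package and placeholder font and class names:
import FontFaceObserver from 'fontfaceobserver';

// 'Proxima Nova' and 'fonts-loaded' are placeholders. Text renders with the
// fallback font immediately; the CSS only switches to the custom font-family
// once <html> carries the fonts-loaded class.
const font = new FontFaceObserver('Proxima Nova');

font
  .load()
  .then(() => {
    document.documentElement.classList.add('fonts-loaded');
  })
  .catch(() => {
    // Keep the fallback font if the download fails or times out.
  });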
When optimizing CSS, we made sure to use the optimizations built into webpack and its plugins. Webpack 4 has a plugin named mini-css-extract-plugin, which extracts the CSS compiled from our SCSS into separate CSS files.
We decided to cache JavaScript and CSS files using workbox. This was relatively easy and we got a simple service worker up quite quickly.
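Registering the generated worker from the application code is the usual one-liner; a sketch:
// Register the service worker that workbox generates (see the webpack config
// below). On repeat visits the precached JS and CSS is served from the cache.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/service-worker.js')
      .catch(err => console.error('Service worker registration failed:', err));
  });
}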
Babel is a great tool and has improved a lot with respect to developer experience lately. We were able to add things like browser support (through polyfills) and tree shaking, with the hardest part being installing the correct dependencies. With a fully configured front-end build, we ended up with these Babel packages:
{
"@babel/core": "^7.0.0-beta.51",
"@babel/preset-env": "^7.0.0-beta.51",
"@babel/preset-react": "^7.0.0-beta.51",
"babel-core": "^7.0.0-bridge.0",
"babel-jest": "^23.2.0",
"babel-loader": "^8.0.0-beta.4",
"babel-plugin-syntax-dynamic-import": "^6.18.0",
"babel-preset-minify": "^0.5.0-alpha.a24dd066"
}
.babelrc
{
"presets": [["@babel/preset-env", {
"modules": false,
"useBuiltIns": "usage"
}], "@babel/react"],
"env": {
"test": {
"presets": [["@babel/preset-env"], "@babel/react"]
},
"production": {
"presets": ["minify"]
}
},
"plugins": ["syntax-dynamic-import", "react-hot-loader/babel"]
}
Our webpack configuration ended up (roughly) like this:
webpack.base.js
const webpack = require('webpack');
const ManifestPlugin = require('webpack-manifest-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const WorkboxPlugin = require('workbox-webpack-plugin');
const { join } = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');
const VueLoaderPlugin = require('vue-loader/lib/plugin');
module.exports = {
entry: [/* ... */],
output: {
filename: '[name].js',
chunkFilename: '[contenthash].js',
path: join(__dirname, 'dist'),
publicPath: '/'
},
optimization: {
splitChunks: {
cacheGroups: {
main: {
name: 'main',
chunks: 'initial',
minChunks: 2,
maxInitialRequests: 5
},
vendor: {
test: /node_modules/,
name: 'vendor',
priority: 10,
enforce: true
}
}
}
},
target: 'web',
module: {
rules: [
{
test: /\.(jpg|png|gif|svg)$/,
loader: 'image-webpack-loader',
enforce: 'pre'
},
{
test: /\.(jpe?g|png)$/,
use: [
{
loader: 'url-loader',
options: {
limit: 10 * 1024
}
},
{
loader: 'file-loader',
options: {
name: '[path][name].[ext]'
}
}
]
},
{
test: /\.css$|\.sass$|\.scss$/,
use: [
{
loader: MiniCssExtractPlugin.loader
},
{
loader: 'css-loader',
options: {
minimize: true
}
},
{ loader: 'postcss-loader' },
{ loader: 'sass-loader' }
]
},
{
test: /\.svg$/,
use: [
{
loader: 'svg-inline-loader',
options: {
limit: 10 * 1024,
noquotes: true
}
},
{
loader: 'url-loader',
options: {
limit: 10 * 1024
}
},
{
loader: 'file-loader',
options: {
name: '[path][name].[ext]'
}
}
]
},
{
test: /\.vue$/,
loader: 'vue-loader'
},
{
test: /\.js$/,
exclude: /(node_modules|bower_components)/,
use: {
loader: 'babel-loader'
}
},
{
test: require.resolve('jquery'),
use: [
{
loader: 'expose-loader',
options: 'jQuery'
},
{
loader: 'expose-loader',
options: 'window.$'
},
{
loader: 'expose-loader',
options: '$'
},
{
loader: 'expose-loader',
options: 'jquery'
},
{
loader: 'expose-loader',
options: 'window.jQuery'
},
{
loader: 'expose-loader',
options: 'window.jquery'
}
]
}
]
},
plugins: [
new WorkboxPlugin.GenerateSW({
swDest: 'service-worker.js',
clientsClaim: true,
skipWaiting: true
}),
new MiniCssExtractPlugin({
filename: '[name].css'
}),
new VueLoaderPlugin(),
new webpack.ProvidePlugin({
$: 'jquery',
jQuery: 'jquery',
Popper: ['popper.js', 'default'],
Util: 'exports-loader?Util!bootstrap/js/dist/util'
}),
new webpack.ContextReplacementPlugin(/moment[\/\\]locale$/, /en-gb/),
new ManifestPlugin({
writeToFileEmit: true,
})
],
resolve: {
extensions: ['*', '.js', '.vue', '.json', '.css', '.scss'],
alias: {
vue$: 'vue/dist/vue.esm.js'
}
}
};
webpack.prod.js
const webpack = require('webpack');
const CleanWebpackPlugin = require('clean-webpack-plugin');
const merge = require('webpack-merge');
const webpackConfig = require('./webpack.base');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const OptimizeCSSAssetsPlugin = require('optimize-css-assets-webpack-plugin');
module.exports = merge(webpackConfig, {
mode: 'production',
output: {
filename: '[name].js',
chunkFilename: '[chunkhash].js'
},
plugins: [
new CleanWebpackPlugin(['public']),
new MiniCssExtractPlugin({
filename: '[id].[hash].css',
chunkFilename: '[id].[hash].css'
})
],
optimization: {
minimizer: [
new UglifyJsPlugin({
cache: true,
parallel: 4,
sourceMap: false
}),
new OptimizeCSSAssetsPlugin()
]
}
});
We are experimenting with intelligent prefetching of assets using GuessJS. It will allow us to prefetch routes that we are confident a user will visit, based on data from Google Analytics.
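A sketch of what that could look like with the guess-webpack package; the Google Analytics view ID is a placeholder:
// webpack config (excerpt, sketch) - let Guess.js decide which route bundles
// to prefetch, driven by navigation data pulled from Google Analytics.
const { GuessPlugin } = require('guess-webpack');

module.exports = {
  // ...
  plugins: [
    new GuessPlugin({ GA: 'XXXXXXXX' })
  ]
};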
We didn't quite get chunk splitting to work the way we wanted, as our application is served by PHP: each of the initial chunks needs to be included as a script tag in the index.php file. It turned out that chunk splitting with the default rules didn't help us that much.
As CSS class names can get long, one way to keep the CSS file slim is to audit it using the Coverage tab in Chrome DevTools and to shorten class and ID names.
The team has discussed using Varnish in order to reduce the number of requests that reach PHP.
Brotli is a compression algorithm developed by Google. It might be beneficial for us to switch our compression to Brotli and use gzip as a fallback. After investigating, we found that compressing assets with Brotli could lead to a significant reduction in file size. In addition, we are considering PHP output compression, stripping away comments, and long-term caching of our assets using more fine-grained service workers in order to save time spent on the main thread.
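One way to do this at build time is pre-compressing assets with compression-webpack-plugin; a sketch, assuming a recent plugin version and a Node runtime whose zlib supports Brotli:
// webpack.prod.js (excerpt, sketch) - pre-compress assets so the web server
// can serve .br to browsers that accept it and fall back to .gz otherwise.
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    new CompressionPlugin({
      test: /\.(js|css|svg)$/,
      algorithm: 'brotliCompress',
      filename: '[path][base].br'
    }),
    new CompressionPlugin({
      test: /\.(js|css|svg)$/,
      algorithm: 'gzip',
      filename: '[path][base].gz'
    })
  ]
};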
One thing that holds back First Meaningful Paint is the fact that our application is partly built with React, Vue and other client-side code, while our landing page is rendered server-side. By converting the landing page to client-side rendering, we would be able to follow best practices more closely and gradually load content for the user following the PRPL pattern.
After spending some time optimizing our application, we found that a lot of the work went into gathering information about best practices. Performance is well documented, but applying it can be harder in practice for a variety of reasons. Sometimes your infrastructure isn't suited to a given optimization, and sometimes the optimization simply isn't where your bottleneck is.
Fully configuring a site to follow performance best practices takes time, but it is worth investing in libraries that give you performance wins with little to no hassle.
Thanks to Alex York for reviewing and refining the post.