This post contains a list of new tools and practices that can help us build faster Angular apps and monitor their performance over time. In each section, you’ll find links for further reference on how to incorporate them in your project. The focus of this post is on decreasing initial load time and speeding up page navigation using code-splitting and preloading.
We’ll look at the following topics:
- Code-splitting
- Preloading strategies
- Performance budgets
- Efficient serving
JavaScript and initial load time
One of the most expensive types of assets in an app is JavaScript. Once the browser downloads a JavaScript file, it often has to decompress it, then parse it, and finally execute it. That is why it’s critical for the performance of an app to ship fewer bytes of JavaScript during the initial load.
There are a variety of practices we can apply to shrink our bundles. Two of the most popular ones are:
- Minification and dead code elimination
- Code-splitting
The Angular CLI has been doing a great job minifying bundles and eliminating dead code. In version 8, the CLI also introduced differential loading support, which can reduce the amount of JavaScript for modern browsers even further. All this is completely automated by the tooling Angular provides.
On the other hand, code-splitting is entirely in our hands. The next section is dedicated to shrinking our JavaScript bundles using this technique.
Code-splitting with Angular
There are two main approaches to code-splitting:
- Component level code-splitting
- Route level code-splitting
The main difference between these two techniques is that with component level code-splitting, we can load individual components lazily even without a route navigation. For example, we can load the component associated with a chatbox only after the user clicks on a placeholder.
With route level code-splitting, we load the individual routes lazily. For example, if the user is in the home page of an app and they navigate to the settings page, Angular will first download the corresponding bundle and after that render the route.
Component level code-splitting
Component level code-splitting has been hard in Angular because of the factories that the current version of the Angular compiler generates. The good news is that Ivy will enable a simpler mechanism for it. In future releases of the framework, we’ll work on using these capabilities to deliver ergonomic APIs for component level code-splitting. Until then, there are two community libraries you can use to achieve ergonomic code-splitting on a component level.
Route-level code-splitting
Now let’s focus on route-level code-splitting. You can learn more about it here. This technique involves boilerplate code. To create a lazy route manually, we need to:
- Generate a new module
- With `loadChildren`, declare a lazy route in a parent module
- Generate a new component in the lazy-loaded module
- Declare an eager route in the lazy-loaded module
With Angular CLI version 8.1, you can now achieve this with a single command! To generate a lazy module use:
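The command takes roughly this shape, assuming Angular CLI 8.1 or later (the placeholders are illustrative):

```
ng generate module [module-name] --route [route-path] --module [parent-module]
```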
For example:
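One possible invocation, matching the `RankingModule` example that follows:

```
ng generate module ranking --route ranking --module app.module
```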
The command above will:
- Generate a lazy-loaded module called `RankingModule`
- Insert a lazy route in `app.module.ts`
- Generate an eager default route inside the `RankingModule`
- Generate a component that will handle the eager default route
Once we introduce a lazy route in our app, when the user navigates to it, Angular will first download the corresponding bundle from the network. On a slow internet connection, this could lead to a bad UX.
To work around that, the Angular router provides preloading strategies.
Preloading Modules
There’s a built-in strategy that preloads all the modules in the application. You can use it by configuring the Angular router:
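A minimal sketch of enabling it with `PreloadAllModules` from `@angular/router` (the routes themselves are assumed to be declared elsewhere):

```typescript
import { NgModule } from '@angular/core';
import { PreloadAllModules, RouterModule, Routes } from '@angular/router';

const routes: Routes = [
  // ...lazy routes declared with loadChildren
];

@NgModule({
  imports: [
    // Preload every lazy-loaded module after the app has bootstrapped.
    RouterModule.forRoot(routes, { preloadingStrategy: PreloadAllModules }),
  ],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```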
The risk with this strategy is that in an application with many modules, it may increase the network consumption and also block the main thread when Angular registers the routes of the preloaded modules.
For larger apps, we can apply more advanced preloading heuristics:
- Quicklink — preload only modules associated with visible links in the viewport
- Predictive prefetching — preload only the modules that are likely to be needed next
The Angular implementation of the quicklink preloading strategy is ngx-quicklink. You can find a detailed guide on how to use it here.
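A rough sketch of wiring it up, based on the library’s exports at the time of writing (`QuicklinkModule` and `QuicklinkStrategy`):

```typescript
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';
import { QuicklinkModule, QuicklinkStrategy } from 'ngx-quicklink';

const routes: Routes = [
  // ...lazy routes declared with loadChildren
];

@NgModule({
  imports: [
    QuicklinkModule,
    // Preload only the modules whose links are visible in the viewport.
    RouterModule.forRoot(routes, { preloadingStrategy: QuicklinkStrategy }),
  ],
  exports: [QuicklinkModule, RouterModule],
})
export class AppRoutingModule {}
```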
Predictive prefetching uses a report of your app’s usage. You can find out how to introduce predictive prefetching with Guess.js to your Angular CLI app in the video below:
Code-splitting can greatly improve the performance of our apps but it doesn’t provide us with any guarantees that it won’t regress over time. For this purpose, we can use performance budgets.
Performance budgets
To monitor our apps over time, the Angular CLI supports performance budgets. Budgets allow us to specify limits within which the production bundles of our app can grow. In the workspace configuration, under the `budgets` section, we can specify several different types of budgets:
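As an illustration, a `budgets` section inside the production build configuration of angular.json could look like this (the thresholds are examples, not recommendations):

```json
"budgets": [
  {
    "type": "initial",
    "maximumWarning": "2mb",
    "maximumError": "5mb"
  },
  {
    "type": "bundle",
    "name": "main",
    "maximumWarning": "1mb",
    "maximumError": "2mb"
  }
]
```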
All of them accept `maximumWarning` and `maximumError`. If we exceed the budget’s `maximumWarning` value, the CLI will show a warning. If we exceed the `maximumError` value, the build will fail.
Here’s a short video which shows how to quickly setup budgets for your project:
It’s very convenient to introduce budgets as part of your CI so that you can keep track of them across code changes. Read more about this here.
Efficient serving
Looking at datasets of a lot of Angular apps running in the wild, we noticed that over 25% of them do not use content compression. Even more of them don’t use a Content Delivery Network (CDN). These are two easy wins that are very simple to implement as part of the deployment pipeline.
To allow developers to ship fast Angular apps end-to-end, as part of Angular CLI version 8.3 we’ll introduce a new command called `deploy`. By simply running `ng deploy`, you’ll be able to:
- Produce a highly efficient build of your app using the build functionality of the CLI
- Deploy your app to a selected hosting provider
We’ve been working very closely with Google Cloud & Firebase, Azure, and Zeit to introduce deployment capabilities for their platforms directly from the CLI. There are also third-party packages for Netlify and GitHub Pages.
The list of packages available at the moment of writing this article is:
- `@angular/fire`
- `@azure/ng-deploy`
- `@zeit/ng-deploy`
- `@netlify-builder/deploy`
- `angular-cli-ghpages`
You can use them by running:
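Roughly, you add the package for your provider of choice and then deploy:

```
ng add [package name]
ng deploy
```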
For example:
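Using the Firebase package as one possible example:

```
ng add @angular/fire
ng deploy
```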
Conclusion
In this post we looked at a few practical approaches to speed up an Angular app.
We saw how to reduce the size of our JavaScript bundles using component-level and route-level code-splitting. As the next step, we discussed preloading strategies that can speed up page navigation. To monitor our app’s performance over time we introduced performance budgets.
Finally, we talked about efficient serving and the Angular CLI’s integration with cloud services that provide a CDN and content compression.
It is common to see Angular apps gradually getting slower over time. Even though Angular is a performant framework, apps will start to become slow as they grow if you are not aware of how to develop performant Angular apps. It is, therefore, a must for any serious Angular developer to be aware of what makes an Angular app slow, so they can avoid it becoming slow in the first place.
In this post, I will go through different methods that can be used to performance tune Angular apps. Some of them involve improving the change detection, others involve the page load and UX in general.
The book Code Complete describes performance tuning as a tradeoff between code quality and performance. If you can do something that improves both code quality and performance, it is considered a best practice and it is a good idea to do it from the start. This post deals with both the best practices you should almost always strive for and performance tuning for specific problems.
Improving change detection
Change detection can be the most performance-heavy part of an Angular app, so it is necessary to have some awareness of how to render templates most effectively, so you only rerender a component if it has new changes to show.
OnPush change detection
The default change detection behavior for components is to re-render every time an asynchronous event happens in the app, such as a click, an XMLHttpRequest, or a setTimeout. This can become a problem because it causes many unnecessary renderings of templates that may not have changed.
OnPush change detection fixes this by only re-rendering a template if either:
- One of its input properties has gotten a new reference
- An event originates from the component or one of its children, e.g. a click on a button in the component
- Explicit run of change detection
To apply this strategy you just need to set the change-detection strategy in the component’s decorator:
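For example, a component opting into OnPush (the component itself is just an illustration):

```typescript
import { ChangeDetectionStrategy, Component } from '@angular/core';

@Component({
  selector: 'app-todo-list',
  templateUrl: './todo-list.component.html',
  // Re-render only on new input references, template events,
  // or explicit change detection runs.
  changeDetection: ChangeDetectionStrategy.OnPush,
})
export class TodoListComponent {}
```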
Design for immutability
To leverage this method you need to make sure that all state changes happen immutably, because OnPush needs a new reference on a component’s input to trigger change detection. If you are using Redux for state management, you naturally get a new instance every time the state changes, which will trigger change detection for OnPush components when provided to a component’s inputs. With this approach you want container components, which are responsible for getting the data from the store, and presentational components, which only interact with other components using `input` and `output`, as sketched below.
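Here is a rough sketch of that split, assuming a hypothetical `TodoStoreService` that exposes state as an observable and an illustrative `TodoItem` model (all names are made up for the example):

```typescript
import {
  ChangeDetectionStrategy, Component, EventEmitter, Input, Output,
} from '@angular/core';
import { Observable } from 'rxjs';
// Hypothetical model and store service, used for illustration only.
import { TodoItem } from './todo-item.model';
import { TodoStoreService } from './todo-store.service';

// Container component: talks to the store and passes immutable data down.
@Component({
  selector: 'app-todo-page',
  template: `
    <app-todo-list [todos]="todos$ | async" (toggle)="onToggle($event)"></app-todo-list>
  `,
})
export class TodoPageComponent {
  todos$: Observable<TodoItem[]>;

  constructor(private store: TodoStoreService) {
    this.todos$ = this.store.todos$;
  }

  onToggle(id: number): void {
    this.store.toggleTodo(id);
  }
}

// Presentational component: OnPush, talks to the outside only via input/output.
@Component({
  selector: 'app-todo-list',
  template: `
    <li *ngFor="let todo of todos" (click)="toggle.emit(todo.id)">{{ todo.title }}</li>
  `,
  changeDetection: ChangeDetectionStrategy.OnPush,
})
export class TodoListComponent {
  @Input() todos: TodoItem[] | null = [];
  @Output() toggle = new EventEmitter<number>();
}
```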
The easiest way to provide the store data to the template is using the `async` pipe. This lets the template work with the data as if it were outside of an observable, and it makes sure the subscription is cleaned up automatically when the component gets destroyed.
Make OnPush the default change detection strategy
Using schematics you can make OnPush the default change detection strategy when generating new components with the Angular CLI. You simply add this to the schematics property in angular.json:
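Under your project’s schematics property, this could look like:

```json
"schematics": {
  "@schematics/angular:component": {
    "changeDetection": "OnPush"
  }
}
```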
Using pipes instead of methods in templates
Methods in a template get triggered every time the component is rerendered. Even with OnPush change detection, that means they get triggered every time there is interaction with the component or any of its children (click, typing). If the methods are doing heavy computations, this will make the app slow as it scales, because it keeps recomputing on every interaction with the component.
What you can do instead is use a pure pipe to make sure that you only recompute when the input to the pipe changes. The `async` pipe, as we looked at before, is an example of a pure pipe: it only recomputes when the observable emits a value. As long as we are dealing with pure functions, we only want to recompute when the input changes. A pure function is a function that always returns the same output given the same input. For that reason, it doesn’t make sense to recompute the output if the input has not changed.
With method
Let’s start looking at what happens if you use a template method instead of a pipe.
Consider we have the following method:
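A hypothetical example; the todo model and the logging are for demonstration only:

```typescript
import { ChangeDetectionStrategy, Component, Input } from '@angular/core';

// Illustrative model used in the following examples.
export interface TodoItem { id: number; title: string; done: boolean; }

@Component({
  selector: 'app-todo-list',
  templateUrl: './todo-list.component.html',
  changeDetection: ChangeDetectionStrategy.OnPush,
})
export class TodoListComponent {
  @Input() todoList: TodoItem[] = [];

  // Called from the template, so it runs on every re-render of the component.
  getCompletedCount(todoList: TodoItem[]): number {
    console.log('computing completed count');
    return todoList.filter(todo => todo.done).length;
  }
}
```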
Being called in the template like this:
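For instance (the button’s click handler is assumed to exist on the component):

```html
<p>Completed: {{ getCompletedCount(todoList) }}</p>
<!-- addTodo() is a hypothetical handler; any interaction re-triggers the method. -->
<button (click)="addTodo()">Add todo</button>
```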
This has the consequence of triggering the method every time a button is clicked inside the component, even though the component is using OnPush change detection.
With pipe
We fix this by converting the method to a pipe. Since a pipe is pure by default, it only reruns its logic when the input changes (a reference change).
By creating a new pipe and moving the logic we used before inside of the pipe we get:
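A sketch of such a pipe, reusing the illustrative `TodoItem` model from above:

```typescript
import { Pipe, PipeTransform } from '@angular/core';
import { TodoItem } from './todo-list.component'; // illustrative import path

@Pipe({ name: 'completedCount' }) // pipes are pure by default
export class CompletedCountPipe implements PipeTransform {
  transform(todoList: TodoItem[]): number {
    console.log('computing completed count');
    return todoList.filter(todo => todo.done).length;
  }
}
```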
Which is used like this in the template:
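Something like the following, assuming the pipe is declared in the component’s module:

```html
<p>Completed: {{ todoList | completedCount }}</p>
```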
Now, this pipe is only triggered when the input (the todo list) has changed.
Cache values from pure pipes and functions
Even when using pure pipes, we can optimize further by remembering/caching previous values, so we don’t need to recompute if we have already run the pipe with the same input in the past. Pure pipes don’t remember previous values; they just make sure the pipe doesn’t recompute if the input reference hasn’t changed. To cache previous values we need to combine the pipe with something else.
An easy way to do this is to use Lodash’s memoize method. In this case it is not very practical, as the input is an array of objects. If the pipe took a simple data type, such as a number, as input, it could be beneficial to use it as a key to cache results and thus avoid recomputation.
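As a rough sketch, assuming lodash-es is installed and the input is a simple number (the pipe and its computation are purely illustrative):

```typescript
import { Pipe, PipeTransform } from '@angular/core';
import memoize from 'lodash-es/memoize';

// Hypothetical expensive pure function on a simple input.
function nthFibonacci(n: number): number {
  return n < 2 ? n : nthFibonacci(n - 1) + nthFibonacci(n - 2);
}

// Cache results keyed by the input value, so repeated inputs skip recomputation.
const memoizedFibonacci = memoize(nthFibonacci);

@Pipe({ name: 'fibonacci' })
export class FibonacciPipe implements PipeTransform {
  transform(value: number): number {
    return memoizedFibonacci(value);
  }
}
```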
Using trackBy in ngFor
When using ngFor and updating the list, Angular will by default remove the whole list from the DOM and create it again, because by default it has no way of knowing which items have been added or removed. The trackBy function solves this by letting you provide Angular with a function it uses to determine which items have been updated or removed from the ngFor list, so it only rerenders those.
The track by function looks like this:
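A sketch of such a component method, assuming the illustrative `TodoItem` model with an `id` property:

```typescript
trackByTodoId(index: number, todo: TodoItem): number {
  // Identify items by id so Angular can reuse their existing DOM nodes.
  return todo.id;
}
```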
This will track changes in the list based on the id property of the items (todo items).
The trackBy function is used in the template like this:
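A sketch (the list markup is illustrative):

```html
<li *ngFor="let todo of todoList; trackBy: trackByTodoId">
  {{ todo.title }}
</li>
```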
For a list you can interact with (add, delete), it is a good idea to use trackBy. For static lists that never change, it will not make a difference to the user experience.
For heavy computations: Detach change detection
In extreme cases, you might want to trigger change detection manually for certain components only. That is, if a component is instantiated hundreds of times on the same page and rerendering every instance is expensive, you can turn off automatic change detection completely for the component and only trigger changes manually where necessary.
If we wanted to do this for the todo items, we could detach change detection and only run it when the todo item is set via the `todoItem` setter:
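A rough sketch of what this could look like (the component and model are illustrative):

```typescript
import { ChangeDetectorRef, Component, Input } from '@angular/core';

// Hypothetical model, as in the earlier examples.
interface TodoItem { id: number; title: string; done: boolean; }

@Component({
  selector: 'app-todo-item',
  template: `<span>{{ todoItem?.title }}</span>`,
})
export class TodoItemComponent {
  private _todoItem: TodoItem | undefined;

  constructor(private cdr: ChangeDetectorRef) {
    // Opt this component out of automatic change detection.
    this.cdr.detach();
  }

  @Input()
  set todoItem(value: TodoItem | undefined) {
    this._todoItem = value;
    // Run change detection manually only when a new item arrives.
    this.cdr.detectChanges();
  }

  get todoItem(): TodoItem | undefined {
    return this._todoItem;
  }
}
```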
Improving page load
Page load time is an important aspect of user experience today. Every millisecond a user is waiting potentially means a loss in revenue, because of a higher bounce rate and worse user experience, so this is a place you should optimize. Page load time also has an impact on SEO, as faster websites are rewarded by search engines.
To improve page load we want to use caching with Angular PWA, lazy loading, and bundling optimizations.
Cache static content using Angular PWA
Caching the static content will make your Angular app load faster, as it will already be in the browser. This is easily done using Angular PWA, which uses a service worker to cache the static content (the JS and CSS bundles, images, and statically served files) and serve it without making a call to the server.
I have already created a guide on how to set up caching with Angular PWA, which you can read here.
Cache HTTP calls using Angular PWA
With Angular PWA you can easily set up caching rules for HTTP calls to give a faster user experience without cluttering your app with a lot of caching code. You can optimize either for freshness or for performance; that is, you can choose to only read from the cache if the HTTP call times out, or to check the cache first and only call the API when the cache expires.
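For illustration, a data group in ngsw-config.json could look roughly like this (the URL pattern and limits are made up):

```json
"dataGroups": [
  {
    "name": "todos-api",
    "urls": ["/api/todos/**"],
    "cacheConfig": {
      "strategy": "freshness",
      "maxSize": 100,
      "maxAge": "1h",
      "timeout": "5s"
    }
  }
]
```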
I have a guide with a video showing you how to do this here.
Lazy load routes
Lazy loading routes ensures that a feature is bundled into its own bundle, which is only loaded when it is needed.
To set up lazy loading we simply create a child route file like this in a feature:
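For instance, a hypothetical todo-list feature could declare its child routes like this:

```typescript
import { Routes } from '@angular/router';
import { TodoListComponent } from './todo-list.component'; // illustrative component

export const todoListRoutes: Routes = [
  { path: '', component: TodoListComponent },
];
```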
Then add this route to the feature module’s `imports`:
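In the feature module this might look like:

```typescript
import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
import { RouterModule } from '@angular/router';
import { TodoListComponent } from './todo-list.component';
import { todoListRoutes } from './todo-list.routes';

@NgModule({
  imports: [CommonModule, RouterModule.forChild(todoListRoutes)],
  declarations: [TodoListComponent],
})
export class TodoListModule {}
```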
And finally, lazy load it using `loadChildren` in the root route:
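Using the dynamic-import syntax supported since Angular 8, the root route could look like this (the paths are illustrative):

```typescript
import { Routes } from '@angular/router';

export const appRoutes: Routes = [
  {
    path: 'todos',
    // The feature bundle is only fetched when this route is activated.
    loadChildren: () =>
      import('./todo-list/todo-list.module').then(m => m.TodoListModule),
  },
];
```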
Optimizing bundling and preloading
To optimize page load even further you can choose to preload the feature modules, so navigation is instant when you want to render a lazily loaded feature module.
This can be done by setting the `preloadingStrategy` to `PreloadAllModules`:
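In the root routing setup, this is roughly:

```typescript
import { NgModule } from '@angular/core';
import { PreloadAllModules, RouterModule } from '@angular/router';
import { appRoutes } from './app.routes'; // illustrative path

@NgModule({
  imports: [
    // Load lazy feature bundles in the background after startup.
    RouterModule.forRoot(appRoutes, { preloadingStrategy: PreloadAllModules }),
  ],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```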
On load, all the feature modules will be loaded in the background, giving you both a fast initial page load and instant navigation to other feature modules. This can be optimized even further by creating your own custom preloading strategy, as shown here, to load only a subset of the routes on app initialization.
Server-side rendering with Angular Universal
For Angular apps that contain indexed pages, it is recommended to server-side render the app. This makes sure pages are fully rendered by the server before they are shown in the browser, which gives a faster page load. This requires that the app does not depend on any native DOM globals; you should instead inject, e.g., `document` from the Angular providers.
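As a small sketch, a service could use the `DOCUMENT` token from `@angular/common` instead of touching the global `document` (the service itself is illustrative):

```typescript
import { DOCUMENT } from '@angular/common';
import { Inject, Injectable } from '@angular/core';

@Injectable({ providedIn: 'root' })
export class PageTitleService {
  constructor(@Inject(DOCUMENT) private document: Document) {}

  // Uses the injected document, which Angular can also provide on the server.
  setTitle(title: string): void {
    this.document.title = title;
  }
}
```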
Read more about how to set up Angular Universal in an app here.
Improving UX
Performance tuning is all about optimizing at the bottleneck, that is, the part of the system that affects your user experience the most. Sometimes the solution could simply be to handle actions more optimistically and thus reduce waiting for the user.
Optimistic updates
Optimistic updates are where an update is reflected in the UI before it is saved on the server. This gives a snappier, more native-like experience to the user. The consequence of this is that you need to roll the state back in case the server fails to save the changes. Strongbrew has written a post about how you can implement this in a generic way, making optimistic updates easy to incorporate in your app. You can read it here.
How should I prioritize performance tuning?
Start with the low-hanging fruit: OnPush, lazy loading, and then PWA, and then gain awareness of where the performance bottlenecks are in your system. Every improvement that is not at the bottleneck is an illusion, as it will not improve the user experience of the app. Tuning methods like detaching change detection should only be used if you have a specific problem with a component’s change detection impacting performance.
The Angular performance tuning steps
These are the steps you will go through as you performance tune your Angular application. You should only move up the steps until the performance problems are fixed; there is no reason to over-engineer improvements that will not improve the UX.
- OnPush
- Lazy loading modules
- Improve page load with Angular PWA
- trackBy for ngFor
- Pure pipes instead of methods (including async)
- Cache values from pipes and pure functions
- Cache HTTP requests better
- Detach/manual change detection
- Angular Universal
Why is Angular Universal the last one? Because introducing server-side rendering can cause big changes to the development setup (you need to maintain another server, cannot reference the DOM directly, and have to maintain both a server and a client bundle), and it should be used either for performance reasons that cannot be fixed with the previous steps or for SEO purposes.
A complete demo project can be found here.
Conclusion
In this post, we saw how to performance tune your Angular app. We dived into a couple of different performance tuning categories: change detection, page load, and UX improvements. The way to go about any improvement in a system is to first identify the bottlenecks and try to solve them using one of the methods from this post. Anything else might just be a waste of time if it is not improving the user experience.