This document contains a list of practices that will help us boost the performance of our Angular applications. "Angular Performance Checklist" covers different topics - from server-side pre-rendering and bundling of our applications to runtime performance and optimization of the change detection performed by the framework.
The document is divided into two main sections:
Some practices impact both categories, so there could be a slight overlap; however, the differences in the use cases and the implications will be explicitly mentioned.
Most subsections list tools related to the specific practice that can make us more efficient by automating our development flow.
Note that most practices are valid for both HTTP/1.1 and HTTP/2. Practices which are an exception will be marked by specifying to which version of the protocol they apply.
Some of the tools in this section are still in development and are subject to change. The Angular core team is working on automating the build process for our applications as much as possible so a lot of things will happen transparently.
Bundling is a standard practice aiming to reduce the number of requests that the browser needs to perform in order to deliver the application requested by the user. In essence, the bundler receives as an input a list of entry points and produces one or more bundles. This way, the browser can get the entire application by performing only a few requests, instead of requesting each individual resource separately.
As your application grows, bundling everything into a single large bundle would again be counterproductive. Explore Code Splitting techniques using Webpack.
Additional HTTP requests will not be a concern with HTTP/2 because of the server push feature.
Tooling
Tools which allow us to bundle our applications efficiently are:
Resources
These practices allow us to minimize the bandwidth consumption by reducing the payload of our application.
Tooling
Resources
Although we don't see the whitespace character (a character matching the \s regex), it is still represented by bytes which are transferred over the network. If we reduce the whitespace in our templates to the minimum, we will be able to drop the bundle size of the AoT-generated code even further.
Thankfully, we don't have to do this manually. The ComponentMetadata interface provides the property preserveWhitespaces, which has the value false by default, meaning that the Angular compiler will trim whitespaces to further reduce the size of our application. If we set the property to true, Angular will preserve the whitespace.
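As a quick illustration, here is a minimal sketch of how the flag could be set per component; the selector, template and the decision to keep whitespace are assumptions made for the example:
import { Component } from '@angular/core';

@Component({
  selector: 'app-poem',                   // hypothetical component used only for illustration
  templateUrl: './poem.component.html',
  // The default is false, i.e. the compiler trims whitespace. Set it to true only
  // when the template relies on significant whitespace (e.g. with white-space: pre).
  preserveWhitespaces: true
})
export class PoemComponent {}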
For the final version of our applications, we usually don't use the entire code which is provided by Angular and/or any third-party library, or even the code that we've written ourselves. Thanks to the static nature of the ES2015 modules, we're able to get rid of the code which is not referenced in our apps.
Example
// foo.js
export const foo = () => 'foo';
export const bar = () => 'bar';
// app.js
import { foo } from './foo';
console.log(foo());
Once we tree-shake and bundle app.js
we'll get:
let foo = () => 'foo';
console.log(foo());
This means that the unused export bar will not be included in the final bundle.
Tooling
Note: GCC (the Google Closure Compiler) does not support export * yet, which is essential for building Angular applications because of the heavy usage of the "barrel" pattern.
Resources
Since the release of Angular version 6, the Angular team has provided a new feature which allows services to be tree-shakeable, meaning that your services will not be included in the final bundle unless they're being used by other services or components. This can help reduce the bundle size by removing unused code from it.
You can make your services tree-shakeable by using the providedIn attribute of the @Injectable() decorator to define where the service should be provided. Then you should remove it from the providers attribute of your NgModule declaration, as well as its import statement, as follows.
Before:
// app.module.ts
import { NgModule } from '@angular/core'
import { AppRoutingModule } from './app-routing.module'
import { AppComponent } from './app.component'
import { environment } from '../environments/environment'
import { MyService } from './app.service'
@NgModule({
declarations: [
AppComponent
],
imports: [
...
],
providers: [MyService],
bootstrap: [AppComponent]
})
export class AppModule { }
// my-service.service.ts
import { Injectable } from '@angular/core'
@Injectable()
export class MyService { }
After:
// app.module.ts
import { NgModule } from '@angular/core'
import { AppRoutingModule } from './app-routing.module'
import { AppComponent } from './app.component'
import { environment } from '../environments/environment'
@NgModule({
declarations: [
AppComponent
],
imports: [
...
],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
// my-service.service.ts
import { Injectable } from '@angular/core'
@Injectable({
providedIn: 'root'
})
export class MyService { }
If MyService
is not injected in any component/service, then it will not be included in the bundle.
Resources
A challenge for the tools available in the wild (such as GCC, Rollup, etc.) is the HTML-like templates of the Angular components, which they cannot analyze. This makes their tree-shaking support less efficient because they cannot tell which directives are referenced within the templates. The AoT compiler transpiles the Angular HTML-like templates to JavaScript or TypeScript with ES2015 module imports. This way we are able to tree-shake efficiently during bundling and remove all the unused directives defined by Angular, third-party libraries or by ourselves.
Resources
Compression of the responses' payload is a standard practice for reducing bandwidth usage. By specifying the value of the Accept-Encoding header, the browser hints to the server which compression algorithms are available on the client's machine. On the other hand, the server sets the value of the Content-Encoding header of the response in order to tell the browser which algorithm has been chosen for compressing the response.
Tooling
The tooling here is not Angular-specific and entirely depends on the web/application server that we're using. Typical compression algorithms are deflate, gzip and Brotli.
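As an illustration, if the application happens to be served by a Node.js server built with Express, gzip compression could be enabled with the popular compression middleware; the port and output path below are assumptions made for the example:
import express from 'express';
import compression from 'compression';

const app = express();

// Compress every eligible response (HTML, JS, CSS, JSON, etc.) before sending it.
app.use(compression());

// Serve the production build of the Angular application (path is an assumption).
app.use(express.static('dist/my-app'));

app.listen(8080);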
Resources
Resource pre-fetching is a great way to improve user experience. We can either pre-fetch assets (images, styles, modules intended to be loaded lazily, etc.) or data. There are different pre-fetching strategies but most of them depend on specifics of the application.
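For pre-fetching lazy-loaded modules specifically, the Angular router ships with preloading strategies. Below is a minimal sketch using PreloadAllModules; the route definitions are assumptions made for the example:
import { NgModule } from '@angular/core';
import { PreloadAllModules, RouterModule, Routes } from '@angular/router';

const routes: Routes = [
  { path: 'heroes', loadChildren: () => import('./heroes.module').then(mod => mod.HeroesModule) }
];

@NgModule({
  imports: [
    // Lazy modules are still emitted as separate chunks, but the router starts
    // downloading them in the background once the application is stable.
    RouterModule.forRoot(routes, { preloadingStrategy: PreloadAllModules })
  ],
  exports: [RouterModule]
})
export class AppRoutingModule {}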
In case the target application has a huge code base with hundreds of dependencies, the practices listed above may not help us reduce the bundle to a reasonable size (reasonable might be 100K or 2M; it, again, completely depends on the business goals).
In such cases, a good solution might be to load some of the application's modules lazily. For instance, let's suppose we're building an e-commerce system. In this case, we might want to load the admin panel independently from the user-facing UI. Once the administrator has to add a new product we'd want to provide the UI required for that. This could be either only the "Add product page" or the entire admin panel, depending on our use case/business requirements.
Tooling
Let's suppose we have the following routing configuration:
// Bad practice
const routes: Routes = [
{ path: '', redirectTo: '/dashboard', pathMatch: 'full' },
{ path: 'dashboard', loadChildren: () => import('./dashboard.module').then(mod => mod.DashboardModule) },
{ path: 'heroes', loadChildren: () => import('./heroes.module').then(mod => mod.HeroesModule) }
];
The first time the user opens the application using the URL https://example.com/ they will be redirected to /dashboard, which will trigger the lazy route with path dashboard. In order for Angular to render the bootstrap component of the module, it has to download the file dashboard.module and all of its dependencies. Later, the file needs to be parsed by the JavaScript VM and evaluated.
Triggering extra HTTP requests and performing unnecessary computations during the initial page load is a bad practice since it slows down the initial page rendering. Consider declaring the default page route as non-lazy.
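A possible sketch of such a configuration, where the default dashboard page is loaded eagerly while the rarely visited areas stay lazy; the DashboardComponent import path is an assumption made for the example:
// Good practice
import { Routes } from '@angular/router';
import { DashboardComponent } from './dashboard/dashboard.component'; // assumed path

const routes: Routes = [
  { path: '', redirectTo: '/dashboard', pathMatch: 'full' },
  // The default page is part of the main bundle, so no extra request or parsing
  // is needed before the first meaningful render.
  { path: 'dashboard', component: DashboardComponent },
  { path: 'heroes', loadChildren: () => import('./heroes.module').then(mod => mod.HeroesModule) }
];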
Caching is another common practice intended to speed up our application by taking advantage of the heuristic that if a resource was recently requested, it might be requested again in the near future.
For caching data we usually use a custom caching mechanism. For caching static assets, we can either use the standard browser caching or Service Workers with the CacheStorage API.
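As an illustration of a custom caching mechanism for data, here is a minimal sketch of an HTTP interceptor that keeps GET responses in memory; the interceptor name and the decision to cache every GET request are assumptions made for the example:
import { Injectable } from '@angular/core';
import { HttpEvent, HttpHandler, HttpInterceptor, HttpRequest, HttpResponse } from '@angular/common/http';
import { Observable, of } from 'rxjs';
import { tap } from 'rxjs/operators';

@Injectable()
export class CachingInterceptor implements HttpInterceptor {
  // Keyed by the full request URL; a real implementation would also handle invalidation.
  private cache = new Map<string, HttpResponse<unknown>>();

  intercept(req: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
    if (req.method !== 'GET') {
      return next.handle(req);
    }
    const cached = this.cache.get(req.urlWithParams);
    if (cached) {
      // Serve the response from memory instead of hitting the network again.
      return of(cached.clone());
    }
    return next.handle(req).pipe(
      tap(event => {
        if (event instanceof HttpResponse) {
          this.cache.set(req.urlWithParams, event);
        }
      })
    );
  }
}
Such an interceptor would then be registered via the HTTP_INTERCEPTORS multi-provider token; a production version would also need an expiry/invalidation policy.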
To make the perceived performance of your application faster, use an Application Shell.
The application shell is the minimal user interface that we show to the users in order to indicate to them that the application will be delivered soon. For generating an application shell dynamically, you can use Angular Universal with custom directives which conditionally show elements depending on the rendering platform used (i.e. hide everything except the App Shell when using platform-server).
Tooling
Resources
We can think of the Service Worker as an HTTP proxy which is located in the browser. All requests sent from the client are first intercepted by the Service Worker which can either handle them or pass them through the network.
You can add a Service Worker to your Angular project by running ng add @angular/pwa
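The command sets up @angular/service-worker and registers it during bootstrap; the registration it generates looks roughly like the sketch below (the exact output depends on the CLI version):
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { ServiceWorkerModule } from '@angular/service-worker';
import { environment } from '../environments/environment';
import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [
    BrowserModule,
    // Register the generated ngsw-worker.js only for production builds.
    ServiceWorkerModule.register('ngsw-worker.js', { enabled: environment.production })
  ],
  bootstrap: [AppComponent]
})
export class AppModule {}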
Tooling
Resources
This section includes practices that can be applied in order to provide a smoother user experience with 60 frames per second (fps).
enableProdMode
In development mode, Angular performs some extra checks in order to verify that performing change detection does not result in any additional changes to any of the bindings. This way the framework ensures that the unidirectional data flow has been followed.
In order to disable these checks in production, do not forget to invoke enableProdMode:
import { enableProdMode } from '@angular/core';
if (ENV === 'production') {
enableProdMode();
}
AoT can be helpful not only for achieving more efficient bundling by performing tree-shaking, but also for improving the runtime performance of our applications. The alternative to AoT is Just-in-Time compilation (JiT), which is performed at runtime; therefore, we can reduce the amount of computation required for rendering our application by performing the compilation as part of our build process.
Tooling
ng serve --prod
Resources
The usual problem in the typical single-page application (SPA) is that our code is usually run in a single thread. This means that if we want to achieve a smooth user experience with 60fps, we have at most 16ms for execution between the rendering of the individual frames; otherwise, the frame rate will drop by half.
In complex applications with a huge component tree, where change detection needs to perform millions of checks each second, it will not be hard to start dropping frames. Thanks to Angular's platform-agnostic architecture and its decoupling from the DOM, it's possible to run our entire application (including change detection) in a Web Worker and leave the main UI thread responsible only for rendering.
Tooling
Resources
A big issue with traditional SPAs is that they cannot be rendered until the entire JavaScript required for their initial rendering is available. This leads to two big problems:
Server-side rendering solves this issue by pre-rendering the requested page on the server and providing the markup of the rendered page during the initial page load.
Tooling
Resources
On each asynchronous event, Angular performs change detection over the entire component tree. Although the code which detects changes is optimized for inline caching, this can still be a heavy computation in complex applications. A way to improve the performance of change detection is to not perform it for subtrees which are not supposed to change based on the recent actions.
ChangeDetectionStrategy.OnPush
The OnPush change detection strategy allows us to disable the change detection mechanism for subtrees of the component tree. Setting the change detection strategy of a component to ChangeDetectionStrategy.OnPush will make change detection run only when the component has received different inputs. Angular considers inputs different when it compares them with the previous inputs by reference and the reference check returns false. In combination with immutable data structures, OnPush can bring great performance benefits for such "pure" components.
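A minimal sketch of such a "pure" component; the selector, input shape and template are made up for the example:
import { ChangeDetectionStrategy, Component, Input } from '@angular/core';

@Component({
  selector: 'app-hero-card',
  template: '<h2>{{ hero.name }}</h2>',
  // This subtree is checked only when the `hero` input receives a new reference.
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class HeroCardComponent {
  @Input() hero: { name: string };
}
The parent should pass a new object reference (for instance { ...hero, name: 'Updated' }) rather than mutating the existing one, otherwise the component will not be re-rendered.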
Resources
Another way of implementing a custom change detection mechanism is by detaching and reattaching the change detector (CD) for a given component. Once we detach the CD, Angular will not perform checks for the entire component subtree.
This practice is typically used when user actions or interactions with external services trigger the change detection more often than required. In such cases we may want to consider detaching the change detector and reattaching it only when performing change detection is required.
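A minimal sketch of the idea, assuming a component that receives a noisy stream of updates and only needs to be re-rendered on demand; the component, template and method names are made up for the example:
import { ChangeDetectorRef, Component, OnInit } from '@angular/core';

@Component({
  selector: 'app-live-feed',
  template: '<div>{{ lastMessage }}</div>'
})
export class LiveFeedComponent implements OnInit {
  lastMessage = '';

  constructor(private cd: ChangeDetectorRef) {}

  ngOnInit() {
    // Exclude this component's subtree from the automatic change detection checks.
    this.cd.detach();
  }

  refresh() {
    // Re-enable checks when rendering the accumulated changes is actually required
    // (alternatively, cd.detectChanges() runs a single check while staying detached).
    this.cd.reattach();
  }
}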
Angular's change detection mechanism is triggered thanks to zone.js. Zone.js monkey-patches all asynchronous APIs in the browser and triggers change detection at the end of the execution of any async callback. In rare cases, we may want a given piece of code to be executed outside the context of the Angular zone and thus without running the change detection mechanism. In such cases, we can use the runOutsideAngular method of the NgZone instance.
Example
In the snippet below, you can see an example of a component that uses this practice. When the _incrementPoints method is called, the component will start incrementing the _points property every 10ms (by default). The incrementation will create the illusion of an animation. Since in this case we don't want to trigger the change detection mechanism for the entire component tree every 10ms, we can run _incrementPoints outside the context of the Angular zone and update the DOM manually (see the points setter).
import { Component, ElementRef, Input, NgZone, OnChanges, ViewChild } from '@angular/core';
import { DecimalPipe } from '@angular/common';

@Component({
  template: '<span #label></span>'
})
class PointAnimationComponent implements OnChanges {
  @Input() duration = 1000;
  @Input() stepDuration = 10;
  @ViewChild('label') label: ElementRef;

  @Input() set points(val: number) {
    this._points = val;
    if (this.label) {
      // Update the DOM manually, since change detection does not run for this code.
      this.label.nativeElement.innerText = this._pipe.transform(this.points, '1.0-0');
    }
  }
  get points() {
    return this._points;
  }

  private _incrementInterval: any;
  private _points: number = 0;

  constructor(private _zone: NgZone, private _pipe: DecimalPipe) {}

  ngOnChanges(changes: any) {
    const change = changes.points;
    if (!change) {
      return;
    }
    if (typeof change.previousValue !== 'number') {
      this.points = change.currentValue;
    } else {
      this.points = change.previousValue;
      // Run the animation outside the Angular zone so the interval below
      // does not trigger change detection every 10ms.
      this._zone.runOutsideAngular(() => {
        this._incrementPoints(change.currentValue);
      });
    }
  }

  private _incrementPoints(newVal: number) {
    const diff = newVal - this.points;
    const step = this.stepDuration * (diff / this.duration);
    const initialPoints = this.points;
    this._incrementInterval = setInterval(() => {
      const nextPoints = Math.ceil(initialPoints + diff);
      if (this.points >= nextPoints) {
        this.points = initialPoints + diff;
        clearInterval(this._incrementInterval);
      } else {
        this.points += step;
      }
    }, this.stepDuration);
  }
}
Warning: Use this practice very carefully and only when you're sure what you are doing, because if not used properly it can lead to an inconsistent state of the DOM. Also, note that the code above is not going to run in Web Workers. In order to make it Web Worker-compatible, you need to set the label's value by using Angular's renderer.
Angular uses zone.js to intercept events that occur in the application and runs change detection automatically. By default, this happens when the microtask queue of the browser is empty, which in some cases may cause redundant cycles. From v9, Angular provides a way to coalesce event change detections by turning ngZoneEventCoalescing on, i.e.
platformBrowser()
.bootstrapModule(AppModule, { ngZoneEventCoalescing: true });
The above configuration will schedule change detection with requestAnimationFrame
, instead of plugging into the microtask queue, which will run checks less frequently and consume fewer computational cycles.
Warning: ngZoneEventCoalescing: true may break existing apps that rely on consistently running change detection.
Resources
As an argument, the @Pipe decorator accepts an object literal with the following format:
interface PipeMetadata {
name: string;
pure: boolean;
}
The pure flag indicates that the pipe is not dependent on any global state and does not produce side-effects. This means that the pipe will return the same output when invoked with the same input. This way Angular can cache the outputs for all the input parameters the pipe has been invoked with, and reuse them in order to not have to recompute them on each evaluation.
The default value of the pure property is true.
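For reference, a minimal sketch of a pure pipe; the pipe name and its transformation are made up for the example (pure: true is spelled out even though it is the default):
import { Pipe, PipeTransform } from '@angular/core';

@Pipe({
  name: 'exponent',
  pure: true // the default; Angular re-evaluates the pipe only when its arguments change
})
export class ExponentPipe implements PipeTransform {
  transform(value: number, exponent = 1): number {
    return Math.pow(value, exponent);
  }
}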
*ngFor directive
The *ngFor directive is used for rendering a collection.
trackBy option
By default *ngFor identifies object uniqueness by reference.
This means that when a developer breaks the reference to an object while updating the item's content, Angular treats it as removal of the old object and addition of a new one. This results in destroying the old DOM node in the list and adding a new DOM node in its place.
The developer can provide a hint for Angular on how to identify object uniqueness: a custom tracking function passed as the trackBy option of the *ngFor directive. The tracking function takes two arguments: index and item. Angular uses the value returned from the tracking function to track item identity. It is very common to use the ID of the particular record as the unique key.
Example
@Component({
selector: 'yt-feed',
template: `
<h1>Your video feed</h1>
<yt-player *ngFor="let video of feed; trackBy: trackById" [video]="video"></yt-player>
`
})
export class YtFeedComponent {
feed = [
{
id: 3849, // note "id" field, we refer to it in "trackById" function
title: "Angular in 60 minutes",
url: "http://youtube.com/ng2-in-60-min",
likes: "29345"
},
// ...
];
trackById(index, item) {
return item.id;
}
}
Rendering DOM elements is usually the most expensive operation when adding elements to the UI. The main work is usually caused by inserting the element into the DOM and applying the styles. If *ngFor renders a lot of elements, browsers (especially older ones) may slow down and need more time to finish rendering all of them. This is not specific to Angular.
To reduce rendering time, try the following:
Reducing the number of DOM elements rendered in the *ngFor section of your template. Usually, unneeded/unused DOM elements arise from extending the template again and again. Rethinking its structure probably makes things much easier.
Using ng-container where possible
Resources
*ngFor
Angular executes template expressions after every change detection cycle. Change detection cycles are triggered by many asynchronous activities such as promise resolutions, http results, timer events, keypresses, and mouse moves.
Expressions should finish quickly or the user experience may drag, especially on slower devices. Consider caching values when their computation is expensive.
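As an illustration of caching an expensive value instead of recomputing it in the template on every cycle, the sketch below recalculates a total only when its input changes; the component, fields and formula are assumptions made for the example:
import { Component, Input, OnChanges } from '@angular/core';

@Component({
  selector: 'app-cart-summary',
  // The template binds to a precomputed field instead of calling a method
  // that would run on every change detection cycle.
  template: '<span>{{ total }}</span>'
})
export class CartSummaryComponent implements OnChanges {
  @Input() items: { price: number; quantity: number }[] = [];
  total = 0;

  ngOnChanges() {
    // The expensive computation runs only when the `items` input actually changes.
    this.total = this.items.reduce((sum, item) => sum + item.price * item.quantity, 0);
  }
}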
Resources
The list of practices will evolve over time with new and updated practices. In case you notice something missing, or you think that any of the practices can be improved, do not hesitate to file an issue and/or a PR. For more information, please take a look at the "Contributing" section below.
In case you notice something missing, incomplete or incorrect, a pull request will be greatly appreciated. For discussion of practices that are not included in the document please open an issue.
MIT