Tips for Building Angular Apps on Large Data Sets
Web applications often need to work with lots of data. While AngularJS has the capability to search and manipulate large sets of objects, that doesn’t necessarily mean that it should. With other frameworks that are already purpose-built for searching and manipulating large object sets, using AngularJS intelligently can help prevent needlessly storing large amounts of data that consume both excessive processing time and browser process memory. Below we’ll look at a few techniques for dealing with Angular applications built on top of large data sets, and how you can present data the way you like without introducing needless load on your frontend.
Pagination and Infinite Scroll
Any website’s first approach to load management typically centers around pagination. Pagination is the process of grabbing small batches of records (say 25 or so) and displaying them in sequence, as a series of 25-entry pages of results. Rather than dealing with a large record set head-on, this sidesteps the load by cutting the results into smaller, manageable pieces. This can be done a number of ways – one of the most common is by implementing user-driven navigation (clickable page numbers at the bottom or top of the list). Each click fires off an AJAX call that retrieves the next batch from the server.
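As a sketch of the page math behind this approach (the helper names `pageParams` and `pageCount` are illustrative, not part of AngularJS), each clickable page number maps to an offset/limit pair sent with the AJAX call:

```javascript
// Compute the query parameters for a given 1-based page number.
// pageSize defaults to the 25-record batches described above.
function pageParams(page, pageSize) {
  pageSize = pageSize || 25;
  return { offset: (page - 1) * pageSize, limit: pageSize };
}

// Total number of clickable page links needed for a result set.
function pageCount(totalRecords, pageSize) {
  pageSize = pageSize || 25;
  return Math.ceil(totalRecords / pageSize);
}

// In an AngularJS controller, a page click might then fire something like:
//   $http.get('/api/records', { params: pageParams($scope.page) })
//        .then(function (res) { $scope.records = res.data; });
```

Because only one page of records is ever bound to the scope, Angular's digest cycle only has to watch 25 items at a time, regardless of how large the full set is.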
Another approach – one a bit more in line with Angular’s principles of letting the data drive the presentation – is the infinite scroll method. AngularJS provides a number of tools that make implementing an infinite scroll a snap. Once a certain record in the set has been reached, an AJAX call to the backend is automatically initiated to fetch the next page of records, which are then displayed to the user as they continue to scroll. The seamless page integration is in some ways a design choice, but this approach does raise a problem: what should be done with the previous page’s contents in memory? Unfortunately the answer to that question isn’t simple, so we’ll leave it as something to be determined based upon your application’s architecture.
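The core of an infinite scroll is a small piece of state that appends each fetched page and guards against duplicate requests. A minimal framework-agnostic sketch (the names `makeFeed`, `loadNextPage`, and `hasMore` are illustrative; in a real app `fetchPage` would be an `$http` call returning a promise) might look like:

```javascript
// Build a feed object whose loadNextPage() appends the next batch of
// records. fetchPage(pageNumber) stands in for the AJAX call and here
// returns the batch synchronously to keep the sketch simple.
function makeFeed(fetchPage) {
  var feed = { records: [], page: 0, hasMore: true, loading: false };

  feed.loadNextPage = function () {
    if (feed.loading || !feed.hasMore) return; // ignore re-entrant calls
    feed.loading = true;
    var batch = fetchPage(feed.page + 1);
    feed.records = feed.records.concat(batch);
    feed.page += 1;
    feed.hasMore = batch.length > 0;           // empty batch => end of data
    feed.loading = false;
  };
  return feed;
}
```

A scroll-watching directive would call `feed.loadNextPage()` whenever the user nears the bottom of the list; the `loading` and `hasMore` flags keep rapid scroll events from firing overlapping or pointless requests.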
Plugins and Optimization
In some cases, though, pagination is just not suitable. Many users prefer an approach that presents large numbers of records at once (it’s been reported that many in the financial industry prefer large grids of text and numbers, for example). In that case, the right approach is to manage AngularJS’s resource usage so that the absolute minimum of processing is done on each item. There are some plugins, like ng-grid, that support larger data sets, though as with any community-supported product the quality of the code may vary widely.
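One way to keep the cost per item down when a large array must stay in memory is to bind the repeater only to the rows currently on screen, rather than to the full set. As a hedged sketch (`visibleWindow` is an illustrative helper, not an AngularJS API), the windowing math is just a slice:

```javascript
// Return only the rows that fall inside the viewport, given the current
// scroll position and a fixed row height in pixels. The extra +1 row
// covers a partially visible row at the bottom edge.
function visibleWindow(records, scrollTop, rowHeight, viewportHeight) {
  var first = Math.floor(scrollTop / rowHeight);
  var count = Math.ceil(viewportHeight / rowHeight) + 1;
  return records.slice(first, first + count);
}

// Template sketch: <div ng-repeat="row in rows track by row.id">
// where the controller refreshes $scope.rows on scroll:
//   $scope.rows = visibleWindow(allRecords, scrollTop, 20, viewportHeight);
// "track by" keeps Angular from rebuilding DOM nodes for rows it has
// already rendered, which matters at this scale.
```

With this pattern Angular only ever watches and renders a screenful of rows, so the digest cycle stays fast even if `allRecords` holds tens of thousands of entries.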
As with any programming effort, a little up-front analysis can go a long way towards reducing headaches at deploy time. One of the common misconceptions about AngularJS is that it is best for single-shot webpages with small amounts of data to show, and common wisdom holds that for heavy record processing and display you may be better off moving the related functionality to the backend. However, as we have seen above, simple design choices and smart approaches to how you write your AngularJS code can have a major impact on the performance of your web application when placed on top of very large data sets, making Angular an excellent choice in these situations.
Published at DZone with permission of Itay Herskovits, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.