Scale Your Frontend Application
As a continuation of my first article, I prepared an implementation of a simple project to show a workable solution.
Before going into detail, I’d like to note that I decided to change the name from Pluggable to Scale, as it better highlights the potential of this architecture. The idea is not only to create plugins for an application but to drive the development of the frontend itself from multiple sources. The core feature has to be splitting the application into pieces rather than building a monolith.
The only mechanism that must sit at the heart of the system is the line of communication between components, the logic that dynamically loads the necessary scripts, and the process of gluing everything together with defined rules. There are, of course, many other aspects you must take into account, such as:
- How to share the same event messages across different evolutions of the system (like versioning, etc.).
- How to upgrade libraries across all modules in a less intrusive way.
- How to set up the development environment of one module without a need to boot up a monster application.
- How to set up an infrastructure for E2E tests and keep these tests in the repository of the module.
- And many more.
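To make the first of these challenges concrete, here is a hypothetical sketch of a versioned event envelope; the names (`createEvent`, `isCompatible`, `EVENTS_VERSION`) are my own and not part of the project, but the idea is that modules built against different revisions of the messaging contract can detect incompatible messages instead of misreading them.

```javascript
// Hypothetical sketch (names are assumptions, not from the project): a
// versioned envelope wrapped around every cross-module event message.
const EVENTS_VERSION = '1.0';

const createEvent = (type, payload) => ({
    version: EVENTS_VERSION,
    type,
    payload
});

// A consumer checks the major version before acting on the event.
const isCompatible = (event, supportedMajor) =>
    event.version.split('.')[0] === String(supportedMajor);
```

A module receiving such an event would call `isCompatible` first and drop (or log) anything it was not built to understand.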
Before diving into all of these challenges, in this article, I’ll introduce you to the first implementation of a simple application. Currently, the code for this article is in master. Once I polish it and start working on a more advanced part of it, I’ll move it to branch part-1.
To set up the project locally, you need the latest version of Node.js, with yarn and gulp installed globally.
Once that's done, running gulp is enough to set up the entire infrastructure of the project. To keep setup and demonstration simple in a single place, I put all three modules into one GitHub repository, though in a real-life scenario they would be split into three separate repositories.
The example I created is for a User Management System, where it is possible to:
- See a home page with some basic information about the site (Home)
- View users (User)
- Add new users (Admin)
- Grant/revoke permissions that allow individuals to remove users or hide/show the admin page (Settings)
- When the "allow to remove users" permission is enabled, a trash icon appears on the user page.
- When "allow to view admin page" is disabled, the "Admin" navigation item disappears.

[Screenshots: Home page, User page, Admin page, Setting admin permissions]
“Home" and “User” pages come together with the main application and two others are loaded during system boot-up.
First, we need to configure our backend to read metadata about our core modules and provide an API to load them. In this project, we created the folder "modules" for that; the compiled modules that are ready for loading are placed there. The only file created manually is modules-metadata.json. Again, this is done for simplicity; in a real-world scenario, the metadata would live inside each module, e.g., modules/admin/metadata.json with the snippet:
```json
{
    "name": "admin",
    "entry": "/modules/admin/index.js",
    "options": {
        "tab": {
            "title": "Admin"
        }
    }
}
```
And modules/settings/metadata.json with the snippet:
```json
{
    "name": "settings",
    "entry": "/modules/settings/index.js",
    "options": {
        "tab": {
            "title": "Settings"
        }
    }
}
```
In Express, to make this folder easily accessible from the frontend, we need to serve it as a static folder.
```javascript
gulp.task('server:start', (cb) => {
    app.use(express.json());
    app.get('/', (req, res) => {
        res.sendFile(`${paths.distDir}/index.html`);
    });
    registerDatabaseApi();
    app.use(express.static(`${paths.projectDir}/dist`));
    app.use('/modules', express.static(`${paths.projectDir}/modules`)); // the modules folder is exposed as static content here
    app.listen(serverPort, () => {
        log.info(`DSFA is started on port ${serverPort}!`);
        cb();
    });
});
```
On the frontend, we need to know that this file is there and request it in api-resource.js:

```javascript
export const getModulesMetadata = () => httpRequest('GET', 'modules/modules-metadata.json');
```
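The httpRequest helper itself is not shown in the article; this is a minimal fetch-based sketch of what it might look like (an assumption, not the project's actual implementation):

```javascript
// Hypothetical httpRequest helper: the article only shows its call site.
// Assumes a fetch-capable environment and JSON responses.
const httpRequest = async (method, url, body) => {
    const response = await fetch(url, {
        method,
        headers: {'Content-Type': 'application/json'},
        body: body ? JSON.stringify(body) : undefined
    });
    if (!response.ok) {
        throw new Error(`${method} ${url} failed with status ${response.status}`);
    }
    return {data: await response.json()};
};
```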
That call is made from a redux-saga, so once the data is received, it is saved in the Redux store.
```javascript
export function* loadCustomModulesSaga() {
    try {
        const {data: {modules = []}} = yield call(getModulesMetadata);
        yield putResolve(customModulesActions.set(modules));
    } catch (exception) {
        yield put(toastrActions.show('Error', exception, TOASTR_TYPE.error));
    }
}
```
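The reducer behind state.customModules is not shown in the article; a plausible sketch (the action type name is an assumption) simply stores the metadata list delivered by customModulesActions.set:

```javascript
// Hypothetical reducer backing state.customModules (not shown in the
// article): it stores the module metadata list from the saga above.
const SET_CUSTOM_MODULES = 'CUSTOM_MODULES/SET'; // assumed action type

const customModulesReducer = (state = [], action) => {
    switch (action.type) {
        case SET_CUSTOM_MODULES:
            return action.payload;
        default:
            return state;
    }
};
```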
The main application component listens to changes, and when it detects changes in modules, it reacts to them (app.js):
```javascript
const mapStateToProps = (state) => ({
    bootstrappedModules: state.bootstrap.bootstrappedModules,
    permissions: state.permissions,
    modules: state.customModules, // the loaded module metadata lands here
    users: state.users
});

@connect(mapStateToProps)
export class App extends Component {
```
There are two places adapted to module changes:

```javascript
<CustomLinks modules={modules} restrictedModuleNames={restrictedModuleNames} />
```
This part renders additional navigation links that the user can click to reach each module.
```javascript
<CustomRoutes
    bootstrappedModules={bootstrappedModules}
    modules={modules}
    restrictedModuleNames={restrictedModuleNames}
/>
```
This component actually loads the plugin through an API call, attaches it to the HTML page, pulls all sources from that bundle, and connects them to the appropriate parts of the core module. CustomRoutes uses the LoadModule component to perform these actions.

Let’s walk through this process piece by piece. First, we load the script and wait until it is mounted to the page:
```javascript
useEffect(() => {
    if (R.not(R.includes(moduleName, bootstrappedModules))) {
        const script = document.createElement('script');
        script.src = scriptUrl; // the backend URL from which the custom module's JS bundle is served
        script.async = true; // no need to load it synchronously
        script.onload = () => {
            setLoadedModuleName(moduleName); // once the module is loaded, we can connect it to the core module
        };
        document.body.appendChild(script);
    } else {
        setLoadedModuleName(moduleName);
    }
    return () => {
        if (context.saga) { // once the component is unmounted, we also cancel all running sagas
            context.saga.cancel();
        }
    };
}, []);
```
Then, we take the code out of the mounted bundle and incorporate it into the core module:
```javascript
if (R.complement(R.isNil)(loadedModuleName)) {
    const {
        component: Component,
        reducers,
        saga
    } = window[loadedModuleName].default;
    if (R.not(R.includes(loadedModuleName, bootstrappedModules))) {
        if (saga) { // exposing sagas is optional for a module
            context.saga = window.dsfaSaga.run(saga); // window.dsfaSaga is the joint middleware created in the core module; through it we run the sagas brought by this custom module
        }
        if (reducers) { // exposing reducers is optional for a module/plugin
            window.dsfaReducerRegistry.register(reducers); // window.dsfaReducerRegistry is a joint registry created by the core module, used to register/unregister reducers across the entire system; here we add the reducers brought by this custom module
        }
        dispatch(applicationActions.moduleBootstrapped(loadedModuleName)); // marks the current module as loaded and prevents loading it again
    }
    return <Component />;
}
```
To understand where window.dsfaReducerRegistry and window.dsfaSaga come from, have a look at the configure-store.js file:
```javascript
const sagaMiddleware = createSagaMiddleware();
window.dsfaSaga = sagaMiddleware;

export const reducerRegistry = new ReducerRegistry(allReducers);
window.dsfaReducerRegistry = reducerRegistry;
```
To make it possible to register reducers from different parts of the application, a small wrapper is created, the ReducerRegistry you just noticed:
```javascript
export class ReducerRegistry {
    constructor(initialReducers = {}) {
        this.reducers = {...initialReducers};
        this.emitChange = null;
    }

    register(newReducers) {
        this.reducers = {...this.reducers, ...newReducers};
        if (this.emitChange !== null) {
            this.emitChange(this.getReducers());
        }
    }

    getReducers() {
        return {...this.reducers};
    }

    setChangeListener(listener) {
        if (this.emitChange !== null) {
            throw new Error('Can only set the listener for a ReducerRegistry once.');
        }
        this.emitChange = listener;
    }
}
```
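To see the registry in action, here is a runnable usage sketch (the class is repeated in a condensed form so the snippet is self-contained; the reducer bodies are placeholders):

```javascript
// Condensed copy of ReducerRegistry for a self-contained demo.
class ReducerRegistry {
    constructor(initialReducers = {}) {
        this.reducers = {...initialReducers};
        this.emitChange = null;
    }
    register(newReducers) {
        this.reducers = {...this.reducers, ...newReducers};
        if (this.emitChange !== null) {
            this.emitChange(this.getReducers());
        }
    }
    getReducers() {
        return {...this.reducers};
    }
    setChangeListener(listener) {
        this.emitChange = listener;
    }
}

// Core module boots with its own slices...
const registry = new ReducerRegistry({users: (state = []) => state});

let latestReducers = null;
registry.setChangeListener((reducers) => {
    latestReducers = reducers; // the store calls replaceReducer at this point
});

// ...and a custom module registers its slice at runtime.
registry.register({admin: (state = {}) => state});
```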
And when the store is created, you need to create the root reducer:
```javascript
export const configureStore = (history) => {
    const mainReducer = configureReducers(reducerRegistry.getReducers());
    const store = createStoreWithMiddleware(history)(mainReducer);
    reducerRegistry.setChangeListener((reducers) => {
        store.replaceReducer(configureReducers(reducers));
    });
    window.dsfaStore = store;
    return store;
};
```
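configureReducers is not shown in the article; assuming it essentially combines the slice reducers into one root reducer (like Redux's combineReducers), a minimal hand-rolled stand-in looks like this:

```javascript
// Minimal stand-in for configureReducers (an assumption about its behavior):
// it folds a map of slice reducers into a single root reducer and preserves
// state identity when no slice changed, like a simplified combineReducers.
const combine = (reducers) => (state = {}, action) => {
    const next = {};
    let changed = false;
    for (const key of Object.keys(reducers)) {
        next[key] = reducers[key](state[key], action);
        changed = changed || next[key] !== state[key];
    }
    return changed ? next : state;
};
```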
As new reducers arrive, the root reducer is replaced in the store. The same approach shares the Redux store itself with custom modules via window. Why use window? Because that is how you can share objects between modules that are compiled individually by webpack.
Let’s have a look at the configuration required in the custom module's webpack.config.plugin.common.js:
```javascript
output: {
    filename: 'index.js',
    globalObject: 'window', // this is how parts of the system are shared between bundles
    library: pluginName,
    libraryTarget: 'umd', // ! globalObject only works if the target is umd
    path: `${paths.modulesDistDir}/${moduleName}`,
    publicPath: '/'
},
```
We also need externals, so third-party libraries are not duplicated:
```javascript
const createItem = (key, globalName) => ({
    [key]: {
        amd: key,
        commonjs: key,
        commonjs2: key,
        root: globalName
    }
});

export const externals = {
    ...createItem('react', 'React'),
    ...createItem('react-dom', 'ReactDOM')
};
```
This strips React and react-dom out of the custom plugin's bundle. React then doesn't complain about two versions of itself in the application (react-redux crashes in that case), and the plugin stays really slim compared with the fully packed version you would ship with an iframe approach. The list of externals will grow with the number of libraries you use.
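To make the shape concrete, here is a runnable check of what the two createItem calls expand to (createItem is repeated so the snippet is self-contained):

```javascript
// The createItem helper produces one webpack externals entry per library;
// spreading two calls yields the full externals map shown in the config.
const createItem = (key, globalName) => ({
    [key]: {
        amd: key,
        commonjs: key,
        commonjs2: key,
        root: globalName
    }
});

const externals = {
    ...createItem('react', 'React'),
    ...createItem('react-dom', 'ReactDOM')
};
```

For the `root` target, webpack resolves the import from the named global (window.React, window.ReactDOM), which is exactly what the core module sets up next.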
On the side of the core module, you need to register these libraries globally in globals.js:
```javascript
import React from 'react';
import ReactDOM from 'react-dom';

window.React = React;
window.ReactDOM = ReactDOM;
```
If a plugin can't be loaded successfully, we have to isolate it, catch the exception, and show the user information about it so the whole application doesn't crash. For that, the CatchError component is created:
```javascript
export default class CatchError extends PureComponent {
    static propTypes = {children: PropTypes.element};

    constructor(props) {
        super(props);
        this.state = {hasError: false};
    }

    componentDidCatch() { // this is what keeps the application alive
        this.setState({hasError: true});
    }

    render() {
        if (this.state.hasError) {
            return <h1>Plugin has not been loaded successfully due to found errors</h1>;
        }
        return this.props.children;
    }
}
```
These are, in essence, all of the major parts of the application. The rest of the code makes the application more or less appealing with a bare minimum of functionality. I encourage you to download the working scenario; setup is super easy. Play around with it! Set yourself a goal to enhance it with a small feature and feel how the development goes for you. For example, imagine you are a developer who is asked to add a "history" module with its own tab that shows all performed actions (adding/removing users).
Easy level
With the current implementation, what is the bare minimum you would modify in the core module, and what would you add to the history module? How would you make them communicate efficiently?
Medium level
How would the core module have to be rewritten so that the history module, or any other new functionality, won't require changes in the core module at all? That is the ultimate goal of this architecture.
I'm curious and excited to see your forks, PRs, and smart ideas. Looking forward to your feedback and comments as well!