
JavaScript

JavaScript (JS) is a multi-paradigm, prototype-based programming language that allows engineers to build and ship complex features that run within web browsers. Its versatility makes it the default choice for front-end work unless a project has requirements that call for something more specialized. In this Zone, we provide resources that cover popular JS frameworks, server-side applications, supported data types, and other useful topics for front-end engineers.

Latest Refcards and Trend Reports
Trend Report: Modern Web Development
Refcard #363: JavaScript Test Automation Frameworks
Refcard #288: Getting Started With Low-Code Development

DZone's Featured JavaScript Resources

[DZone Survey] Share Your Expertise and Take our 2023 Web, Mobile, and Low-Code Apps Survey


By Caitlin Candelmo
Do you consider yourself a developer? If "yes," then this survey is for you. We need you to share your knowledge on web and mobile development, how (if) you leverage low code, scalability challenges, and more. The research covered in our Trend Reports depends on your feedback and helps shape the report. Our April Trend Report this year focuses on you, the developer. This Trend Report explores development trends and how they relate to scalability within organizations, highlighting application challenges, code, and more.  And this is where we could use your insights! We're asking for ~10 minutes of your time to share your experience.Let us know your thoughts on the future of the developer! And enter for a chance to win one of 10 $50 gift cards! Take Our Survey Over the coming weeks, we will compile and analyze data from hundreds of DZone members to help inform the "Key Research Findings" for our upcoming April Trend Report, Development at Scale: An Exploration of Mobile, Web, and Low-Code Applications. Your responses help shape the narrative of our Trend Reports, so we cannot do this without you. The DZone Publications team thanks you in advance for all your help! More
Deploy a Node.js App to AWS in an EC2 Server


By Rahul Shivalkar
There are multiple ways you can deploy your Nodejs app, be it On-Cloud or On-Premises. However, it is not just about deploying your application, but deploying it correctly. Security is also an important aspect that must not be ignored, and if you do so, the application won’t stand long, meaning there is a high chance of it getting compromised. Hence, here we are to help you with the steps to deploy a Nodejs app to AWS. We will show you exactly how to deploy a Nodejs app to the server using Docker containers, RDS Amazon Aurora, Nginx with HTTPS, and access it using the Domain Name. Tool Stack To Deploy a Node.js App to AWS Nodejs sample app: A sample Nodejs app with three APIs viz, status, insert, and list. These APIs will be used to check the status of the app, insert data in the database and fetch and display the data from the database. AWS EC2 instance: An Ubuntu 20.04 LTS Amazon Elastic Compute Cloud (Amazon EC2) instance will be used to deploy the containerized Nodejs App. We will install Docker in this instance on top of which the containers will be created. We will also install a MySQL Client on the instance. A MySQL client is required to connect to the Aurora instance to create a required table. AWS RDS Amazon Aurora: Our data will be stored in AWS RDS Amazon Aurora. We will store simple fields like username, email-id, and age will be stored in the AWS RDS Amazon Aurora instance.Amazon Aurora is a MySQL and PostgreSQL-compatible relational database available on AWS. Docker: Docker is a containerization platform to build Docker Images and deploy them using containers. We will deploy a Nodejs app to the server, Nginx, and Certbot as Docker containers. Docker-Compose: To spin up the Nodejs, Nginx, and Certbot containers, we will use Docker-Compose. Docker-Compose helps reduce container deployment and management time. Nginx: This will be used to enable HTTPS for the sample Nodejs app and redirect all user requests to the Nodejs app. It will act as a reverse proxy to redirect user requests to the application and help secure the connection by providing the configuration to enable SSL/HTTPS. Certbot: This will enable us to automatically use “Let’s Encrypt” for Domain Validation and issuing SSL certificates. Domain: At the end of the doc, you will be able to access the sample Nodejs Application using your domain name over HTTPS, i.e., your sample Nodejs will be secured over the internet. PostMan: We will use PostMan to test our APIs, i.e., to check status, insert data, and list data from the database. As I said, we will “deploy a Nodejs app to the server using Docker containers, RDS Amazon Aurora, Nginx with HTTPS, and access it using the Domain Name.” Let’s first understand the architecture before we get our hands dirty. Architecture Deploying a Nodejs app to an EC2 instance using Docker will be available on port 3000. This sample Nodejs app fetches data from the RDS Amazon Aurora instance created in the same VPC as that of the EC2 instance. An Amazon Aurora DB instance will be private and, hence, accessible within the same VPC. The Nodejs application deployed on the EC2 instance can be accessed using its public IP on port 3000, but we won’t. Accessing applications on non-standard ports is not recommended, so we will have Nginx that will act as a Reverse Proxy and enable SSL Termination. Users will try to access the application using the Domain Name and these requests will be forwarded to Nginx. 
Nginx will check the request, and, based on the API, it will redirect that request to the Nodejs app. The application will also be terminated with the SSL. As a result, the communication between the client and server will be secured and protected. Here is the architecture diagram that gives the clarity of deploying a Nodejs app to AWS: Prerequisites Before we proceed to deploying a Nodejs app to AWS, it is assumed that you already have the following prerequisites: AWS account PostMan or any other alternative on your machine to test APIs. A registered Domain in your AWS account. Create an Ubuntu 20.04 LTS EC2 Instance on AWS Go to AWS’ management console sign-in page and log into your account. After you log in successfully, go to the search bar and search for “EC2.” Next, click on the result to visit the EC2 dashboard to create an EC2 instance: Here, click on “Launch instances” to configure and create an EC2 instance: Select the “Ubuntu Server 20.04 LTS” AMI: I would recommend you select t3.small only for test purposes. This will have two CPUs and 2GB RAM. You can choose the instance type as per your need and choice: You can keep the default settings and proceed ahead. Here, I have selected the default VPC. If you want, you can select your VPC. Note: Here, I will be creating an instance in the public subnet: It’s better to put a larger disk space at 30GB. The rest can be the default: Assign a “Name” and “Environment” tag to any values of your choice. You may even skip this step: Allow the connection to port 22 only from your IP. If you allow it from 0.0.0.0/0, your instance will allow anyone on port 22: Review the configuration once, and click on “Launch” if everything looks fine to create an instance: Before the instance gets created, it needs a key-pair. You can create a new key-pair or use the existing one. Click on the “Launch instances” button that will initiate the instance creation: To go to the console and check your instance, click on the “View instances” button: Here, you can see that the instance has been created and is in the “Initiating” phase. Within a minute or two, you can see your instance up and running. Meanwhile, let’s create an RDS instance: Create an RDS Aurora With a MySQL Instance on AWS Go to the search bar at the top of the page and search for “RDS.” Click on the result to visit the “RDS Dashboard.” On the RDS Dashboard, click on the “Create database” button to configure and create the RDS instance: Choose the “Easy create” method, “Amazon Aurora” engine type, and the “Dev/Test” DB instance size as follows: Scroll down a bit and specify the “DB cluster identifier” as “my-Nodejs-database.” You can specify any name of your choice as it is just a name given to the RDS instance; however, I would suggest using the same name so you do not get confused while following the next steps. Also, specify a master username as “admin,” its password, and then click on “Create database.” This will initiate the RDS Amazon Aurora instance creation. Note: For production or live environments, you must not set simple usernames and passwords: Here, you can see that the instance is in the “Creating” state. In around 5-10 minutes, you should have the instance up and running: Make a few notes here: The RDS Amazon Aurora instance will be private by default, which means the RDS Amazon Aurora instance will not be reachable from the outside world and will only be available within the VPC. The EC2 instance and RDS instance belong to the same VPC. 
The RDS instance is reachable from the EC2 instance.

Install Dependencies on the EC2 Instance
Now, you can connect to the instance we created. I will not get into details on how to connect to the instance, and I believe that you already know it.

MySQL Client
We will need a MySQL client to connect to the RDS Amazon Aurora instance and create a database in it. Connect to the EC2 instance and execute the following commands from it: sudo apt update sudo apt install mysql-client

Create a Table
We will need a table in our RDS Amazon Aurora instance to store our application data. To create a table, connect to the Amazon RDS Aurora instance using the MySQL client we installed on the EC2 instance in the previous step. Copy the Database Endpoint from the Amazon Aurora instance. Execute the following command with the correct values: mysql -u <user-name> -p<password> -h <host-endpoint> Here, my command looks as follows: mysql -u admin -padmin1234 -h (here). Once you get connected to the Amazon RDS Aurora instance, execute the following commands to create a table named "users:" show databases; use main; CREATE TABLE IF NOT EXISTS users(id int NOT NULL AUTO_INCREMENT, username varchar(30), email varchar(255), age int, PRIMARY KEY(id)); select * from users; Refer to the following screenshot to understand command executions:

Create an Application Directory
Now, let's create a directory where we will store all our codebase and configuration files: pwd cd /home/ubuntu/ mkdir Nodejs-docker cd Nodejs-docker

Clone the Code Repository on the EC2 Instance
Clone my GitHub repository containing all the code. This is an optional step; I have included all the code in this document: pwd cd /home/ubuntu/ git clone cp /home/ubuntu/DevOps/AWS/Nodejs-docker/* /home/ubuntu/Nodejs-docker Note: This is an optional step. If you copy all the files from the repository to the application directory, you do not need to create files in the upcoming steps; however, you will still need to make the necessary changes.

Deploying: Why Should You Use Docker in Your EC2 Instance?
Docker is a containerization tool used to package our software application into an image that can be used to create Docker containers. Docker helps to build, share, and deploy our applications easily. The first step of Dockerization is installing Docker.

Install Docker
Check the Linux version: cat /etc/issue Update the apt package index: sudo apt-get update Install packages to allow apt to use a repository over HTTPS: sudo apt-get install apt-transport-https ca-certificates curl gnupg lsb-release Add Docker's official GPG key: curl -fsSL (here) | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg Set up the stable repository: echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] (here) $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null Update the apt package index: sudo apt-get update Install the latest version of Docker Engine and containerd: sudo apt-get install docker-ce docker-ce-cli containerd.io Check the Docker version: docker --version Manage Docker as a non-root user: Create the 'docker' group: sudo groupadd docker Add your user to the docker group: sudo usermod -aG docker <your-user-name> Exit: exit Log back in to the terminal. Verify that you can run Docker commands without sudo: docker run hello-world Upon executing the above run command, you should see the output as follows:
Refer to the following screenshot to see the command that I have executed:

Dockerize Your Node.js Application in the EC2 Instance
Once you have Docker installed, the next step is to Dockerize the app. Dockerizing a Nodejs app means writing a Dockerfile with a set of instructions to create a Docker image. Let's create the Dockerfile and a sample Nodejs app: pwd cd /home/ubuntu/Nodejs-docker

Create the Dockerfile and paste the following in it; alternatively, you can copy the content from my GitHub repository here: vim Dockerfile: #Base Image node:12.18.4-alpine FROM node:12.18.4-alpine #Set working directory to /app WORKDIR /app #Set PATH /app/node_modules/.bin ENV PATH /app/node_modules/.bin:$PATH #Copy package.json in the image COPY package.json ./ #Install Packages RUN npm install express --save RUN npm install mysql --save #Copy the app COPY . ./ #Expose application port EXPOSE 3000 #Start the app CMD ["node", "index.js"]

Create index.js and paste the following in it; alternatively, you can copy the content from my GitHub repository here. This will be our sample Nodejs app: vim index.js: const express = require('express'); const app = express(); const port = 3000; const mysql = require('mysql'); const con = mysql.createConnection({ host: "my-Nodejs-database.cluster-cxxjkzcl1hwb.eu-west3.rds.amazonAWS.com", user: "admin", password: "admin1234" }); app.get('/status', (req, res) => res.send({status: "I'm up and running"})); app.listen(port, () => console.log(`Dockerized Nodejs Applications is listening on port ${port}!`)); app.post('/insert', (req, res) => { if (req.query.username && req.query.email && req.query.age) { console.log('Received an insert call'); con.connect(function(err) { con.query(`INSERT INTO main.users (username, email, age) VALUES ('${req.query.username}', '${req.query.email}', '${req.query.age}')`, function(err, result, fields) { if (err) res.send(err); if (result) res.send({username: req.query.username, email: req.query.email, age: req.query.age}); if (fields) console.log(fields); }); }); } else { console.log('Something went wrong, Missing a parameter'); } }); app.get('/list', (req, res) => { console.log('Received a list call'); con.connect(function(err) { con.query(`SELECT * FROM main.users`, function(err, result, fields) { if (err) res.send(err); if (result) res.send(result); }); }); });

In the above file, change the values of the following variables with the ones applicable to your RDS Amazon Aurora instance: host: (here) user: "admin" password: "admin1234"

Create package.json and paste the following in it; alternatively, you can copy the content from my GitHub repository here: vim package.json: { "name": "Nodejs-docker", "version": "12.18.4", "description": "Nodejs on ec2 using docker container", "main": "index.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1" }, "author": "Rahul Shivalkar", "license": "ISC" }
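A quick aside on the index.js shown above: the /insert route builds its SQL by interpolating request parameters straight into the query string, which leaves it open to SQL injection. Purely as an illustrative sketch (not part of the original repository), the same route could use the mysql driver's placeholder syntax so the driver escapes user input:

// Hypothetical hardened version of the /insert route from the sample index.js above.
// The '?' placeholders let the mysql driver escape the values for us.
app.post('/insert', (req, res) => {
  const { username, email, age } = req.query;
  if (!username || !email || !age) {
    return res.status(400).send({ error: 'Missing a parameter' });
  }
  con.query(
    'INSERT INTO main.users (username, email, age) VALUES (?, ?, ?)',
    [username, email, age],
    (err, result) => {
      if (err) return res.status(500).send(err);
      res.send({ username, email, age });
    }
  );
});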
Update the AWS Security Group
To access the application, we need to add a rule in the "Security Group" to allow connections on port 3000. As I said earlier, we can access the application on port 3000, but it is not recommended. Keep reading to understand our recommendations:
1. Go to the "EC2 dashboard," select the instance, switch to the "Security" tab, and then click on the "Security groups" link.
2. Select the "Inbound rules" tab and click on the "Edit inbound rules" button.
3. Add a new rule that will allow external connections from "MyIp" on port "3000."

Deploy the Node.js Server on the EC2 Server (Instance)
1. Let's build a Docker image from the code that we have: cd /home/ubuntu/Nodejs-docker docker build -t Nodejs .
2. Start a container using the image we just built and expose it on port 3000: docker run --name Nodejs -d -p 3000:3000 Nodejs
3. You can see the container is running: docker ps
4. You can even check the logs of the container: docker logs Nodejs Now we have our Nodejs App Docker Container running.
5. Now, you can access the application from your browser on port 3000: Check the status of the application on the /status API using the browser. You can insert some data in the application on the /insert API using the Postman app with a POST request. You can list the data from your application by using the /list API from the browser. Alternatively, you can use the curl command from within the EC2 instance to check status, insert data, and list data: curl -XGET "here" curl -XPOST "here" Stop and remove the container: docker stop Nodejs docker rm Nodejs

In this section, we tried to access the APIs available for the application directly using the Public IP:Port of the EC2 instance. However, exposing non-standard ports to the external world in the Security Group is not at all recommended. Also, we tried to access the application over the HTTP protocol, which means the communication that took place from the "Browser" to the "Application" was not secure and an attacker could read the network packets. To overcome this scenario, it is recommended to use Nginx.

Nginx Setup
Let's create an Nginx conf that will be used within the Nginx container through a Docker Volume. Create a file and copy the following content in the file; alternatively, you can copy the content from here as well: cd /home/ubuntu/Nodejs-docker mkdir nginx-conf vim nginx-conf/nginx.conf server { listen 80; listen [::]:80; location ~ /.well-known/acme-challenge { allow all; root /var/www/html; } location / { rewrite ^ https://$host$request_uri? permanent; } } server { listen 443 ssl http2; listen [::]:443 ssl http2; server_name Nodejs.devopslee.com www.Nodejs.devopslee.com; server_tokens off; ssl_certificate /etc/letsencrypt/live/Nodejs.devopslee.com/fullchain.pem; ssl_certificate_key /etc/letsencrypt/live/Nodejs.devopslee.com/privkey.pem; ssl_buffer_size 8k; ssl_dhparam /etc/ssl/certs/dhparam-2048.pem; ssl_protocols TLSv1.2 TLSv1.1 TLSv1; ssl_prefer_server_ciphers on; ssl_ciphers ECDH+AESGCM:ECDH+AES256:ECDH+AES128:DH+3DES:!ADH:!AECDH:!MD5; ssl_ecdh_curve secp384r1; ssl_session_tickets off; ssl_stapling on; ssl_stapling_verify on; resolver 8.8.8.8; location / { try_files $uri @Nodejs; } location @Nodejs { proxy_pass http://Nodejs:3000; add_header X-Frame-Options "SAMEORIGIN" always; add_header X-XSS-Protection "1; mode=block" always; add_header X-Content-Type-Options "nosniff" always; add_header Referrer-Policy "no-referrer-when-downgrade" always; add_header Content-Security-Policy "default-src * data: 'unsafe-eval' 'unsafe-inline'" always; } root /var/www/html; index index.html index.htm index.nginx-debian.html; }

In the above file, make changes in the three lines mentioned below. Replace my subdomain.domain, i.e., Nodejs.devopslee, with the one that you want and have: server_name: (here) ssl_certificate: /etc/letsencrypt/live/Nodejs.devopslee.com/fullchain.pem; ssl_certificate_key: /etc/letsencrypt/live/Nodejs.devopslee.com/privkey.pem; Why do you need Nginx in front of the Node.js service?
Our Nodejs application runs on a non-standard port, 3000. Nodejs provides a way to use HTTPS; however, configuring the protocol and managing SSL certificates that expire periodically within the application code base is something we should not have to be concerned about. To overcome these scenarios, we need to have Nginx in front of it with SSL termination and forward user requests to Nodejs. Nginx is a special type of web server that can act as a reverse proxy, load balancer, mail proxy, and HTTP cache. Here, we will be using Nginx as a reverse proxy to redirect requests to our Nodejs application and have SSL termination.

Why not Apache? Apache is also a web server and can act as a reverse proxy. It also supports SSL termination; however, there are a few things that differentiate Nginx from Apache. For the following reasons, Nginx is mostly preferred over Apache. Let's see them in short: Nginx has a single or a low number of processes and is asynchronous and event-based, whereas Apache tries to make new processes and new threads for every request in every connection. Nginx is lightweight, scalable, and easy to configure. On the other hand, Apache is great but has a higher barrier to learning.

Docker-Compose
Let's install docker-compose as we will need it: Download the current stable release of Docker Compose: sudo curl -L "(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose Apply executable permissions to the docker-compose binary we just downloaded in the above step: sudo chmod +x /usr/local/bin/docker-compose Test to see if the installation was successful by checking the docker-compose version: docker-compose --version

Create a docker-compose.yaml file; alternatively, you can copy the content from my GitHub repository here. This will be used to spin up the Docker containers of the application tech stack we have: cd /home/ubuntu/Nodejs-docker vim docker-compose.yml version: '3' services: Nodejs: build: context: . dockerfile: Dockerfile image: Nodejs container_name: Nodejs restart: unless-stopped networks: - app-network webserver: image: nginx:mainline-alpine container_name: webserver restart: unless-stopped ports: - "80:80" - "443:443" volumes: - web-root:/var/www/html - ./nginx-conf:/etc/nginx/conf.d - certbot-etc:/etc/letsencrypt - certbot-var:/var/lib/letsencrypt - dhparam:/etc/ssl/certs depends_on: - Nodejs networks: - app-network certbot: image: certbot/certbot container_name: certbot volumes: - certbot-etc:/etc/letsencrypt - certbot-var:/var/lib/letsencrypt - web-root:/var/www/html depends_on: - webserver command: certonly --webroot --webroot-path=/var/www/html --email my@email.com --agree-tos --no-eff-email --staging -d Nodejs.devopslee.com -d www.Nodejs.devopslee.com #command: certonly --webroot --webroot-path=/var/www/html --email my@email.com --agree-tos --no-eff-email --force-renewal -d Nodejs.devopslee.com -d www.Nodejs.devopslee.com volumes: certbot-etc: certbot-var: web-root: driver: local driver_opts: type: none device: /home/ubuntu/Nodejs-docker/views/ o: bind dhparam: driver: local driver_opts: type: none device: /home/ubuntu/Nodejs-docker/dhparam/ o: bind networks: app-network: driver: bridge

In the above file, make changes in the line mentioned below. Replace my subdomain.domain, i.e., Nodejs.devopslee, with the one you want and have. Also change the email address to your personal one: --email EMAIL: Email used for registration and recovery contact.
command: certonly --webroot --webroot-path=/var/www/html --email my@email.com --agree-tos --no-eff-email --staging -d Nodejs.devopslee.com -d www.Nodejs.devopslee.com

Update the AWS Security Groups
This time, expose ports 80 and 443 in the security group attached to the EC2 instance. Also, remove 3000 since it is not necessary because the application works through port 443.

Include the DNS Change
Here, I have created a sub-domain "here" that will be used to access the sample Nodejs application using the domain name rather than accessing it using an IP. You can create your sub-domain on AWS if you already have your domain: Create 2 "Type A Recordsets" in the hosted zone with the value as the EC2 instance's public IP. One Recordset will be "subdomain.domain.com" and the other will be "www.subdomain.domain.com." Here, I have created "Nodejs.devopslee.com" and "www.Nodejs.devopslee.com," both pointing to the Public IP of the EC2 instance. Note: I have not assigned any Elastic IP to the EC2 instance. It is recommended to assign an Elastic IP and then use it in the Recordset so that when you restart your EC2 instance, you don't need to update the IP in the Recordset, because public IPs change after the EC2 instance is restarted. Now, copy the values of the "Type NS Recordset"; we will need these in the next steps. Go to the "Hosted zone" of your domain and create a new "Record" with your "subdomain.domain.com," adding the NS values you copied in the previous step. Now, you have a sub-domain that you can use to access your application. In my case, I can use "Nodejs.devopslee.com" to access the Nodejs application. We are not done yet. Now, the next step is to secure our Nodejs web application.

Include the SSL Certificate
Let's generate our key that will be used in Nginx: cd /home/ubuntu/Nodejs-docker mkdir views mkdir dhparam sudo openssl dhparam -out /home/ubuntu/Nodejs-docker/dhparam/dhparam-2048.pem 2048

Deploy the Nodejs App to the EC2 Instance
We are all set to start our Nodejs app using docker-compose. This will start our Nodejs app on port 3000, and Nginx with SSL on ports 80 and 443. Nginx will redirect requests to the Nodejs app when accessed using the domain. It will also have a Certbot client that will enable us to obtain our certificates. docker-compose up After you hit the above command, you will see some output as follows. You must see a message such as "Successfully received certificates." Note: The above docker-compose command will start containers and will stay attached to the terminal. We have not used the -d option to detach it from the terminal. You are all set; now hit the URL in the browser and you should have your Nodejs application available on HTTPS. You can also try to hit the application using the curl command: List the data from the application: curl (here) Insert an entry in the application: curl -XPOST (here) Again, list the data to verify whether the data has been inserted or not: curl (here) Check the status of the application: (Here) Hit the URL in the browser to get a list of entries in the database: (Here)

Auto-Renewal of SSL Certificates
Certificates we generate using "Let's Encrypt" are valid for 90 days, so we need to have a way to renew our certificates automatically so that we don't end up with expired certificates. To automate this process, let's create a script that will renew the certificates for us and a cronjob to schedule the execution of this script.
Create a script with --dry-run to test our script: vim renew-cert.sh #!/bin/bash COMPOSE="/usr/local/bin/docker-compose --no-ansi" DOCKER="/usr/bin/docker" cd /home/ubuntu/Nodejs-docker/ $COMPOSE run certbot renew --dry-run && $COMPOSE kill -s SIGHUP webserver $DOCKER system prune -af Change the permissions of the script to make it executable: chmod 774 renew-cert.sh Create a cronjob: sudo crontab -e */5 * * * * /home/ubuntu/Nodejs-docker/renew-cert.sh >> /var/log/cron.log 2>&1 List the cronjobs: sudo crontab -l Check the logs of the cronjob after five minutes, as we have set the cronjob to be executed every fifth minute: tail -f /var/log/cron.log

In the above screenshot, you can see a "Simulating renewal of an existing certificate…" message. This is because we have specified the "--dry-run" option in the script. Let's remove the "--dry-run" option from the script: vim renew-cert.sh #!/bin/bash COMPOSE="/usr/local/bin/docker-compose --no-ansi" DOCKER="/usr/bin/docker" cd /home/ubuntu/Nodejs-docker/ $COMPOSE run certbot renew && $COMPOSE kill -s SIGHUP webserver $DOCKER system prune -af This time you won't see such a "Simulating renewal of an existing certificate…" message. Instead, the script will check whether there is any need to renew the certificates; if required, it will renew them, otherwise it will skip them and say "Certificates not yet due for renewal."

What Is Next on How To Deploy the Nodejs App to AWS?
We are done with setting up our Nodejs application using Docker on an AWS EC2 instance; however, there are other things that come into the picture when you want to deploy a highly available application for production and other environments. The next step is to use an orchestrator, like ECS or EKS, to manage our Nodejs application at the production level. Replication, auto-scaling, load balancing, traffic routing, and monitoring container health do not come out of the box with Docker and Docker-Compose. For managing containers and microservices architecture at scale, you need a container orchestration tool like ECS or EKS. Also, we did not use any Docker repository to store our Nodejs app Docker image. You can use AWS ECR, a fully managed AWS container registry offering high-performance hosting.

Conclusion
Deploying a Nodejs app to AWS does not just mean creating a Nodejs application and deploying it on an AWS EC2 instance with a self-managed database. There are various aspects like containerizing the Nodejs app, SSL termination, and a domain for the app that come into the picture when you want to speed up your software development, deployment, security, reliability, and data redundancy. In this article, we saw the steps to dockerize the sample Nodejs application, use AWS RDS Amazon Aurora, and deploy the Nodejs app to an EC2 instance using Docker and Docker-Compose. We enabled SSL termination for the sub-domain used to access the Nodejs application. We saw the steps to automate domain validation and SSL certificate creation using Certbot, along with a way to automate the renewal of certificates that are valid for 90 days. This is enough to get started with a sample Nodejs application; however, when it comes to managing your real-time applications, hundreds of microservices, thousands of containers, volumes, networking, secrets, and egress-ingress, you need a container orchestration tool. There are various tools, like self-hosted Kubernetes, AWS ECS, and AWS EKS, that you can leverage to manage the container life cycle in your real-world applications.
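One small addition, not covered in the article above, that pays off once you hand the container to an orchestrator such as ECS or EKS: a dedicated health endpoint that the orchestrator or load balancer can probe. A minimal sketch, reusing the Express app and mysql connection from the sample index.js (the route name and response shape are assumptions, not part of the original project):

// Hypothetical liveness/readiness probe for the sample app.
// An ALB target group or an ECS/EKS health check can poll this path.
app.get('/health', (req, res) => {
  con.ping((err) => {
    if (err) {
      return res.status(500).send({ status: 'degraded', db: 'unreachable' });
    }
    res.send({ status: 'ok' });
  });
});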
32 Best JavaScript Snippets
By Rahul .
Why and How To Create an Event Bus in Vue.js 3
By Valerio Barbera
Shallow and Deep Copies in JavaScript: What’s the Difference?
By Janki Mehta
Migrating From MySQL to YugabyteDB Using YugabyteDB Voyager

In this article, I’m going to demonstrate how you can migrate a comprehensive web application from MySQL to YugabyteDB using the open-source data migration engine YugabyteDB Voyager. Nowadays, many people migrate their applications from traditional, single-server relational databases to distributed database clusters. This helps improve availability, scalability, and performance. Migrating to YugabyteDB allows engineers to use a familiar SQL interface while benefiting from the data resiliency and performance characteristics of distributed databases.

YugaSocial Application
I’ve developed an application called YugaSocial, built to run on MySQL. The YugaSocial application is a Facebook clone with the ability to make posts, follow users, comment on posts, and more! Let’s start by deploying and connecting to a Google Cloud SQL database running MySQL. Later, we’ll migrate our data to a multi-node YugabyteDB Managed cluster.

Getting Started With MySQL
We could run MySQL on our machines using a local installation or in Docker, but I’m going to demonstrate how to migrate a database hosted on the Google Cloud Platform (GCP) to YugabyteDB Managed.

Setting Up Google Cloud SQL
I’ve deployed a MySQL instance on Google Cloud SQL named yugasocial and added my public IP address to the authorized networks list so I can connect directly from my machine. While beneficial for demonstration purposes, I’d recommend connecting securely from inside a VPC, with SSL certificates to properly secure your data transfers.

Connecting YugaSocial to MySQL in Node.js
Connecting to our MySQL instance in the cloud is easy with the MySQL driver for Node.js. This is an application code snippet that connects to the MySQL instance: JavaScript // connect.js ... import mysql from "mysql"; if (process.env.DB_TYPE === "mysql") { const pool = mysql.createPool({ host: process.env.DB_HOST, port: process.env.DB_PORT, user: process.env.DB_USER, password: process.env.DB_PASSWORD, database: process.env.DB_NAME, connectionLimit: 100 }); } I’ve created a connection pool with up to 100 established connections. By setting environment variables with our Google Cloud SQL instance configuration and running the application, we can confirm that our database has been configured properly: Shell > DB_TYPE=mysql DB_USER=admin DB_HOST=[HOST] DB_PASSWORD=[PASSWORD] node index.js Connection to MySQL verified. Server running on port 8800. After verifying our MySQL database running in the cloud, we can start migrating to YugabyteDB Managed.

Setting Up YugabyteDB Managed
It takes less than five minutes to get started with YugabyteDB Managed. First, create an account, then follow the steps to create a YugabyteDB cluster. I’ve chosen to deploy a three-node cluster to GCP, in the us-west-1 region. This configuration will provide fault tolerance across availability zones. Add your IP address to the cluster allow list so you can connect from your machine to the remote database, and download the database credentials before creating your cluster. Once our cluster has been deployed, we’re ready to begin migrating with YugabyteDB Voyager.

Migrating to YugabyteDB
Having verified our MySQL deployment, it’s time to migrate from Cloud SQL to YugabyteDB using the YugabyteDB Voyager CLI. YugabyteDB Voyager is a powerful, open-source, data-migration engine, which manages the entire lifecycle of data migration. After installing YugabyteDB Voyager, we’ll begin by creating users in our source and target databases and granting them roles.
I’ve chosen to use the mysqlsh command-line utility to connect to my cloud instance, but Google provides multiple connection options. 1. Create the ybvoyager user in Cloud SQL and grant permissions: SQL > mysqlsh root@CLOUD_SQL_HOST --password='CLOUD_SQL_PASSWORD' > \sql SQL=> \use social SQL=> CREATE USER 'ybvoyager'@'%' IDENTIFIED WITH mysql_native_password BY 'Password#123'; SQL=> GRANT PROCESS ON *.* TO 'ybvoyager'@'%'; SQL=> GRANT SELECT ON social.* TO 'ybvoyager'@'%'; SQL=> GRANT SHOW VIEW ON source_db_name.* TO 'ybvoyager'@'%'; SQL=> GRANT TRIGGER ON source_db_name.* TO 'ybvoyager'@'%'; SQL=> GRANT SHOW_ROUTINE ON *.* TO 'ybvoyager'@'%'; 2. Repeat this process using the YugabyteDB Managed Cloud Shell: SQL // Optionally, you can create a database for import. Otherwise, the target database will default to 'yugabyte'. yugabyte=> CREATE DATABASE social; yugabyte=> CREATE USER ybvoyager PASSWORD 'password'; yugabyte=> GRANT yb_superuser TO ybvoyager; Now, our source and target databases are equipped to use Voyager. In order to export from Cloud SQL, we first need to create an export directory and an associated environment variable: Shell > mkdir ~/export-dir > export EXPORT_DIR=$HOME/export-dir This directory will be used as an intermediary between our source and target databases. It will house schema and data files, as well as logs, metadata, and schema analysis reports. Let’s begin migrating our database. 1. Export the schema from Google Cloud SQL: Shell > yb-voyager export schema --export-dir ~/export-dir \ --source-db-type mysql \ --source-db-host CLOUD_SQL_HOST \ --source-db-user ybvoyager \ --source-db-password 'Password#123' \ --source-db-name social export of schema for source type as 'mysql' mysql version: 8.0.26-google exporting TABLE done exporting PARTITION done exporting VIEW done exporting TRIGGER done exporting FUNCTION done exporting PROCEDURE done Exported schema files created under directory: /export-dir/schema 2. Analyze the exported schema: Shell > yb-voyager analyze-schema --export-dir ~/export-dir --output-format txt -- find schema analysis report at: /export-dir/reports/report.txt By analyzing our schema before exporting data, we have the option to make any necessary changes to our DDL statements. The schema analysis report will flag any statements that require manual intervention. In the case of YugaSocial, Voyager migrated the MySQL schema to PostgreSQL DDL without needing any manual changes. 3. Finally, export the data from Google Cloud SQL: Shell > yb-voyager export data --export-dir ~/export-dir \ --source-db-type mysql \ --source-db-host CLOUD_SQL_HOST \ --source-db-user ybvoyager \ --source-db-password 'Password#123' \ --source-db-name social export of data for source type as 'mysql' Num tables to export: 6 table list for data export: [comments likes posts relationships stories users] calculating approx num of rows to export for each table... Initiating data export. Data export started. Exported tables:- {comments, likes, posts, relationships, stories, users} TABLE ROW COUNT comments 1000 likes 502 posts 1000 relationships 1002 stories 1000 users 1004 Export of data complete ✅ After successfully exporting our schema and data, we’re ready to move our database to YugabyteDB Managed. 1. 
Import the schema to YugabyteDB Managed: Shell > yb-voyager import schema --export-dir ~/export-dir \ --target-db-host YUGABYTEDB_MANAGED_HOST \ --target-db-user ybvoyager \ --target-db-password 'password' \ --target-db-name yugabyte \ --target-db-schema social \ --target-ssl-mode require \ --start-clean schemas to be present in target database "yugabyte": [social] creating schema 'social' in target database... table.sql: CREATE TABLE comments ( id bigserial, description varchar(200) NOT NULL, crea ... table.sql: ALTER SEQUENCE comments_id_seq RESTART WITH 1; table.sql: ALTER TABLE comments ADD UNIQUE (id); table.sql: CREATE TABLE likes ( id bigserial, userid bigint NOT NULL, postid bigint NOT ... table.sql: ALTER SEQUENCE likes_id_seq RESTART WITH 1; table.sql: ALTER TABLE likes ADD UNIQUE (id); table.sql: CREATE TABLE posts ( id bigserial, description varchar(200), img varchar(200) ... ... As you can see from the terminal output, I’ve chosen to import into the social schema. If you’d like to use a different schema, you can do this using the --target-db-schema option.

2. Import the data to YugabyteDB Managed: Shell > yb-voyager import data --export-dir ~/export-dir \ --target-db-host YUGABYTEDB_MANAGED_HOST \ --target-db-user ybvoyager \ --target-db-password 'password' \ --target-db-name yugabyte \ --target-db-schema social \ --target-ssl-mode require \ --start-clean import of data in "yugabyte" database started Using 2 parallel jobs by default. Use --parallel-jobs to specify a custom value skipping already imported tables: [] Preparing to import the tables: [comments likes posts relationships stories users] All the tables are imported setting resume value for sequences YugabyteDB Voyager handles this data import with parallelism, making quick work of it.

3. To wrap things up, import indexes and triggers: Shell > yb-voyager import schema --export-dir ~/export-dir \ --target-db-host YUGABYTEDB_MANAGED_HOST \ --target-db-user ybvoyager \ --target-db-password 'password' \ --target-db-name yugabyte \ --target-db-schema social \ --target-ssl-mode require \ --start-clean \ --post-import-data INDEXES_table.sql: CREATE INDEX comments_postid ON comments (postid); INDEXES_table.sql: CREATE INDEX comments_userid ON comments (userid); INDEXES_table.sql: CREATE INDEX likes_postid ON likes (postid); ... We no longer need the ybvoyager user in YugabyteDB Managed. To change ownership of the imported objects to another user in the YugabyteDB Managed Cloud Shell, run: SQL > REASSIGN OWNED BY ybvoyager TO admin; > DROP OWNED BY ybvoyager; > DROP USER ybvoyager; It’s time to verify that our database was successfully migrated to YugabyteDB Managed, by reconfiguring our YugaSocial application.

Connecting YugaSocial to YugabyteDB Managed in Node.js
As mentioned, YugaSocial was developed to run on MySQL. However, I also added support for PostgreSQL. Since YugabyteDB is PostgreSQL-compatible, we can use the node-postgres driver for Node.js to connect to our YugabyteDB Managed cluster. In fact, Yugabyte has developed its own smart drivers, which add load-balancing capabilities to native drivers. This can drastically improve performance by avoiding excessive load on any single cluster node. After installing Yugabyte’s fork of node-postgres, we’re ready to connect to our database: JavaScript // connect.js ...
const { Pool } = require("@yugabytedb/pg"); if (process.env.DB_TYPE === "yugabyte") { const pool = new Pool({ user: process.env.DB_USER, host: process.env.DB_HOST, password: process.env.DB_PASSWORD, port: 5433, database: process.env.DB_NAME, min: 5, max: 100, ssl: { rejectUnauthorized: false } }); } This configuration is very similar to the MySQL driver. By restarting our application with the proper environment variables for our connection details, we’re able to confirm that our data was migrated successfully: Shell > DB_TYPE=yugabyte DB_USER=admin DB_HOST=[HOST] DB_PASSWORD=[PASSWORD] node index.js Our application functions just the same as before. This time I replied to Yana, to let her know that YugaSocial had officially been migrated to YugabyteDB Managed!

Conclusion
As you can see, YugabyteDB Voyager simplifies migration from MySQL to YugabyteDB. I encourage you to give it a try in your next coding adventure, whether you’re migrating from MySQL or other relational databases, like PostgreSQL or Oracle. Look out for more articles on distributed SQL and Node.js from me in the near future. Until then, don’t hesitate to reach out and keep on coding!
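The article mentions that YugaSocial supports both drivers behind the DB_TYPE environment variable. The real connect.js lives in the author's repository; purely as an illustration of the shape such a switch might take, a pool factory could look roughly like this:

// Illustrative only — one possible shape for a connect.js that supports both backends.
const mysql = require("mysql");
const { Pool } = require("@yugabytedb/pg");

function createPool() {
  if (process.env.DB_TYPE === "mysql") {
    return mysql.createPool({
      host: process.env.DB_HOST,
      port: process.env.DB_PORT,
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME,
      connectionLimit: 100
    });
  }
  // Default to the PostgreSQL-compatible YugabyteDB driver.
  return new Pool({
    user: process.env.DB_USER,
    host: process.env.DB_HOST,
    password: process.env.DB_PASSWORD,
    port: 5433,
    database: process.env.DB_NAME,
    min: 5,
    max: 100,
    ssl: { rejectUnauthorized: false }
  });
}

module.exports = { pool: createPool() };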

By Brett Hoyer
Node.js REST API Frameworks

Node.js is a popular platform for building scalable and efficient web applications, and one of its key strengths is its support for building REST APIs. With its growing ecosystem of libraries and frameworks, developers have a wide range of options for building and deploying REST APIs in Node.js. In this article, we will look closely at some of the top Node.js REST API frameworks and examine their pros, cons, and a basic example of each to help you choose the right one for your next project.

1. Express
Express is the most popular and widely used framework for building REST APIs in Node.js. It provides a simple and minimal interface for creating REST APIs, making it easy to get started. Express is also highly modular, allowing developers to easily add new functionality through middleware and plugins. This makes it a great choice for projects of all sizes, from small hobby projects to large-scale enterprise applications.
Pros: Simple and easy to use. Widely adopted and well-documented. Large and active community. Highly modular and customizable.
Cons: It can become complex for larger projects. Some developers may find the minimalist approach too limiting.
Example (JavaScript): const express = require('express'); const app = express(); app.get('/', (req, res) => { res.send('Hello World!'); }); app.listen(3000, () => { console.log('Example app listening on port 3000!'); });

2. Fastify
Fastify is a fast and low-overhead framework for building high-performance REST APIs. It offers a number of features for building efficient and scalable APIs, including a fast request routing system, support for async/await, and a low memory footprint. Fastify also provides a number of plugins and extensions for adding new functionality, making it a highly customizable framework.
Pros: Fast and efficient. Low overhead and memory footprint. Supports async/await. Highly customizable.
Cons: It may not have as much community support as other frameworks. It may not be as well-suited for large-scale projects.
Example (JavaScript): const fastify = require('fastify')(); fastify.get('/', async (request, reply) => { reply.send({ hello: 'world' }); }); fastify.listen(3000, (err, address) => { if (err) throw err; console.log(`server listening on ${address}`); });

3. NestJS
NestJS is a modular and scalable framework for building robust and efficient REST APIs. It offers a scalable architecture based on TypeScript, making it a great choice for large-scale projects. NestJS also provides a number of features for building robust APIs, including support for GraphQL, a powerful CLI tool, and an easy-to-use testing framework.
Pros: Scalable and modular architecture. Built with TypeScript. Supports GraphQL. Powerful CLI tool and easy-to-use testing framework.
Cons: It may not be as simple to get started with as other frameworks. TypeScript may not be familiar to all developers.
Example (TypeScript): import { Controller, Get } from '@nestjs/common'; @Controller() export class AppController { @Get() root(): string { return 'Hello World!'; } }

4. Koa
Koa is a minimalist and elegant framework for building REST APIs in Node.js. It provides a lightweight and expressive interface for creating REST APIs. Some of the key features of Koa include:
Pros: Lightweight and expressive. Good for building simple APIs. Middleware support.
Cons: No built-in validation. A smaller community than Express.
Here is a basic example of how to create a REST API using Koa (JavaScript): const Koa = require('koa'); const app = new Koa(); app.use(ctx => { ctx.body = 'Hello World!'; }); app.listen(3000);
5. Hapi
Hapi is a powerful and flexible framework for building scalable and production-ready REST APIs in Node.js. It offers a rich set of features for building APIs and managing the request/response lifecycle. Some of the key features of Hapi include:
Pros: Good for building large-scale APIs. Robust and production-ready. Built-in validation and request parsing. Large plugin ecosystem.
Cons: Steep learning curve for beginners. A smaller community than Express.
Here is a basic example of how to create a REST API using Hapi (JavaScript): const Hapi = require('hapi'); const server = new Hapi.Server(); server.route({ method: 'GET', path: '/', handler: (request, h) => { return 'Hello World!'; } }); async function start() { try { await server.start(); } catch (err) { console.log(err); process.exit(1); } console.log('Server running at:', server.info.uri); }; start();

In conclusion, each of these five Node.js REST API frameworks has its own unique features and strengths. Therefore, developers should choose the framework that best fits their specific needs and requirements. Whether building a simple API or a complex, production-ready API, these frameworks provide a solid foundation for building REST APIs in Node.js.
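To make the "highly modular" point from the Express section above a little more concrete, here is a hedged sketch (not taken from the article) of a small JSON API that combines built-in middleware, a router, and an error handler — the pattern that lets an Express codebase grow beyond a single hello-world handler:

const express = require('express');
const app = express();

// Built-in middleware: parse JSON request bodies.
app.use(express.json());

// A router groups related endpoints and mounts them under one prefix.
const users = express.Router();
const store = []; // in-memory stand-in for a real database

users.get('/', (req, res) => res.json(store));
users.post('/', (req, res) => {
  if (!req.body.name) {
    return res.status(400).json({ error: 'name is required' });
  }
  const user = { id: store.length + 1, name: req.body.name };
  store.push(user);
  res.status(201).json(user);
});

app.use('/api/users', users);

// Error-handling middleware is registered last and takes four arguments.
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).json({ error: 'Internal Server Error' });
});

app.listen(3000, () => console.log('API listening on port 3000'));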

By Farith Jose Heras García
How To Setup Spring Boot With Vue.js Frontend

In this article, you will learn how to setup a Maven multi-module project which consists out of a Spring Boot backend and a Vue.js frontend. The application itself will not be created, only the setup of the project is covered in this article. Enjoy! Introduction Many applications consist out of a backend and a frontend. But how will you organize and structure your project and how will you deploy the backend and frontend? Many options to choose from and there is no one size fits all. You must make decisions that suit your use case. In either case, it is important to keep the backend code and the frontend code separated from each other. This way, it is easier to change things later on. So, which decisions do you need to make? Do you want to package and deploy the backend and frontend simultaneously? Do you need to be able to scale up and down your application from the beginning? Do you want the frontend to be updated separately from the backend? Among many more. There is no right or wrong, you need to choose wisely based on the type of application and non-functional requirements. In this article, you will learn how to setup the project structure for an application which consists out of a Spring Boot backend part and a Quasar frontend part (Quasar is a Vue.js based framework). Both are packaged in the Spring Boot jar-file and deployed as a single unit. This will enable you to get started quite fast and it will leave the options open to separate both when needed. In the latter case, you will need to deploy the frontend part in a web server like NGINX. As the build tool, Maven will be used. Sources used in this blog are available at GitHub. Prerequisites Basic Spring Boot knowledge. Basic Quasar (Vue.js) knowledge or other frontend framework. Basis Linux knowledge. Basic Maven knowledge. Besides that, you need to have Java 17, Node.js, and npm installed. Instructions for installing Node.js can be found in the official documentation. Choose the instructions for your operating system and ensure that you use a LTS version. Below are the instructions when you are using Ubuntu 22.04: Shell $ curl -fsSL https://deb.nodesource.com/setup_lts.x | sudo -E bash - &&\ $ sudo apt-get install -y nodejs After this, verify the installation: Shell $ node -v v18.12.1 Update npm: Shell $ sudo npm install -g npm@9.2.0 Install the Quasar Framework, which will allow you to create responsive web applications. It is like a layer on top of Vue.js: Shell $ sudo npm install -g @quasar/cli Project Overview As written before, you will create a Maven multi-module project, which consists out of a Spring Boot backend application and a Quasar frontend. The project structure is the following: Shell myspringbootvueplanet ├── backend │ └── pom.xml ├── frontend │ └── pom.xml └── pom.xml The main project is called myspringbootvueplanet and has its own pom. It consists out of a module backend and a module frontend, each with their own pom files. In the next sections, this structure will be created and the contents of the directories and pom files will be created. Spring Boot Backend First, you will start with the Spring Boot backend: Navigate to Spring Initializr. Choose Java 17 Choose Maven Add the Spring Web dependency. Download the project and unzip it. Create a directory backend in your main project directory and move the src directory to this new backend directory. 
Next, copy the pom file to the backend directory: Shell $ mkdir backend $ mv src/ backend/src $ cp pom.xml backend/pom.xml The pom in the main project needs to be adapted so that it knows about the backend module. Change the contents as follows. Note: the packaging is changed to pom: XML <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.mydeveloperplanet.myspringbootvueplanet</groupId> <artifactId>myspringbootvueplanet</artifactId> <version>0.0.1-SNAPSHOT</version> <name>MySpringBootVuePlanet</name> <description>Demo project for Spring Boot with Vue frontend</description> <packaging>pom</packaging> <modules> <module>backend</module> </modules> </project> The backend pom needs to be changed also. Change the artifactId into backend. Change the following line: XML <artifactId>myspringbootvueplanet</artifactId> Into: XML <artifactId>backend</artifactId> And remove the name tag. The upper part of the pom is the following: XML <parent> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-parent</artifactId> <version>3.0.1</version> <relativePath/> <!-- lookup parent from repository --> </parent> <groupId>com.mydeveloperplanet.myspringbootvueplanet</groupId> <artifactId>backend</artifactId> <version>0.0.1-SNAPSHOT</version> Verify whether the Maven project builds by executing the following command from the root of the project: Shell $ mvn clean verify Quasar Frontend To create a basic Vue.js frontend application with the Quasar framework, you execute the following command from the root of the repository and you answer the questions according to your needs: Shell $ npm init quasar What would you like to build? › App with Quasar CLI, let's go! Project folder: … frontend Pick Quasar version: › Quasar v2 (Vue 3 | latest and greatest) Pick script type: › Typescript Pick Quasar App CLI variant: › Quasar App CLI with Vite Package name: … frontend Project product name: (must start with letter if building mobile apps) … myspringbootvueplanet Project description: … A demo project for Spring Boot with Vue/Quasar Author: … mydeveloperplanet <mymail@address.com> Pick a Vue component style: › Composition API Pick your CSS preprocessor: › Sass with SCSS syntax Check the features needed for your project: › ESLint Pick an ESLint preset: › Prettier Install project dependencies? (recommended) › Yes, use npm At this moment, a frontend directory has been created in the root of the repository containing the frontend application. Add the following pom to the frontend directory. 
In this pom, you make use of the frontend-maven-plugin, which allows you to build the frontend application by means of Maven: XML <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.mydeveloperplanet.myspringbootvueplanet</groupId> <artifactId>frontend</artifactId> <version>0.0.1-SNAPSHOT</version> <build> <plugins> <plugin> <groupId>com.github.eirslett</groupId> <artifactId>frontend-maven-plugin</artifactId> <version>1.12.1</version> <executions> <execution> <id>install node and npm</id> <goals> <goal>install-node-and-npm</goal> </goals> <configuration> <nodeVersion>v18.12.1</nodeVersion> </configuration> </execution> <execution> <id>npm install</id> <goals> <goal>npm</goal> </goals> <configuration> <arguments>install</arguments> </configuration> </execution> <execution> <id>npm install @quasar/cli -g</id> <goals> <goal>npm</goal> </goals> <configuration> <arguments>install @quasar/cli -g</arguments> </configuration> </execution> <execution> <id>npx quasar build</id> <goals> <goal>npx</goal> </goals> <configuration> <arguments>quasar build</arguments> </configuration> </execution> </executions> </plugin> </plugins> </build> </project> Now, add the frontend module to the pom in the root of the repository: XML <modules> <module>backend</module> <module>frontend</module> </modules> Verify whether the project builds and execute the following command from the root of the repository: Shell $ mvn clean verify Combine Backend and Frontend Now it is time to package the frontend distribution files with the backend application. To do so, you add the following to the build-plugins section of the backend pom: XML <build> <plugins> <plugin> <artifactId>maven-resources-plugin</artifactId> <executions> <execution> <id>copy frontend content</id> <phase>generate-resources</phase> <goals> <goal>copy-resources</goal> </goals> <configuration> <outputDirectory>target/classes/static</outputDirectory> <overwrite>true</overwrite> <resources> <resource> <directory>../frontend/dist/spa</directory> </resource> </resources> </configuration> </execution> </executions> </plugin> </plugins> </build> The frontend build outputs the distributable files into directory <root repo>/frontend/dist/spa. The maven-resouces-plugin will copy those resources and will add them to the <root repo>/target/classes/static directory. The Spring Boot application will serve those pages onto http://localhost:8080/. By default, Spring will serve static content from a number of directories. Beware: after this change, you will not be able to run the module builds in parallel anymore because the backend build is dependent of the frontend build. When you are using the Maven daemon to run your builds, you have to add the flag -T1 in order to force it to build sequentially. Build the application again: Shell $ mvn clean verify Start the application from the root of the repository by executing the following command: Shell $ java -jar backend/target/backend-0.0.1-SNAPSHOT.jar Navigate in the browser to http://localhost:8080/ and the Quasar Framework start page is shown: Conclusion In this article, you learned how to create the project structure for a basic Spring Boot backend with a Quasar frontend application. 
Deployment is easy because all distributable files are part of the Spring Boot jar-file and can be started by means of a single command.

By Gunter Rotsaert
Angular Drag’n Drop With Query Components and Form Validation

The AngularPortfolioMgr project can import the SEC filings of listed companies. The importer class is the FileClientBean and imports the JSON archive from “Kaggle.” The data is provided by year, symbol, and period. Each JSON data set has keys (called concepts) and values with the USD value. For example, IBM’s full-year revenue in 2020 was $13,456. This makes two kinds of searches possible. A search for company data and a search for keys (concepts) over all entries. The components below “Company Query” select the company value year with operators like “=,” “>=,” and “<=” (values less than 1800 are ignored). The symbol search is implemented with an angular autocomplete component that queries the backend for matching symbols. The quarters are in a select component of the available periods. The components below “Available Sec Query Items” provide the Drag’n Drop component container with the items that can be dragged down into the query container. “Term Start” is a mathematical term that means “bracket open” as a logical operator. The term “end” comes from mathematics and refers to a closed bracket. The query item is a query clause of the key (concept). The components below “Sec Query Items” are the search terms in the query. The query components contain the query parameters for the concept and value with their operators for the query term. The terms are created with the bracket open/close wrapper to prefix collections of queries with “and,” and “or,” or “or not,” and “not or” operators. The query parameters and the term structure are checked with a reactive Angular form that enables the search button if they are valid. Creating the Form and the Company Query The create-query.ts class contains the setup for the query: TypeScript @Component({ selector: "app-create-query", templateUrl: "./create-query.component.html", styleUrls: ["./create-query.component.scss"], }) export class CreateQueryComponent implements OnInit, OnDestroy { private subscriptions: Subscription[] = []; private readonly availableInit: MyItem[] = [ ... ]; protected readonly availableItemParams = { ... } as ItemParams; protected readonly queryItemParams = { ... } as ItemParams; protected availableItems: MyItem[] = []; protected queryItems: MyItem[] = [ ... 
]; protected queryForm: FormGroup; protected yearOperators: string[] = []; protected quarterQueryItems: string[] = []; protected symbols: Symbol[] = []; protected FormFields = FormFields; protected formStatus = ''; @Output() symbolFinancials = new EventEmitter<SymbolFinancials[]>(); @Output() financialElements = new EventEmitter<FinancialElementExt[]>(); @Output() showSpinner = new EventEmitter<boolean>(); constructor( private fb: FormBuilder, private symbolService: SymbolService, private configService: ConfigService, private financialDataService: FinancialDataService ) { this.queryForm = fb.group( { [FormFields.YearOperator]: "", [FormFields.Year]: [0, Validators.pattern("^\\d*$")], [FormFields.Symbol]: "", [FormFields.Quarter]: [""], [FormFields.QueryItems]: fb.array([]), } , { validators: [this.validateItemTypes()] } ); this.queryItemParams.formArray = this.queryForm.controls[ FormFields.QueryItems ] as FormArray; //delay(0) fixes "NG0100: Expression has changed after it was checked" exception this.queryForm.statusChanges.pipe(delay(0)).subscribe(result => this.formStatus = result); } ngOnInit(): void { this.symbolFinancials.emit([]); this.financialElements.emit([]); this.availableInit.forEach((myItem) => this.availableItems.push(myItem)); this.subscriptions.push( this.queryForm.controls[FormFields.Symbol].valueChanges .pipe( debounceTime(200), switchMap((myValue) => this.symbolService.getSymbolBySymbol(myValue)) ) .subscribe((myValue) => (this.symbols = myValue)) ); this.subscriptions.push( this.configService.getNumberOperators().subscribe((values) => { this.yearOperators = values; this.queryForm.controls[FormFields.YearOperator].patchValue( values.filter((myValue) => myValue === "=")[0] ); }) ); this.subscriptions.push( this.financialDataService .getQuarters() .subscribe( (values) => (this.quarterQueryItems = values.map((myValue) => myValue.quarter)) ) ); } First, there are the arrays for the RxJs subscriptions and the available and query items for Drag’n Drop. The *ItemParams contain the default parameters for the items. The yearOperators and the quarterQueryItems contain the drop-down values. The “symbols” array is updated with values when the user types in characters (in the symbol) autocomplete. The FormFields are an enum with key strings for the local form group. The @Output() EventEmitter provides the search results and activate or deactivate the spinner. The constructor gets the needed services and the FormBuilder injected and then creates the FormGroup with the FormControls and the FormFields. The QueryItems FormArray supports the nested forms in the components of the queryItems array. The validateItemTypes() validator for the term structure validation is added, and the initial parameter is added. At the end, the form status changes are subscribed with delay(0) to update the formStatus property. The ngOnInit() method initializes the available items for Drag’n Drop. The value changes of the symbol autocomplete are subscribed to request the matching symbols from the backend and update the “symbols” property. The numberOperators and the “quarters” are requested off the backend to update the arrays with the selectable values. They are requested off the backend because that enables the backend to add new operators or new periods without changing the frontend. 
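The validateItemTypes() implementation itself is not listed in the article. As a rough idea of what such a term-structure check can look like, here is a hedged sketch of a cross-field validator that only verifies that every "Term Start" is closed by a matching "Term End" (FormFields, QueryFormFields, and ItemType are the application enums used above; the error key unbalancedTerms is made up, and in the real component this would be a private method rather than a standalone function):

TypeScript
import { AbstractControl, FormArray, ValidationErrors, ValidatorFn } from "@angular/forms";

// Hedged sketch of a term-structure validator: walk the QueryItems FormArray and
// count opened/closed terms; the form is invalid if the brackets do not balance.
function validateItemTypes(): ValidatorFn {
  return (group: AbstractControl): ValidationErrors | null => {
    const queryItems = group.get(FormFields.QueryItems) as FormArray;
    let openTerms = 0;
    for (const item of queryItems.controls) {
      const itemType = item.get(QueryFormFields.ItemType)?.value;
      if (itemType === ItemType.TermStart) {
        openTerms += 1;
      } else if (itemType === ItemType.TermEnd) {
        openTerms -= 1;
        // A "Term End" without a preceding "Term Start" is invalid.
        if (openTerms < 0) {
          return { unbalancedTerms: true };
        }
      }
    }
    // Every opened bracket must be closed again.
    return openTerms === 0 ? null : { unbalancedTerms: true };
  };
}

Because the validator is registered on the whole queryForm, Angular re-runs it whenever a sub-formgroup is inserted, removed, or changed, which keeps the enabled/disabled state of the search button in sync with the term structure.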
The template looks like this: HTML <div class="container"> <form [formGroup]="queryForm" novalidate> <div> <div class="search-header"> <h2 i18n="@@createQueryCompanyQuery">Company Query</h2> <button mat-raised-button color="primary" [disabled]="!formStatus || formStatus.toLowerCase() != 'valid'" (click)="search()" i18n="@@search" > Search </button> </div> <div class="symbol-financials-container"> <mat-form-field> <mat-label i18n="@@operator">Operator</mat-label> <mat-select [formControlName]="FormFields.YearOperator" name="YearOperator" > <mat-option *ngFor="let item of yearOperators" [value]="item">{{ item }</mat-option> </mat-select> </mat-form-field> <mat-form-field class="form-field"> <mat-label i18n="@@year">Year</mat-label> <input matInput type="text" formControlName="{{ FormFields.Year }" /> </mat-form-field> </div> <div class="symbol-financials-container"> <mat-form-field class="form-field"> <mat-label i18n="@@createQuerySymbol">Symbol</mat-label> <input matInput type="text" [matAutocomplete]="autoSymbol" formControlName="{{ FormFields.Symbol }" i18n-placeholder="@@phSymbol" placeholder="symbol" /> <mat-autocomplete #autoSymbol="matAutocomplete" autoActiveFirstOption> <mat-option *ngFor="let symbol of symbols" [value]="symbol.symbol"> {{ symbol.symbol } </mat-option> </mat-autocomplete> </mat-form-field> <mat-form-field class="form-field"> <mat-label i18n="@@quarter">Quarter</mat-label> <mat-select [formControlName]="FormFields.Quarter" name="Quarter" multiple > <mat-option *ngFor="let item of quarterQueryItems" [value]="item">{{ item }</mat-option> </mat-select> </mat-form-field> </div> </div> ... </div> First, the form gets connected to the formgroup queryForm of the component. Then the search button gets created and is disabled if the component property formStatus, which is updated by the formgroup, is not “valid.” Next, the two <mat-form-field> are created for the selection of the year operator and the year. The options for the operator are provided by the yearOperators property. The input for the year is of type “text” but the reactive form has a regex validator that accepts only decimals. Then, the symbol autocomplete is created, where the “symbols” property provides the returned options. The #autoSymbol template variable connects the input matAutocomplete property with the options. The quarter select component gets its values from the quarterQueryItems property and supports multiple selection of the checkboxes. Drag’n Drop Structure The template of the cdkDropListGroup looks like this: HTML <div cdkDropListGroup> <div class="query-container"> <h2 i18n="@@createQueryAvailableSecQueryItems"> Available Sec Query Items </h2> <h3 i18n="@@createQueryAddQueryItems"> To add a Query Item. Drag it down. </h3> <div cdkDropList [cdkDropListData]="availableItems" class="query-list" (cdkDropListDropped)="drop($event)"> <app-query *ngFor="let item of availableItems" cdkDrag [queryItemType]="item.queryItemType" [baseFormArray]="availableItemParams.formArray" [formArrayIndex]="availableItemParams.formArrayIndex" [showType]="availableItemParams.showType"></app-query> </div> </div> <div class="query-container"> <h2 i18n="@@createQuerySecQueryItems">Sec Query Items</h2> <h3 i18n="@@createQueryRemoveQueryItems"> To remove a Query Item. Drag it up. 
</h3> <div cdkDropList [cdkDropListData]="queryItems" class="query-list" (cdkDropListDropped)="drop($event)"> <app-query class="query-item" *ngFor="let item of queryItems; let i = index" cdkDrag [queryItemType]="item.queryItemType" [baseFormArray]="queryItemParams.formArray" [formArrayIndex]="i" (removeItem)="removeItem($event)" [showType]="queryItemParams.showType" ></app-query> </div> </div> </div> The cdkDropListGroup div contains the two cdkDropList divs. The items can be dragged and dropped between the droplists availableItems and queryItems and, on dropping, the method drop($event) is called. The droplist divs contain <app-query> components. The search functions of “term start,” “term end,” and “query item type” are provided by angular components. The baseFormarray is a reference to the parent formgroup array, and formArrayIndex is the index where you insert the new subformgroup. The removeItem event emitter provides the query component index that needs to be removed to the removeItem($event) method. If the component is in the queryItems array, the showType attribute turns on the search elements of the components (querItemdParams default configuration). The drop(...) method manages the item transfer between the cdkDropList divs: TypeScript drop(event: CdkDragDrop<MyItem[]>) { if (event.previousContainer === event.container) { moveItemInArray( event.container.data, event.previousIndex, event.currentIndex ); const myFormArrayItem = this.queryForm[ FormFields.QueryItems ].value.splice(event.previousIndex, 1)[0]; this.queryForm[FormFields.QueryItems].value.splice( event.currentIndex, 0, myFormArrayItem ); } else { transferArrayItem( event.previousContainer.data, event.container.data, event.previousIndex, event.currentIndex ); //console.log(event.container.data === this.todo); while (this.availableItems.length > 0) { this.availableItems.pop(); } this.availableInit.forEach((myItem) => this.availableItems.push(myItem)); } } First, the method checks if the event.container has been moved inside the container. That is handled by the Angular Components function moveItemInArray(...) and the fromgrouparray entries are updated. A transfer between cdkDropList divs is managed by the Angular Components function transferArrayItem(...). The availableItems are always reset to their initial content and show one item of each queryItemType. The adding and removing of subformgroups from the formgroup array is managed in the query component. Query Component The template of the query component contains the <mat-form-fields> for the queryItemType. They are implemented in the same manner as the create-query template. 
The component looks like this: TypeScript @Component({ selector: "app-query", templateUrl: "./query.component.html", styleUrls: ["./query.component.scss"], }) export class QueryComponent implements OnInit, OnDestroy { protected readonly containsOperator = "*=*"; @Input() public baseFormArray: FormArray; @Input() public formArrayIndex: number; @Input() public queryItemType: ItemType; @Output() public removeItem = new EventEmitter<number>(); private _showType: boolean; protected termQueryItems: string[] = []; protected stringQueryItems: string[] = []; protected numberQueryItems: string[] = []; protected concepts: FeConcept[] = []; protected QueryFormFields = QueryFormFields; protected itemFormGroup: FormGroup; protected ItemType = ItemType; private subscriptions: Subscription[] = []; constructor( private fb: FormBuilder, private configService: ConfigService, private financialDataService: FinancialDataService ) { this.itemFormGroup = fb.group( { [QueryFormFields.QueryOperator]: "", [QueryFormFields.ConceptOperator]: "", [QueryFormFields.Concept]: ["", [Validators.required]], [QueryFormFields.NumberOperator]: "", [QueryFormFields.NumberValue]: [ 0, [ Validators.required, Validators.pattern("^[+-]?(\\d+[\\,\\.])*\\d+$"), ], ], [QueryFormFields.ItemType]: ItemType.Query, } ); } This is the QueryComponent with the baseFormArray of the parent to add the itemFormGroup at the formArrayIndex. The queryItemType switches the query elements on or off. The removeItem event emitter provides the index of the component to remove from the parent component. The termQueryItems, stringQueryItems, and numberQueryItems are the select options of their components. The feConcepts are the autocomplete options for the concept. The constructor gets the FromBuilder and the needed services injected. The itemFormGroup of the component is created with the formbuilder. The QueryFormFields.Concept and the QueryFormFields.NumberValue get their validators. Query Component Init The component initialization looks like this: TypeScript ngOnInit(): void { this.subscriptions.push( this.itemFormGroup.controls[QueryFormFields.Concept].valueChanges .pipe(debounceTime(200)) .subscribe((myValue) => this.financialDataService .getConcepts() .subscribe( (myConceptList) => (this.concepts = myConceptList.filter((myConcept) => FinancialsDataUtils.compareStrings( myConcept.concept, myValue, this.itemFormGroup.controls[QueryFormFields.ConceptOperator] .value ) )) ) ) ); this.itemFormGroup.controls[QueryFormFields.ItemType].patchValue( this.queryItemType ); if ( this.queryItemType === ItemType.TermStart || this.queryItemType === ItemType.TermEnd ) { this.itemFormGroup.controls[QueryFormFields.ConceptOperator].patchValue( this.containsOperator ); ... } //make service caching work if (this.formArrayIndex === 0) { this.getOperators(0); } else { this.getOperators(400); } } private getOperators(delayMillis: number): void { setTimeout(() => { ... this.subscriptions.push( this.configService.getStringOperators().subscribe((values) => { this.stringQueryItems = values; this.itemFormGroup.controls[ QueryFormFields.ConceptOperator ].patchValue( values.filter((myValue) => this.containsOperator === myValue)[0] ); }) ); ... }, delayMillis); } First, the QueryFormFields.Concept form control value changes are subscribed to request (with a debounce) the matching concepts from the backend service. The results are filtered with compareStrings(...) and QueryFormFields.ConceptOperator (default is “contains”). 
Then, it is checked if the queryItemType is TermStart or TermEnd to set default values in their form controls. Then, the getOperators(...) method is called to get the operator values of the backend service. The backend services cache the values of the operators to load them only once, and use the cache after that. The first array entry requests the values from the backend, and the other entries wait for 400 ms to wait for the responses and use the cache. The getOperators(...) method uses setTimeout(...) for the requested delay. Then, the configService method getStringOperators() is called and the subscription is pushed onto the “subscriptions” array. The results are put in the stringQueryItems property for the select options. The result value that matches the containsOperator constant is patched into the operator value of the formcontrol as the default value. All operator values are requested concurrently. Query Component Type Switch If the component is dropped in a new droplist, the form array entry needs an update. That is done in the showType(…) setter: TypeScript @Input() set showType(showType: boolean) { this._showType = showType; if (!this.showType) { const formIndex = this?.baseFormArray?.controls?.findIndex( (myControl) => myControl === this.itemFormGroup ) || -1; if (formIndex >= 0) { this.baseFormArray.insert(this.formArrayIndex, this.itemFormGroup); } } else { const formIndex = this?.baseFormArray?.controls?.findIndex( (myControl) => myControl === this.itemFormGroup ) || -1; if (formIndex >= 0) { this.baseFormArray.removeAt(formIndex); } } } If the item has been added to the queryItems, the showType(…) setter sets the property and adds the itemFormGroup to the baseFormArray. The setter removes the itemFormGroup from the baseFormArray if the item has been removed from the querItems. Creating Search Request To create a search request, the search() method is used: TypeScript public search(): void { //console.log(this.queryForm.controls[FormFields.QueryItems].value); const symbolFinancialsParams = { yearFilter: { operation: this.queryForm.controls[FormFields.YearOperator].value, value: !this.queryForm.controls[FormFields.Year].value ? 0 : parseInt(this.queryForm.controls[FormFields.Year].value), } as FilterNumber, quarters: !this.queryForm.controls[FormFields.Quarter].value ? [] : this.queryForm.controls[FormFields.Quarter].value, symbol: this.queryForm.controls[FormFields.Symbol].value, financialElementParams: !!this.queryForm.controls[FormFields.QueryItems] ?.value?.length ? this.queryForm.controls[FormFields.QueryItems].value.map( (myFormGroup) => this.createFinancialElementParam(myFormGroup) ) : [], } as SymbolFinancialsQueryParams; //console.log(symbolFinancials); this.showSpinner.emit(true); this.financialDataService .postSymbolFinancialsParam(symbolFinancialsParams) .subscribe((result) => { this.processQueryResult(result, symbolFinancialsParams); this.showSpinner.emit(false); }); } private createFinancialElementParam( formGroup: FormGroup ): FinancialElementParams { //console.log(formGroup); return { conceptFilter: { operation: formGroup[QueryFormFields.ConceptOperator], value: formGroup[QueryFormFields.Concept], }, valueFilter: { operation: formGroup[QueryFormFields.NumberOperator], value: formGroup[QueryFormFields.NumberValue], }, operation: formGroup[QueryFormFields.QueryOperator], termType: formGroup[QueryFormFields.ItemType], } as FinancialElementParams; } The symbolFinancialsParams object is created from the values of the queryForm formgroup or the default value is set. 
The FormFields.QueryItems FormArray is mapped with the createFinancialElementParam(...) method. The createFinancialElementParam(...) method creates conceptFilter and valueFilter objects with their operations and values for filtering. The termOperation and termType are set in the symbolFinancialsParams object, too. Then, the finanicalDataService.postSymbolFinancialsParam(...) method posts the object to the server and subscribes to the result. During the latency of the request, the spinner of the parent component is shown. Conclusion The Angular Components library support for Drag’n Drop is very good. That makes the implementation much easier. The reactive forms of Angular enable flexible form checking that includes subcomponents with their own FormGroups. The custom validation functions allow the logical structure of the terms to be checked. Due to the features of the Angular framework and the Angular Components Library, the implementation needed surprisingly little code.

By Sven Loesekann
Cypress vs. Puppeteer: A Detailed Comparison

The wide range of tools on the market can leave you wondering which one is appropriate for testing your web application. Testing matters because it ensures the application works as the user expects and delivers a high-quality user experience. End-to-end testing is an approach designed to verify application functionality by automating the browser to run through the scenarios a real end user would perform. Cypress and Puppeteer are two commonly used tools for this, and their detailed comparison is the main focus of this blog. The use of Cypress has grown in recent years for web automation testing, addressing issues faced by modern web applications, and Puppeteer is now also widely adopted for web automation. This has triggered the Cypress vs. Puppeteer debate, so a solid understanding of both tools and a detailed comparison is valuable. Let's get started with an overview of Cypress and Puppeteer. What Is Cypress? Cypress is an open-source, JavaScript-based automation testing tool, mainly used for modern web automation. This front-end testing framework lets us write test cases in the de facto language of the web. It supports unit tests and integration tests and offers conveniences like easy reporting and test configuration. It also supports the Mocha test framework. Cypress works differently from other testing tools: test scripts mainly run inside the browser, in the same event loop as your application, and when something needs to execute outside the browser, Cypress leverages its Node.js server process to support it. Features of Cypress Some of the notable features of Cypress are as follows: · It takes snapshots while the tests are running. · It allows real-time, fast debugging using tools like Developer Tools. · It has automatic waiting, so you do not have to add waits or sleeps to your tests. · You can verify and control the behavior of functions, timers, and server responses. · It lets you control, test, and stub edge cases without involving the server. What Is Puppeteer? Puppeteer is an open-source Node.js library used for automation testing and web scraping. It provides a high-level API to control Chrome and Chromium, which run headless by default. Puppeteer is easy for testers to pick up because it is based on the DevTools Protocol, the same protocol used by the Chrome Developer Tools, so familiarity with those tools helps you get up and running with Puppeteer quickly. Cypress vs. Puppeteer The comparison between Cypress and Puppeteer below is based on the aspects that give you the clearest picture. Brief Puppeteer is a tool developed by Google for automating Chrome using the DevTools protocol, while Cypress, developed by Cypress.io, is an open-source test runner. The main difference lies in what they do: Puppeteer is a Node.js library that enables browser automation, whereas Cypress is a full test automation framework supporting end-to-end testing, integration testing, and unit testing.
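To make the difference concrete, a minimal Puppeteer script is just a Node.js program that drives the browser directly, outside of any test runner. The sketch below is hedged: the URL and selectors are placeholders, and the Cypress equivalent of the same flow would instead be written as cy.visit() and cy.get(...).click() commands inside an it() block of a test file:

TypeScript
import puppeteer from "puppeteer";

// Hedged sketch: automate a page directly with Puppeteer (no test framework involved).
(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto("https://example.com/login");   // placeholder URL
  await page.type("#name", "Jane");               // type into the input with id "name"
  await page.click("button[type=submit]");        // submit the form
  console.log(await page.title());                // read something back from the page
  await browser.close();
})();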
It could better be understood that Puppeteer is not a framework but just a chromium version of the node version, which provides browser automation for Chrome and Chromium. The running is executed headless by default which can be further configured to run full Chromium or Chrome. In addition, Puppeteer is such a tool that provides a high level of API for controlling Chrome and Chromium over the DevTool protocol. Relating this to Cypress, it is mainly a front-end testing tool that is built for the modern web. Lastly, Puppeteer is free to use, whereas Cypress comes with both free and paid versions. Language With respect to the testing in the programming language, both Cypress and Puppeteer are based on JavaScript language. This gives you an easy option to work on both tools. Types of Testing Comparing the testing done by Cypress and Puppeteer, Cypress gives you wider options. For example, if you are looking the testing an entire application, Puppeteer cannot be the best option. It is basically great for web scrapping and crawling SPA. However, Cypress is a tool through which you can do End-to-end tests, Unit tests, and integration tests, and it can test anything that runs in a browser. Uses Puppeteer is mainly used for automating UI testing, mouse and keyboard movement, and others. It basically tests the application developed in Angularjs and Angular. Like Cypress, it is not considered an automation tool but rather manages the internal aspects of the chromium browser. It is a development tool that is able to perform tasks by developers, like locating elements and handling requests and responses. Architecture Cypress and Puppeteer differ in their architecture. Generally, most of the testing tools work by running outside of the browser, which is executed remote commands across the network. Cypress-testing tools operate inside the browsers which execute the test codes. It allows Cypress to listen and verify the browser performance at run time by modifying DOM and altering network requests and responses on the fly. It does not require any of the driver binaries. It runs on a NodeJS server which associates with the test runner manipulated by Cypress to operate the application and test code which is another iframe in a similar event loop. The supported browser of Cypress includes Canary, Chromium, Microsoft Edge, Mozilla Firefox browsers, and electron. Relating to the Puppeteer architecture follows the DevTools protocol as mentioned above. It manages the Chromium and chrome browser with the aid of high-quality API given by the Node library. The browser platform executes the action on the browser engine with and without headless mode. Followed to this, all the test execution is done in Chromium which is a real place. Other browsers, like Microsoft edge, make use of Chromium as a browser engine. It is regarded as the package which is based on a node module and hence known as Nodejs level. With the use of JavaScript, the development of automation code is done by the end user. Testing Speed Comparing the testing speed of the Puppeteer and Cypress, Puppeteer is regarded to be much faster than Cypress. In using Cypress, the test scripts are executed in the browser, where you need to click on a particular button. This will not send the command to involve a specific driver to the browser but rather utilizes DOM events to send the click command to the button. However, Puppeteer has great control over the browser due to high-level API control over Chrome and Chromium. 
Further, it works with minimal settings, eliminates extras, and uses less space than Cypress, making them consume less memory and start faster. Cypress is slower when executing the run test in a larger application. The main reason is that it tends to take snips of the application state at a different point in time of the tests, which makes it take more time. However, such cases are not evident in the Puppeteer, which makes it faster than Cypress. Reliability and Flexibility Relating to the testing of the web application, Cypress can be more user-friendly and reliable in doing JavaScript framework for performing end-to-end testing as compared with Puppeteer. It is because Puppeteer is not a framework but just a chromium version of the node module. Nevertheless, Puppeteer can be a great option for quick testing; however, when we want to test the entire performance and functionality of application testing, it is better suggested to use a stronger tool like Cypress. The main reason is that Cypress has its individual assertion, but Puppeteer does not, and rather it is based on Mocha, Jasmine, or Jest frameworks. Further, Cypress has its individual IDE, and Puppeteer is dependent on the VS Code and Webstorm. In a nutshell, Puppeteer only supports Chromium engine-based browsers, whereas Cypress supports many different browsers, thus, making it more reliable and flexible. Testing Code Execution on the Client Side, Like the Web Browser Puppeteer and Cypress have aspects of the client side where they allow testing code execution on the client-like web browser. In Puppeteer, manual operation can be done in the browser, and it is easy to create a testing environment for the test to run directly. You have the option to test the front-end function and UI testing with the use of Puppeteer. Further, Cypress aims to test like anything that could be run in the browser and executed to build a high user experience. It tests the flow of the application from start to end according to the view of the user. It also works equally well on older servers for the pages and applications. Testing Behavior on the Server Side The major difference between Puppeteer and Cypress is related to the allowance of the testing behavior of the server-side code, whereas Puppeteer does not have such aspects. However, Cypress has the ability to test the back-end behavior, say, for example, with the use of cy.task() command. It gives the way to run the Node code. Through this, users can take actions crucial for the tests beyond the scope of Cypress. Test Recording Cypress comes with dashboards where you can be able to see the recorded tests and provide details on the events which happen during the execution. However, Puppeteer does not have such a dashboard, making it unable to record the test. Hence, transparency in the execution of the test is not maintained in Puppeteer. Fixtures Fixtures are the specific and fixed states of data that are test locals. This helps confirms a particular environment for a single test. Comparing the two, Cypress has the inbuilt fixtures abilities. With the use of the command cy.fixture(filePath), you can easily load a fixed set of data that is located in a file. However, Puppeteer does not have any such fixtures. Group Fixtures Group fixtures let to define particular and fixed states of data for a group of tests which helps ensures the environment for a given group of tests. For this also, Puppeteer does not have any such fixtures. 
At the same time, Cypress has the ability to create group fixtures with the use of the cy.fixture command. Conclusion The blog has presented detailed comparisons of the Puppeteer and Cypress, which gives you enough information for you to decide which tools will be best according to your test requirement. LambdaTest is a cloud-based automation testing platform with an online Cypress automation tool that writes simple Cypress automation testing and sees its actions. Using LambdaTest, you can also test your Puppeteer test scripts online. Both Cypress and Puppeteer come with their own advantages and limitations. It would be best if you decided which suits best to your tests.

By Nazneen Ahmad
Data Binding in React: The Easy Way

In this article, we will explore what data binding is and why it is important in React. Data binding is a process of connecting data between a component and its UI representation. It means that any changes made to the data will automatically reflect in the UI, and vice versa. This is crucial in modern web development because it helps to keep the data and UI in sync, making it easier to create dynamic and responsive applications. In React, data binding is achieved through state and props. The state is the internal data of a component that can change over time, while props are external data that are passed down to a component from its parent. When the state or props change, React automatically re-renders the component, updating the UI to reflect the new data. One-Way Data Binding It is binding data from the component state to the UI. This means that any changes made to the component state will automatically reflect in the UI, but not vice versa. In React, one-way data binding is achieved using JSX. The data from the component state can be accessed using curly braces within the JSX code, and displayed in the UI: JavaScript class ItemList extends React.Component { state = { items: ['item 1', 'item 2', 'item 3'] } render() { return ( <ul> {this.state.items.map((item, index) => { return <li key={index}>{item}</li> })} </ul> ) } } In this example, the ItemList component has a state that contains an array of items. The render method maps over the items in the state and displays each one in a list item within a ul element. Two-Way Data Binding It is binding data from both the component state and the UI. This means that changes made to either the component state or the UI will automatically reflect in the other. In React, two-way data binding is achieved using the onChange event on form elements, such as input, select, and textarea. The onChange event allows the component to update the state with the current value of the form element: JavaScript class ItemForm extends React.Component { state = { newItem: '' } handleChange = (event) => { this.setState({ newItem: event.target.value }) } handleSubmit = (event) => { event.preventDefault() this.props.addItem(this.state.newItem) this.setState({ newItem: '' }) } render() { return ( <form onSubmit={this.handleSubmit}> <input type="text" value={this.state.newItem} onChange={this.handleChange} /> <button type="submit">Add Item</button> </form> ) } } Here: The ItemForm component has a state that contains the current value of the form input. The handleChange method updates the state with the current value of the form input every time it changes. The handleSubmit method is called when the form is submitted, adding the new item to the list and clearing the form input. useRef for Data Binding useRef is a hook that allows you to access the value of a DOM element or a React component instance. The useRef hook returns an object with a current property, which can be used to store values that persist across render cycles. One way to use useRef for data binding is to store the value of an input form in the current property of a ref. 
This allows you to directly bind data between the form and the component state without using an event handler: JavaScript function InputForm() { const inputRef = useRef(null) const [value, setValue] = useState('') const handleSubmit = (event) => { event.preventDefault() setValue(inputRef.current.value) } return ( <form onSubmit={handleSubmit}> <input type="text" ref={inputRef} /> <button type="submit">Submit</button> <p>{value}</p> </form> ) } Here, the useRef hook is used to create a reference to the input element in the form. The handleSubmit method is called when the form is submitted, updating the component state with the value of the input. The component state is then displayed in a p element. useReducer for Data Binding useReducer is a hook that allows you to manage complex state transitions in your components. The useReducer hook takes in a reducer function and an initial state, and returns an array with the current state and a dispatch function that can be used to update the state. One way to use useReducer for data binding is to manage the state of a shopping cart. The reducer function can be used to update the state based on the current action, and the dispatch function can be used to trigger the update: JavaScript function shoppingCartReducer(state, action) { switch (action.type) { case 'ADD_ITEM': return [...state, action.item] case 'REMOVE_ITEM': return state.filter((item, index) => index !== action.index) default: return state } } function ShoppingCart() { const [cart, dispatch] = useReducer(shoppingCartReducer, []) const addItem = (item) => { dispatch({ type: 'ADD_ITEM', item }) } const removeItem = (index) => { dispatch({ type: 'REMOVE_ITEM', index }) } return ( <div> <button onClick={() => addItem('item 1')}>Add Item 1</button> <button onClick={() => addItem('item 2')}>Add Item 2</button> <ul> {cart.map((item, index) => { return ( <li key={index}> {item} <button onClick={() => removeItem(index)}>Remove</button> </li> ) })} </ul> </div> ) } Here, the useReducer hook is used to manage the state of the shopping cart. The reducer function, shoppingCartReducer, takes in the current state and an action, and returns the updated state based on the type of the action. The dispatch function is used to trigger the update by passing in an action object. The component contains two buttons to add items to the cart, and a list of items in the cart that can be removed. What Is React Lifecycle Method? The react lifecycle method refers to the sequence of events that happen in a React component, from its creation to its destruction. It is necessary to know the life cycle methods in React as they play a crucial role in managing the data binding and ensuring the smooth flow of data between the component state and the UI. The most common life cycle methods in React are: componentDidMount: This method is called after the component is rendered on the screen. It is an ideal place to make API calls, set up event listeners, or perform any other actions that require the component to be fully rendered. shouldComponentUpdate: This method is called before a render is triggered. It allows you to control when a component should re-render. By default, this method returns true, meaning the component will re-render whenever there is a change in state or props. However, if you want to optimize performance, you can use this method to prevent unnecessary re-renders. componentDidUpdate: This method is called after a component has been updated. 
You can use it to perform any additional actions that need to be taken after a render. componentWillUnmount: This method is called just before a component is removed from the DOM. You can use it to perform any clean-up actions that need to be taken when a component is no longer needed. In terms of data binding, the life cycle methods can play a crucial role in ensuring the component state is correctly linked to the UI. For example, you might want to update the component state in response to an event, such as a button click. In this case, you would use the componentDidUpdate method to check for changes in the state and trigger a re-render if necessary. Conclusion Overall, understanding data binding and the React life cycle is essential for building dynamic and efficient applications. If you are interested in learning more about React, there are many resources available online, including the official documentation on React’s website, tutorials, and online courses.

By Rahul
How to Format a Number as Currency in JavaScript

Every country has its own currency and its own conventions for displaying monetary amounts. When a number is expressed appropriately, it is easier for readers to read and comprehend. When you use data from an API or an external resource, it will arrive in some generic format. For instance, if you are building a store, you may have data such as pricing. This article will walk you through how to format a number as currency in JavaScript. Let's dive in! We will be using a random number, such as 17225, as shown in the array below: JavaScript const Journals = [ { "id": 1, "name": "Software Development", "price": 100.80, }, { "id": 2, "name": "Introduction to Programming", "price": 1534, }, { "id": 4, "name": "Program or Be Programmed", "price": 17225, } ] Even adding a currency sign does not solve the problem, since commas and decimals must be added in the right locations. You would also like each price to be formatted correctly depending on the currency. For example, 17225 would be $17,225.00 (US dollars), ₹17,225.00 (rupees), or 17.225,00 € (euros), depending on your chosen currency, locale, and style. You can use JavaScript's Intl.NumberFormat() function to convert these numbers to currencies. JavaScript const price = 17225; let KenyaShilling = new Intl.NumberFormat('en-KE', { style: 'currency', currency: 'KES', }); console.log(`The formatted version of ${price} is ${KenyaShilling.format(price)}`); // The formatted version of 17225 is Ksh 17,225.00 Output: Ksh 17,225.00 How to Format Numbers as Currency Using the Intl.NumberFormat() Constructor You can build Intl.NumberFormat objects that enable language-sensitive number formatting, such as currency formatting, using the Intl.NumberFormat() constructor. The constructor takes two arguments, locales and options, which are both optional. new Intl.NumberFormat(locales, options) // we can also use Intl.NumberFormat(locales, options) Remember that Intl.NumberFormat() can be used either with or without "new." Both will create a new Intl.NumberFormat instance. When no locales or options are given, the Intl.NumberFormat() constructor simply formats the number for the default locale — for example, by adding commas. const price = 17225; console.log(new Intl.NumberFormat().format(price)); Output: 17,225 As noted above, you are not looking for plain number formatting. Instead, you want to present these numbers as currency so that the currency sign and suitable formatting are returned without having to write them manually. Let's now explore both parameters. The First Argument: Locales The locales argument is an optional string parameter. It denotes a particular geographical, political, or cultural territory. It only formats the number according to the locale and does not include currency formatting. const price = 172250; console.log(new Intl.NumberFormat('en-US').format(price)); // 172,250 console.log(new Intl.NumberFormat('en-IN').format(price)); // 1,72,250 console.log(new Intl.NumberFormat('en-DE').format(price)); // 172.250 You will see that the numbers have been formatted according to the given locale. Let's look at the options parameter now to see how we can make the numbers represent a currency. The Second Argument: Options (Style, Currency…) This is the main parameter, and you can use it to apply additional formatting, such as currency formatting. It is a JavaScript object with additional properties, such as: Style: This specifies the kind of formatting you want.
This includes values such as decimal, currency, and unit. Currency: This is an additional option. You can use it to indicate the currency to format to, such as USD, CAD, GBP, INR, and many more. JavaScript // format number to US dollar let USDollar = new Intl.NumberFormat('en-US', { style: 'currency', currency: 'USD', }); // format number to British pounds let Pounds = Intl.NumberFormat('en-GB', { style: 'currency', currency: 'GBP', }); // format number to Indian rupee let Rupee = new Intl.NumberFormat('en-IN', { style: 'currency', currency: 'INR', }); // format number to Euro let Euro = Intl.NumberFormat('en-DE', { style: 'currency', currency: 'EUR', }); console.log('Dollars: ' + USDollar.format(price)); // Dollars: $172,250.00 console.log(`Pounds: ${Pounds.format(price)}`); // Pounds: £172,250.00 console.log('Rupees: ' + Rupee.format(price)); // Rupees: ₹1,72,250.00 console.log(`Euro: ${Euro.format(price)}`); // Euro: €172,250.00 Another Option: Maximum Significant Digits maximumSignificantDigits is another property of the options object. It allows you to round the price based on the number of significant figures you choose. For example, if you set the value to 3, 172,250.00 becomes 172,000. JavaScript let euro = Intl.NumberFormat('en-DE', { style: 'currency', currency: 'EUR', maximumSignificantDigits: 3, }); console.log(`Euro: ${euro.format(price)}`); // Euro: €172,000 This article covered just the basics of using JavaScript to convert a number to a currency format. Happy coding!

By Dennis Mwangi
Cancel Duplicate Fetch Requests in JavaScript Enhanced Forms

If you’ve ever used JavaScript fetch API to enhance a form submission, there’s a good chance you’ve accidentally introduced a duplicate-request/race-condition bug. Today, I’ll walk you through the issue and my recommendations to avoid it. (There is a video at the end if you prefer that.) Let’s consider a very basic HTML form with a single input and a submit button. <form method="post"> <label for="name">Name</label> <input id="name" name="name" /> <button>Submit</button> </form> When we hit the submit button, the browser will do a whole page refresh. Notice how the browser reloads after the submit button is clicked. The page refresh isn’t always the experience we want to offer our users, so a common alternative is to use JavaScript to add an event listener to the form’s “submit” event, prevent the default behavior and submit the form data using the fetch API. A simplistic approach might look like the example below. After the page (or component) mounts, we grab the form DOM node, add an event listener that constructs a fetch request using the form action, method, and data, and at the end of the handler, we call the event’s preventDefault() method. const form = document.querySelector('form'); form.addEventListener('submit', handleSubmit); function handleSubmit(event) { const form = event.currentTarget; fetch(form.action, { method: form.method, body: new FormData(form) }); event.preventDefault(); } Now, before any JavaScript hotshots start tweeting at me about GET vs. POST and request body and Content-Type and whatever else, let me just say, I know. I’m keeping the fetch request deliberately simple because that’s not the main focus. The key issue here is the event.preventDefault(). This method prevents the browser from performing the default behavior of loading the new page and submitting the form. Now, if we look at the screen and hit submit, we can see that the page doesn’t reload, but we do see the HTTP request in our network tab. Notice the browser does not do a full page reload. Unfortunately, by using JavaScript to prevent the default behavior, we’ve actually introduced a bug that the default browser behavior does not have. When we use plain HTML and you smash the submit button a bunch of times really quickly, you’ll notice that all the network requests except the most recent one turn red. This indicates that they were canceled and only the most recent request is honored. If we compare that to the JavaScript example, we will see that all of the requests are sent and all of them complete without any being canceled. This may be an issue because although each request may take a different amount of time, they could resolve in a different order than they were initiated. This means if we add functionality to the resolution of those requests, we might have some unexpected behavior. As an example, we could create a variable to increment for each request (totalRequestCount). Every time we run the handleSubmit function we can increment the total count as well as capture the current number to track the current request (thisRequestNumber). When a fetch request resolves, we can log its corresponding number to the console. 
const form = document.querySelector('form'); form.addEventListener('submit', handleSubmit); let totalRequestCount = 0 function handleSubmit(event) { totalRequestCount += 1 const thisRequestNumber = totalRequestCount const form = event.currentTarget; fetch(form.action, { method: form.method, body: new FormData(form) }).then(() => { console.log(thisRequestNumber) }) event.preventDefault(); } Now, if we smash that submit button a bunch of times, we might see different numbers printed to the console out of order: 2, 3, 1, 4, 5. It depends on the network speed, but I think we can all agree that this is not ideal. Consider a scenario where a user triggers several fetch requests in close succession and upon completion, your application updates the page with their changes. The user could ultimately see inaccurate information due to requests resolving out of order. This is a non-issue in the non-JavaScript world because the browser cancels any previous request and loads the page after the most recent request completes, loading the most up-to-date version. But page refreshes are not as sexy. The good news for JavaScript lovers is that we can have both a sexy user experience AND a consistent UI! We just need to do a bit more legwork. If you look at the fetch API documentation, you’ll see that it’s possible to abort a fetch using an AbortController and the signal property of the fetch options. It looks something like this: const controller = new AbortController(); fetch(url, { signal: controller.signal }); By providing the AbortContoller's signal to the fetch request, we can cancel the request any time the AbortContoller's abort method is triggered. You can see a clearer example in the JavaScript console. Try creating an AbortController, initiating the fetch request, then immediately execute the abort method. const controller = new AbortController(); fetch('', { signal: controller.signal }); controller.abort() You should immediately see an exception printed to the console. In Chromium browsers, it should say, “Uncaught (in promise) DOMException: The user aborted a request.” And if you explore the Network tab, you should see a failed request with the Status Text “(canceled)”. With that in mind, we can add an AbortController to our form’s submit handler. The logic will be as follows: First, check for an AbortController for any previous requests. If one exists, abort it. Next, create an AbortController for the current request that can be aborted on subsequent requests. Finally, when a request resolves, remove its corresponding AbortController. There are several ways to do this, but I’ll use a WeakMap to store relationships between each submitted <form> DOM node and its respective AbortController. When a form is submitted, we can check and update the WeakMap accordingly. const pendingForms = new WeakMap(); function handleSubmit(event) { const form = event.currentTarget; const previousController = pendingForms.get(form); if (previousController) { previousController.abort(); } const controller = new AbortController(); pendingForms.set(form, controller); fetch(form.action, { method: form.method, body: new FormData(form), signal: controller.signal, }).then(() => { pendingForms.delete(form); }); event.preventDefault(); } const forms = document.querySelectorAll('form'); for (const form of forms) { form.addEventListener('submit', handleSubmit); } The key thing is being able to associate an abort controller with its corresponding form. Using the form’s DOM node as the WeakMap‘s key is a convenient way to do that. 
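A short note on this design choice: a WeakMap holds its keys weakly, so if an enhanced form element is ever removed from the DOM and garbage-collected, its entry (and the AbortController stored with it) can be collected as well, with no cleanup beyond the pendingForms.delete(form) call shown above. Typed out, the mapping and the "abort the previous, remember the new" step look roughly like this (a sketch restating the logic above; the helper name abortPrevious is made up):

TypeScript
// One pending AbortController per form element; keys are held weakly, so forms
// that disappear from the DOM do not keep their controllers alive.
const pendingForms = new WeakMap<HTMLFormElement, AbortController>();

function abortPrevious(form: HTMLFormElement): AbortController {
  pendingForms.get(form)?.abort();      // cancel the in-flight request, if any
  const controller = new AbortController();
  pendingForms.set(form, controller);   // remember the controller for this form
  return controller;
}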
With that in place, we can add the AbortController‘s signal to the fetch request, abort any previous controllers, add new ones, and delete them upon completion. Hopefully, that all makes sense. Now, if we smash that form’s submit button a bunch of times, we can see that all of the API requests except the most recent one get canceled. This means any function responding to that HTTP response will behave more as you would expect. Now, if we use that same counting and logging logic we have above, we can smash the submit button seven times and would see six exceptions (due to the AbortController) and one log of “7” in the console. If we submit again and allow enough time for the request to resolve, we’d see “8” in the console. And if we smash the submit button a bunch of times, again, we’ll continue to see the exceptions and final request count in the right order. If you want to add some more logic to avoid seeing DOMExceptions in the console when a request is aborted, you can add a .catch() block after your fetch request and check if the error’s name matches “AbortError“: fetch(url, { signal: controller.signal, }).catch((error) => { // If the request was aborted, do nothing if (error.name === 'AbortError') return; // Otherwise, handle the error here or throw it back to the console throw error }); Closing This whole post was focused around JavaScript-enhanced forms, but it’s probably a good idea to include an AbortController any time you create a fetch request. It’s really too bad it’s not built into the API already, but hopefully, this shows you a good method for including it. It’s also worth mentioning that this approach does not prevent the user from spamming the submit button a bunch of times. The button is still clickable and the request still fires off, it just provides a more consistent way of dealing with responses. Unfortunately, if a user does spam a submit button, those requests would still go to your backend and could use consume a bunch of unnecessary resources. Some naive solutions may be disabling the submit button, using a debounce, or only creating new requests after previous ones resolve. I don’t like these options because they rely on slowing down the user’s experience and only work on the client side. They don’t address abuse via scripted requests. To address abuse from too many requests to your server, you would probably want to set up some rate limiting. That goes beyond the scope of this post, but it was worth mentioning. It’s also worth mentioning that rate limiting doesn’t solve the original problem of duplicate requests, race conditions, and inconsistent UI updates. Ideally, we should use both to cover both ends. Anyway, that’s all I’ve got for today. If you want to watch a video that covers this same subject, watch this. Thank you so much for reading. If you liked this article, please share it.

By Austin Gil
Easy Smart Contract Debugging With Truffle’s Console.log

If you’re a Solidity developer, you’ll be excited to hear that Truffle now supports console logging in Solidity smart contracts. While Truffle has long been a leader in smart contract development tooling—providing an easy-to-use environment for creating, testing, and debugging smart contracts—a directly integrated console.log was a feature it still needed. But no more! Developers can now easily log messages and debug their smart contracts, all within the familiar Truffle (Ganache) environment. Let’s look at how. What Is Console.log? Console.log is a very popular feature in JavaScript and is widely used by developers to easily output logging messages and extract details directly from code. In the context of Web3 and smart contract development, console.log plays a similar role, allowing developers to print out Solidity variables and other information from their smart contracts. For example, you can use console.log to display the value of a variable or the output of a function call within your smart contract. This can be extremely useful when debugging or testing your smart contract. console.log("Console Logging: The Smart Contract Developer's Best Friend"); How To Use Console Logging in Truffle Making use of console.log is quite straightforward. First, you’ll have to ensure you have an up-to-date Truffle version running on your computer. If you have any issues, you might want to uninstall the package entirely and then reinstall it. For the commands used in this post, we’ll use NPM as our package manager. $ npm install -g truffle After a successful installation, I suggest that you modify the truffle configuration file (i.e. truffle-config.js) as follows: module.exports = { . . . solidityLog: { displayPrefix: ' :', // defaults to "" preventConsoleLogMigration: true, // defaults to false } displayPrefix: decorates the outputs from console.log to differentiate it from other contents displayed by the CLI. preventConsoleLogMigration: screens contract deployments from going through when on a test or mainnet. You can opt out of this if you wish to deploy your contract with the console.log included. However, if you choose to do this, keep in mind that console.log has unpredictable behavior when it comes to gas usage. Now you’re ready to try it out! Import the contract.sol contract into your Solidity code as usual. Now you’re ready to use the console.log() command as you would in JavaScript. This includes using string substitutions like %s and %f. pragma solidity ^0.8.9; import "truffle/console.sol"; contract BookStore { //... function transfer(address to, uint256 amount) external { console.log("Transferring %s tokens to %s", amount, to); require(balances[msg.sender] >= amount, "Not enough tokens"); balances[msg.sender] -= amount; balances[to] += amount; emit Transfer(amount, to, msg.sender); } } The above transfer function shows console.log in action. Imagine a call to the transfer function failing with the Not enough tokens error. The console.log line, in this case, will show the number of tokens the call is trying to transfer. This allows the developer to see the address and amount of tokens being transferred. The message will look like this. ... Transferring 10 tokens to 0x377bbcae5327695b32a1784e0e13bedc8e078c9c An even better way to debug this could be to add in the balances[msg.sender] to the console.log statement or print it out on a separate line. That way, the sender’s balance is visible in the console, too. You get the point! 
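If you want to see that output without clicking through a UI, one convenient way is a plain Truffle test, since anything the contract logs is printed during the truffle test run against Ganache. The sketch below is hedged: it assumes a migration has already deployed the BookStore contract from the snippet above and that the first Ganache account holds enough tokens; the artifacts, contract, and it globals are provided by Truffle at runtime and are only declared here so the snippet stands on its own.

TypeScript
// Hedged sketch of a Truffle test that triggers the console.log in transfer().
declare const artifacts: { require: (name: string) => any };
declare const contract: (name: string, tests: (accounts: string[]) => void) => void;
declare const it: (name: string, fn: () => Promise<void>) => void;

const BookStore = artifacts.require("BookStore");

contract("BookStore", (accounts) => {
  it("logs the amount and recipient while transferring", async () => {
    const bookStore = await BookStore.deployed();
    // The console.log inside transfer() prints something like
    // "Transferring 10 tokens to 0x..." in the test output.
    await bookStore.transfer(accounts[1], 10);
  });
});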
You can also leave your logs in place on testnets and mainnets; this way, you'll have a nice way to observe your smart contract. It's also worth mentioning that tools like Tenderly will integrate the scraping of logs, which can be useful when debugging and testing smart contracts in a production environment. Finally, when using console logging, it's important to follow the good usage rules you already know, such as writing clear and descriptive log messages. This will make it easier to understand the output and identify any issues that arise.

Other Debugging Tools in Truffle

While console logging is a powerful tool for debugging smart contracts, keep in mind that Truffle offers other debugging tools as well. Truffle has a powerful built-in debugger CLI tool that can be used to step through the execution of a smart contract and inspect the state of variables at different points in the execution. Additionally, events are a nice way to log messages and track the behavior of a smart contract (a short sketch of reading event logs in a test follows at the end of this article). That said, it's worth noting that using the debugger for something as simple as printing a variable can be overkill. Similarly, event logging only works when the transaction succeeds, which can be a limitation in certain situations.

The bottom line is that the console.log feature, in combination with the other debugging tools in Truffle, can provide a better developer experience thanks to its simplicity and ease of use. It gives developers the ability to quickly and easily log messages and monitor the behavior of their smart contracts, while the other debugging tools can be used for more advanced debugging and troubleshooting.

Try It Out

Truffle's new console logging feature is a valuable addition to smart contract development. It's easy to use and can streamline the debugging and testing process. The ability to log messages and track the behavior of smart contracts in real time can reduce inefficiencies and headaches. It's a great tool to have in your toolbox.
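As a companion to the events approach mentioned above, here is a minimal sketch (not from the original article) of reading a contract's emitted events from a Truffle test; it assumes the same hypothetical BookStore contract and its Transfer(amount, to, from) event from the earlier snippet.

// test/bookstore.events.test.js
// Minimal sketch: assumes the BookStore contract and Transfer event shown above.
const BookStore = artifacts.require("BookStore");

contract("BookStore", (accounts) => {
  it("emits a Transfer event", async () => {
    const store = await BookStore.deployed();
    const receipt = await store.transfer(accounts[1], 10);

    // Truffle decodes events emitted by the transaction into receipt.logs.
    // Unlike console.log, this only works when the transaction succeeds.
    const transferEvent = receipt.logs.find((log) => log.event === "Transfer");
    console.log(transferEvent.args);
  });
});

Unlike console.log, the decoded event gives you something you can assert on in a test, but only for transactions that don't revert.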

By Michael Bogan CORE
PostgreSQL: Bulk Loading Data With Node.js and Sequelize

Whether you're building an application from scratch with zero users, or adding features to an existing application, working with data during development is a necessity. This can take different forms, from mock data APIs reading data files in development, to seeded database deployments closely mirroring an expected production environment. I prefer the latter, as I find fewer deviations from my production toolset lead to fewer bugs.

A Humble Beginning

For the sake of this discussion, let's assume we're building an online learning platform offering various coding courses. In its simplest form, our Node.js API layer might look like this.

JavaScript

// server.js
const express = require("express");
const App = express();

const courses = [
  { title: "CSS Fundamentals", thumbnail: "https://fake-url.com/css" },
  { title: "JavaScript Basics", thumbnail: "https://fake-url.com/js-basics" },
  { title: "Intermediate JavaScript", thumbnail: "https://fake-url.com/intermediate-js" }
];

App.get("/courses", (req, res) => {
  res.json({ data: courses });
});

App.listen(3000);

If all you need is a few items to start building your UI, this is enough to get going. Making a call to our /courses endpoint will return all of the courses defined in this file. However, what if we want to begin testing with a dataset more representative of a full-fledged, database-backed application?

Working With JSON

Suppose we inherited a script exporting a JSON array containing thousands of courses. We could import the data like so.

JavaScript

// courses.js
module.exports = [
  { title: "CSS Fundamentals", thumbnail: "https://fake-url.com/css" },
  { title: "JavaScript Basics", thumbnail: "https://fake-url.com/js-basics" },
  { title: "Intermediate JavaScript", thumbnail: "https://fake-url.com/intermediate-js" },
  ...
];

// server.js
...
const courses = require("/path/to/courses.js");
...

This eliminates the need to define our mock data within our server file, and now we have plenty of data to work with. We could enhance our endpoint by adding parameters to paginate the results and set limits on how many records are returned (a sketch of such an endpoint appears later in this article). But what about allowing users to post their own courses? How about editing courses? This solution gets out of hand quickly as you begin to add functionality. We'll have to write additional code to simulate the features of a relational database. After all, databases were created to store data. So, let's do that.

Bulk Loading JSON With Sequelize

For an application of this nature, PostgreSQL is an appropriate database selection. We have the option of running PostgreSQL locally or connecting to a PostgreSQL-compatible cloud-native database, like YugabyteDB Managed. Apart from being a highly performant distributed SQL database, developers using YugabyteDB benefit from a cluster that can be shared by multiple users. As the application grows, our data layer can scale out to multiple nodes and regions.

After creating a YugabyteDB Managed account and spinning up a free database cluster, we're ready to seed our database and refactor our code using Sequelize. The Sequelize ORM allows us to model our data to create database tables and execute commands. Here's how that works.

First, we install Sequelize from our terminal. (Because we're using the postgres dialect, you'll also need the pg and pg-hstore driver packages alongside Sequelize.)

Shell

// terminal
> npm i sequelize

Next, we use Sequelize to establish a connection to our database, create a table, and seed our table with data.
JavaScript

// database.js
const fs = require("fs");
const { Sequelize, DataTypes } = require("sequelize");

// JSON array of courses
const courses = require("/path/to/courses.js");

// Certificate file downloaded from YugabyteDB Managed
const cert = fs.readFileSync(CERTIFICATE_PATH).toString();

// Create a Sequelize instance with our database connection details
const sequelize = new Sequelize("yugabyte", "admin", DB_PASSWORD, {
  host: DB_HOST,
  port: "5433",
  dialect: "postgres",
  dialectOptions: {
    ssl: {
      require: true,
      rejectUnauthorized: true,
      ca: cert,
    },
  },
  pool: {
    max: 5,
    min: 1,
    acquire: 30000,
    idle: 10000,
  }
});

// Defining our Course model
const Course = sequelize.define(
  "course",
  {
    id: {
      type: DataTypes.INTEGER,
      autoIncrement: true,
      primaryKey: true,
    },
    title: {
      type: DataTypes.STRING,
    },
    thumbnail: {
      type: DataTypes.STRING,
    },
  }
);

async function seedDatabase() {
  try {
    // Verify that the database connection is valid
    await sequelize.authenticate();

    // Create database tables based on the models we've defined
    // Drops existing tables if there are any
    await sequelize.sync({ force: true });

    // Creates course records in bulk from our JSON array
    await Course.bulkCreate(courses);

    console.log("Courses created successfully!");
  } catch(e) {
    console.log(`Error in seeding database with courses: ${e}`);
  }
}

// Running our seeding function
seedDatabase();

module.exports = { Course };

By leveraging Sequelize's bulkCreate method, we're able to insert multiple records in one statement. This is more performant than inserting records one at a time, like this.

JavaScript

...
// JSON array of courses
const courses = require("/path/to/courses.js");

async function insertCourses() {
  for (let i = 0; i < courses.length; i++) {
    await Course.create(courses[i]);
  }
}

insertCourses();

Individual inserts come with the overhead of connecting, sending requests, parsing requests, indexing, closing connections, etc., on a one-off basis. Of course, some of these concerns are mitigated by connection pooling, but generally speaking, the performance benefits of inserting in bulk are immense, not to mention far more convenient. The bulkCreate method even comes with a benchmarking option to pass query execution times to your logging functions, should performance be of primary concern (a tiny sketch of this option appears at the end of this article).

Now that our database is seeded with records, our API layer can use this Sequelize model to query the database and return courses.

JavaScript

// server.js
const express = require("express");
const App = express();

// Course model exported from database.js
const { Course } = require("/path/to/database.js");

App.get("/courses", async (req, res) => {
  try {
    const courses = await Course.findAll();
    res.json({ data: courses });
  } catch(e) {
    console.log(`Error in courses endpoint: ${e}`);
  }
});

App.listen(3000);

Well, that was easy! We've moved from a static data structure to a fully functional database in no time.
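As mentioned earlier, we could also enhance this endpoint with pagination. Here is a minimal sketch (not part of the original article) that would replace the handler above, using Sequelize's findAndCountAll with limit and offset; the page/pageSize query parameter names and defaults are illustrative assumptions.

JavaScript

// server.js (sketch): paginated version of the endpoint above,
// e.g. GET /courses?page=2&pageSize=25
const express = require("express");
const App = express();
const { Course } = require("/path/to/database.js");

App.get("/courses", async (req, res) => {
  try {
    // Parse and clamp the query parameters; defaults are illustrative
    const page = Math.max(parseInt(req.query.page, 10) || 1, 1);
    const pageSize = Math.min(parseInt(req.query.pageSize, 10) || 25, 100);

    // findAndCountAll returns the total row count plus the requested page of rows
    const { count, rows } = await Course.findAndCountAll({
      limit: pageSize,
      offset: (page - 1) * pageSize,
      order: [["id", "ASC"]],
    });

    res.json({ data: rows, meta: { total: count, page, pageSize } });
  } catch (e) {
    console.log(`Error in courses endpoint: ${e}`);
  }
});

App.listen(3000);

Limit/offset keeps things simple at this scale; for very large tables, keyset pagination is usually a better fit.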
What if we're provided the dataset in another data format, say, a CSV file exported from Microsoft Excel? How can we use it to seed our database?

Working With CSVs

There are many npm packages to convert CSV files to JSON, but none are quite as easy to use as csvtojson. Start by installing the package.

Shell

// terminal
> npm i csvtojson

Next, we use this package to convert our CSV file to a JSON array, which can be used by Sequelize.

// courses.csv
title,thumbnail
CSS Fundamentals,https://fake-url.com/css
JavaScript Basics,https://fake-url.com/js-basics
Intermediate JavaScript,https://fake-url.com/intermediate-js

JavaScript

// database.js
...
const csv = require("csvtojson");
const csvFilePath = "/path/to/courses.csv";

// JSON array of courses from CSV
// (the awaits below run inside an async function, such as seedDatabase() above)
const courses = await csv().fromFile(csvFilePath);
...
await Course.bulkCreate(courses);
...

Just as with our well-formatted courses.js file, we're able to easily convert our courses.csv file and bulk insert its records via Sequelize.

Conclusion

Developing applications with hardcoded data can only take us so far. I find that investing in tooling early in the development process sets me on the path toward bug-free coding (or so I hope!). By bulk-loading records, we're able to work with a representative dataset in a representative application environment. As I'm sure many agree, that's often a major bottleneck in the application development process.
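As promised above, here is a tiny sketch (not part of the original article) of the bulkCreate benchmarking option; it assumes Sequelize's per-query benchmark and logging options, which can also be set globally on the Sequelize constructor, and the logging callback shown is illustrative.

JavaScript

// Sketch: runs inside an async function such as seedDatabase() above.
// With benchmark: true, Sequelize passes the query execution time (in ms)
// as the second argument to the logging callback.
await Course.bulkCreate(courses, {
  benchmark: true,
  logging: (sql, timingMs) => console.log(`${sql} -- ${timingMs} ms`),
});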

By Brett Hoyer

Top JavaScript Experts


Anthony Gore

Founder,
Vue.js Developers

I'm Anthony Gore and I'm here to teach you Vue.js! Through my books, online courses, and social media, my aim is to turn you into a Vue.js expert. I'm a Vue Community Partner, curator of the weekly Vue.js Developers Newsletter, and the founder of vuejsdevelopers.com, an online community for web professionals who love Vue.js. Curious about Vue? Take my free 30-minute "Vue.js Crash Course" to learn what Vue is, what kind of apps you can build with it, how it compares to React & Angular, and more. Enroll for free! https://courses.vuejsdevelopers.com/p/vue-js-crash-course?utm_source=dzone&utm_medium=bio

John Vester

Lead Software Engineer,
Marqeta @JohnJVester

Information Technology professional with 30+ years expertise in application design and architecture, feature development, project management, system administration and team supervision. Currently focusing on enterprise architecture/application design utilizing object-oriented programming languages and frameworks. Prior expertise building (Spring Boot) Java-based APIs against React and Angular client frameworks. CRM design, customization and integration with Salesforce. Additional experience using both C# (.NET Framework) and J2EE (including Spring MVC, JBoss Seam, Struts Tiles, JBoss Hibernate, Spring JDBC).

Justin Albano

Software Engineer,
IBM

I am devoted to continuously learning and improving as a software developer and sharing my experience with others in order to improve their expertise. I am also dedicated to personal and professional growth through diligent studying, discipline, and meaningful professional relationships. When not writing, I can be found playing hockey, practicing Brazilian Jiu-jitsu, watching the NJ Devils, reading, writing, or drawing. ~II Timothy 1:7~ Twitter: @justinmalbano

Swizec Teller

CEO,
preona

I'm a writer, programmer, web developer, and entrepreneur. Preona is my current startup that began its life as the team developing Twitulater. Our goal is to create a set of applications for the emerging Synaptic Web, which would rank real-time information streams in near real time, all along reading its user behaviour and understanding how to intelligently react to it. twitter: @Swizec

The Latest JavaScript Topics

The Ultimate Guide to the Input and Output Decorator in Angular
Learn how to use these two powerful Angular tools together and understand how they work.
March 23, 2023
by Chetan Suthar
· 471 Views · 1 Like
7 Ways of Containerizing Your Node.js Application
This article lists seven ways to containerize your Node.js application, so let's look at them briefly.
March 23, 2023
by Nikunj Shingala
· 620 Views · 1 Like
Noteworthy Storage Options for React Native Apps!
Peek through the offerings of the key React Native Storage Options and understand which one is most suitable for your specific use case.
March 23, 2023
by Parija Rangnekar
· 529 Views · 1 Like
Create CloudWatch Custom Log Metric Alarm Notification Email Solution Using Terraform
Readers will use a tutorial to learn how to create a CloudWatch custom log metric alarm notification using Terraform, including code and guide visuals.
March 22, 2023
by Joyanta Banerjee
· 1,379 Views · 1 Like
File Uploads for the Web (2): Upload Files With JavaScript
This is the second article in a series about uploading files for the web. In this article, readers will use JavaScript to create the file upload request.
March 22, 2023
by Austin Gil CORE
· 1,282 Views · 1 Like
How To Choose the Right Streaming Database
This post helps you understand what SQL streaming is, when and why to use it, and discusses some key factors to consider when choosing the right streaming database.
March 22, 2023
by Bobur Umurzokov
· 2,631 Views · 2 Likes
Create a CLI Chatbot With the ChatGPT API and Node.js
ChatGPT has taken the world by storm and is now available by API. In this article, we build a simple CLI to access ChatGPT with Node.js.
March 21, 2023
by Phil Nash
· 1,902 Views · 2 Likes
Mocha JavaScript Tutorial With Examples for Selenium Testing
In this Mocha JavaScript testing tutorial, readers will become proficient in automated browser testing using Selenium and JavaScript, including code and images.
March 21, 2023
by Aditya Dwivedi
· 1,674 Views · 1 Like
useState() vs. useRef(): Understand the Technical Difference
React provides many clearly defined properties, and useState() and useRef() are two of them. Learn the technical differences between these two with this post!
March 21, 2023
by Riha Mervana
· 1,604 Views · 1 Like
What Is the Temporal Dead Zone In JavaScript?
In this article, we’ll explore what the Temporal Dead Zone is, why it happens, and how to avoid common pitfalls related to it.
March 21, 2023
by Kapil Upadhyay
· 1,832 Views · 1 Like
Master C# Arrays: The Basics
Master C# Arrays fast with this step-by-step article and videos containing demo code and images. C# rectangular and jagged arrays are also covered.
March 21, 2023
by Pirzada .Rashid
· 2,314 Views · 1 Like
Building a Real-Time App With Spring Boot, Cassandra, Pulsar, React, and Hilla
In this article, readers will learn how to create a Spring Boot application that connects to Pulsar and Cassandra and displays live data in a React frontend.
March 20, 2023
by Marcus Hellberg
· 2,434 Views · 7 Likes
What Is JavaScript Slice? Practical Examples and Guide
This article explains the best practices and examples to help you get the most out of slicing data with JavaScript.
March 20, 2023
by Rahul .
· 1,615 Views · 1 Like
Choosing the Right Framework for Your Project
Learn how to choose the right framework for your project with this beginner's guide. Consider factors like project type, scalability, and performance.
March 20, 2023
by Anahit Ghazaryan
· 1,914 Views · 2 Likes
Cucumber.js Tutorial With Examples For Selenium JavaScript
This Cucumber.js tutorial will help you automate browser testing with Selenium and JavaScript across a cloud-based Selenium Grid of 2000+ browsers for mobile and desktop.
March 17, 2023
by Rahul Rana
· 3,094 Views · 1 Like
How To Set Up and Run Cypress Test Cases in CI/CD TeamCity
In this article, readers will use a tutorial to learn how to set up and run Cypress test cases in CI/CD TeamCity, including code blocks and screenshots.
March 17, 2023
by Kailash P. (Cypress Ambassador) https://qaautomationlabs.com/blog
· 3,424 Views · 2 Likes
How To Execute Cypress E2E Test Cases Using CI/CD GitLab
In this article, readers will use a tutorial to learn how to execute Cypress end-to-end test cases using CI/CD GitLab, including guide code and images.
March 16, 2023
by Kailash P. (Cypress Ambassador) https://qaautomationlabs.com/blog
· 3,084 Views · 3 Likes
An End-to-End Guide to Vue.js Testing
This article is a step-by-step guide covering Vue.js testing basics and explaining how to test Vue.js-based websites and mobile applications for readers.
March 15, 2023
by Harish Rajora
· 2,809 Views · 1 Like
[DZone Survey] Share Your Expertise and Take our 2023 Web, Mobile, and Low-Code Apps Survey
Calling all automation, web, mobile, and low-/no-code experts! Take the survey and enter for a chance to win one of ten $50 gift cards!
March 15, 2023
by Caitlin Candelmo
· 8,334 Views · 5 Likes
Building a RESTful API With AWS Lambda and Express
Build a RESTful API using AWS Lambda and Express.js with this quick and easy-to-follow tutorial. Deploy your RESTful API with confidence using Node.js.
March 15, 2023
by Preet Kaur
· 3,891 Views · 2 Likes
