
How to Scrape E-Commerce Data With Node.js and Puppeteer

Normalized data is the foundation for all price intelligence projects. This tutorial will cover the basics of how to scrape product information.

By Andreas Altheimer · Updated Jan. 15, 2021 · Tutorial

Web scraping is nothing new. However, the technologies used to build websites are constantly evolving, so the techniques used to scrape a website have to adapt as well.

Why Node.js?

A lot of websites use front-end frameworks like React, Vue.js, or Angular, which load the content (or parts of the content) after the initial DOM has loaded. This especially applies to performance-optimized e-commerce websites, where price and product information are loaded asynchronously.

Now, if we access a page like this with PHP, or any other classic server-side language, this content will not be part of the retrieved markup, because a browser is required to execute the JavaScript that renders it.
This is where Puppeteer comes in: it controls a headless Chrome instance that renders the page for us.
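
To make the problem concrete, here is a minimal sketch of the classic approach using nothing but Node's built-in https module. The URL is a placeholder, and the selector check reuses the price-box__price class from the example page used later in this tutorial. For a page that renders its prices client-side, the element simply never shows up in the downloaded markup:

JavaScript
const https = require('https');

// Fetch the raw markup the way a classic server-side scraper would
https.get('https://www.example.com/some-product', res => {
    let html = '';
    res.on('data', chunk => (html += chunk));
    res.on('end', () => {
        // For a JavaScript-rendered page this will typically print false:
        // the price container is only added after the browser has executed
        // the page's scripts.
        console.log(html.includes('price-box__price'));
    });
});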

Getting Started – Prerequisites

Let us get started by installing Node.js on our system and initializing a new npm (Node Package Manager) project. npm allows us to install further packages easily. To begin, run the following commands:

Shell
npm init

# we can now install Puppeteer via npm
npm install puppeteer

With this, we have initialized a new npm project and installed Puppeteer, which downloads a bundled Chromium it can run headlessly. At this point, you could also install a DOM parser library to make data extraction a little easier. However, we are going to use the built-in DOM method querySelector() to parse the retrieved HTML.
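
If you do prefer a standalone parser, one commonly used library is cheerio (npm install cheerio). The following is only a hedged sketch of what that would look like; it is not used in the rest of this tutorial, and the readPriceWithCheerio helper is made up purely for illustration:

JavaScript
const cheerio = require('cheerio');

// Illustrative helper (not part of the tutorial's scraper): parse the
// rendered HTML with cheerio instead of in-page querySelector() calls.
// `page` is the Puppeteer page object created in the snippets below.
async function readPriceWithCheerio(page) {
    const html = await page.content(); // fully rendered HTML of the current page
    const $ = cheerio.load(html);
    return $('.price-box__amount').text().trim();
}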

That's it. We are finished with all the prerequisites. Let's start working on our actual web scraper.

Building the Scraper

Let us create a new file called index.js and start by importing the previously installed Puppeteer library.

JavaScript
const puppeteer = require('puppeteer');

puppeteer.launch().then(async browser => {
  const page = await browser.newPage();
  await page.goto('https://www.rentomojo.com/noida/furniture/rent-hutch-wardrobe-2-door');
  await page.waitForSelector('.price-box__price');

Next, we launch a new headless Chrome instance. The await keyword tells the script to wait until the awaited statement has completed before proceeding to the next line. Following this pattern, we open our e-commerce site and tell the browser to wait until the element that contains all the information we want to scrape is visible.

In our case, this element is a div container labeled with the price-box__price class. Once it appears, the page is in a state where all the information that is relevant to us is visible.
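
As a small aside (an addition on top of the snippet above, not part of the original): waitForSelector() also accepts an options object, so you can require the element to be visible rather than merely present in the DOM and bound how long the scraper waits before giving up:

JavaScript
// Wait at most 10 seconds, and only continue once the element is visible
await page.waitForSelector('.price-box__price', { visible: true, timeout: 10000 });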

As a next step, we are going to use Puppeteer's evaluate() function. It allows us to run JavaScript in the context of the rendered page, which is exactly what we need in order to scrape it.

JavaScript
let priceInformation = await page.evaluate(() => {

  let amount = document.body.querySelector('.price-box__amount');
  let currency = document.body.querySelector('.price-box__rupee-sign');

  let productInfo = {
    amount: amount ? amount.textContent : null,
    currency: currency ? currency.textContent : null
  };

  return productInfo;
});

console.log(priceInformation);

In the code snippet above, we first select the elements that hold the desired product information and read their text content into the variables amount and currency. Next, we save those values to an object and use null as a fallback in case an element does not exist on the page.

The console.log() statement will print the gathered information. At this point, the values still contain line breaks and surrounding whitespace.

We now want to get rid of the line breaks and all surrounding spacing. Additionally, we want to convert the amount property to an integer value.

This is the complete and finalized code snippet:

JavaScript
const puppeteer = require('puppeteer');

puppeteer.launch().then(async browser => {
    const page = await browser.newPage();
    await page.goto('https://www.rentomojo.com/noida/furniture/rent-hutch-wardrobe-2-door');
    await page.waitForSelector('.price-box__price');

    let priceInformation = await page.evaluate(() => {

        let amount = document.body.querySelector('.price-box__amount');
        let currency = document.body.querySelector('.price-box__rupee-sign');

        function stripString(rawString) {
            return rawString.trim();
        }

        let productInfo = {
            amount: amount ? parseInt(stripString(amount.textContent)) : null,
            currency: currency ? stripString(currency.textContent) : null
        };

        return productInfo;
    });

    // Logging the results
    console.log(priceInformation);

    // Closing the browser instance
    await browser.close();
});


This will output the data in a clean, well-structured way: a plain object in which amount is an integer and currency is a string stripped of whitespace.
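
One detail worth noting (this is an addition, not part of the original snippet): if any of the awaited calls throws, for example because the selector never appears, the script never reaches browser.close() and the headless Chrome process keeps running. A hedged variant wraps the work in try/finally:

JavaScript
const puppeteer = require('puppeteer');

puppeteer.launch().then(async browser => {
    try {
        const page = await browser.newPage();
        // ... navigation and scraping as shown above ...
    } finally {
        // Close the headless browser even if scraping fails
        await browser.close();
    }
});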

Of course, you would want to scrape many more data properties when running this in a serious project, but this tutorial is about the concept behind it.
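
As a hedged illustration of that idea, the evaluate() callback inside the same launch().then(...) block can simply return more fields. The .product-title and .product-description selectors below are hypothetical and would need to be replaced with whatever the target page actually uses:

JavaScript
let productInfo = await page.evaluate(() => {
    // Helper: read trimmed text content, or null if the element is missing
    const text = element => (element ? element.textContent.trim() : null);

    return {
        // Hypothetical selectors: adjust them to the target page's markup
        title: text(document.querySelector('.product-title')),
        description: text(document.querySelector('.product-description')),
        amount: parseInt(text(document.querySelector('.price-box__amount'))) || null,
        currency: text(document.querySelector('.price-box__rupee-sign'))
    };
});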


