
Spring Boot/Batch Tutorial: Integration With HBASE REST API and Data Ingestion

In this post, we look at how to integrate these two popular frameworks into your Java app in order to perform data ingestion.

By Damodhara Palavali · Apr. 05, 19 · Tutorial

Before reading this article, you should have some basic knowledge of REST, Spring Boot, and Spring Batch.

This article focuses on how to ingest data using Spring Boot/Batch and the HBase REST API. With the Spring for Apache Hadoop project reaching end-of-life on April 5th, 2019, pairing Spring Batch with the REST API lets us interact with HBase directly from a Windows environment, so you don't need to deploy your jar to the Unix/Linux environment where HBase is running.

HBase's REST API expects row keys, column names, and cell values to be Base64-encoded, both when ingesting and when fetching data. First, we need to start HBase's REST server.
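Assuming a standard HBase installation with the bin directory on your PATH (the port is arbitrary; this tutorial's examples use 8157), it can be started in the foreground with:

hbase rest start -p 8157

or as a background daemon:

hbase-daemon.sh start rest -p 8157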

With the REST server up and running, let's create a Spring Boot/Batch project and the classes for file ingestion.

This use case covers three concepts:

  1. Reading and writing multiple files with Spring Batch.

  2. Integrating REST APIs in an item writer.

  3. Invoking HBase's REST API for ingestion.

Here we are going to ingest each file as a single column in an HBase table, with the file name as the row key. First, create a Java bean to define the file details that we'll use across the project.

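A minimal sketch of this bean, consistent with the getFileName() and getJsonFile() accessors used in the job configuration below, needs only the file name (our future row key) and a handle to the file itself:

package com.damu.hbase.test.ingest;

import java.io.File;

// Simple holder for one input file: its name (used as the HBase row key)
// and a handle to the file itself.
public class JSONFileBatchBean {

    private String fileName;
    private File jsonFile;

    public String getFileName() {
        return fileName;
    }

    public void setFileName(String fileName) {
        this.fileName = fileName;
    }

    public File getJsonFile() {
        return jsonFile;
    }

    public void setJsonFile(File jsonFile) {
        this.jsonFile = jsonFile;
    }
}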

Now let's build the job config class with a reader, processor, and writer:

package com.damu.hbase.test.ingest;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.security.KeyManagementException;
import java.security.KeyStoreException;
import java.security.NoSuchAlgorithmException;
import java.security.cert.X509Certificate;
import java.time.Instant;
import java.util.Base64;
import java.util.List;
import java.util.stream.Collectors;

import javax.net.ssl.SSLContext;

import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
import org.apache.http.conn.ssl.TrustStrategy;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.PassThroughItemProcessor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.web.client.RestTemplate;

@Configuration
@EnableBatchProcessing
@EnableScheduling
public class ReadMultiFileJob {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job readFiles() throws IOException {
        return jobBuilderFactory.get("readFiles")
                .incrementer(new RunIdIncrementer())
                .flow(step1()).end().build();
    }

    @Bean
    public Step step1() throws IOException {
        // Process files in chunks of 10: read one bean per file, write via REST.
        return stepBuilderFactory.get("step1")
                .<JSONFileBatchBean, JSONFileBatchBean>chunk(10)
                .reader(multiResourceItemReader())
                .writer(writer())
                .build();
    }

    @Bean
    public PassThroughItemProcessor<JSONFileBatchBean> processor() {
        // Pass-through processor; the step above wires only the reader and writer.
        return new PassThroughItemProcessor<>();
    }

    @Bean
    public FileReader multiResourceItemReader() throws IOException {
        // Collect every regular file under the input directory.
        List<File> jsonFiles = Files.walk(Paths.get("C:\\DAMU\\XML\\multi"))
                .filter(Files::isRegularFile)
                .map(Path::toFile)
                .collect(Collectors.toList());
        System.out.println("inside file reader =====================");
        return new FileReader(jsonFiles);
    }

    @Bean
    public RestTemplate restTemplate() {
        // Trust-all SSL setup so the demo works against self-signed certificates.
        // Do not use this in production.
        TrustStrategy acceptingTrustStrategy = (X509Certificate[] chain, String authType) -> true;
        SSLContext sslContext = null;
        try {
            sslContext = org.apache.http.ssl.SSLContexts.custom()
                    .loadTrustMaterial(null, acceptingTrustStrategy)
                    .build();
        } catch (KeyManagementException | NoSuchAlgorithmException | KeyStoreException e) {
            e.printStackTrace();
        }
        SSLConnectionSocketFactory csf = new SSLConnectionSocketFactory(sslContext);
        CloseableHttpClient httpClient = HttpClients.custom().setSSLSocketFactory(csf).build();
        HttpComponentsClientHttpRequestFactory requestFactory = new HttpComponentsClientHttpRequestFactory();
        requestFactory.setHttpClient(httpClient);
        return new RestTemplate(requestFactory);
    }

    @Bean
    public ItemWriter<JSONFileBatchBean> writer() {
        return items -> {
            for (JSONFileBatchBean bean : items) {
                System.err.println(bean.getFileName());

                // Read the whole file into a byte array.
                FileInputStream in = new FileInputStream(bean.getJsonFile());
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                byte[] buf = new byte[1024];
                for (int readNum; (readNum = in.read(buf)) != -1; ) {
                    bos.write(buf, 0, readNum);
                }
                byte[] byteArray = bos.toByteArray();

                Instant instant = Instant.now();
                long timeStampMillis = instant.toEpochMilli();

                HttpHeaders headers = new HttpHeaders();
                headers.set("Accept", "application/json");
                headers.set("Content-Type", "application/json");

                // The file name is the row key; HBase's REST API expects it
                // (and the column name and cell value) Base64-encoded.
                String rowKeyHash = Base64.getEncoder()
                        .encodeToString(bean.getFileName().getBytes("utf-8"));
                String URL = "http://localhost:8157/FILE_DATA/" + rowKeyHash;

                // "RGF0YUNGOnRlc3R4bWw=" is the Base64 encoding of "DataCF:testxml".
                String jsonPaylod = "{\"Row\": [{\"key\": \"" + rowKeyHash
                        + "\",\"Cell\": [{\"column\": \"" + "RGF0YUNGOnRlc3R4bWw="
                        + "\", \"timestamp\":\"" + timeStampMillis
                        + "\",\"$\": \"" + Base64.getEncoder().encodeToString(byteArray)
                        + "\"}]}]}";

                System.out.println("byteArray===" + byteArray);
                System.out.println("jsonPaylod===" + jsonPaylod);

                HttpEntity<String> entity = new HttpEntity<>(jsonPaylod, headers);
                ResponseEntity<String> response =
                        restTemplate().exchange(URL, HttpMethod.POST, entity, String.class);
                System.out.println("Response: " + response);

                in.close();
            }
        };
    }
}
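The step configuration also relies on a custom FileReader item reader built from the list of input files. A minimal sketch consistent with that usage, emitting one bean per file and null once the input is exhausted, could look like this:

package com.damu.hbase.test.ingest;

import java.io.File;
import java.util.Iterator;
import java.util.List;

import org.springframework.batch.item.ItemReader;

// Hands one JSONFileBatchBean per input file to the step, then signals
// end-of-input by returning null.
public class FileReader implements ItemReader<JSONFileBatchBean> {

    private final Iterator<File> files;

    public FileReader(List<File> files) {
        this.files = files.iterator();
    }

    @Override
    public JSONFileBatchBean read() {
        if (!files.hasNext()) {
            return null; // no more files: tells Spring Batch the step is done
        }
        File file = files.next();
        JSONFileBatchBean bean = new JSONFileBatchBean();
        bean.setFileName(file.getName());
        bean.setJsonFile(file);
        return bean;
    }
}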

Now let's see how we are calling the HBase REST API. Each row is written by POSTing the JSON payload to http://localhost:8157/FILE_DATA/<Base64-encoded row key>, where FILE_DATA is the target table. The column qualifier in the payload, RGF0YUNGOnRlc3R4bWw=, is the Base64 encoding of DataCF:testxml, so the FILE_DATA table must already exist with a DataCF column family (for example, created with create 'FILE_DATA', 'DataCF' in the HBase shell).

After running the Spring Boot application, the console logger shows the following:

TEST_JSON.json

byteArray===[B@30871dc5

jsonPaylod==={"Row": [{"key": "VEVTVF9KU09OLmpzb24=","Cell": [{"column": "RGF0YUNGOnRlc3R4bWw=", "timestamp":"1554230585299","$": "ew0KICAibmFtZSI6ICJKdW5nbGUgR3ltIiwNCiAgImFnZSI6IDI1LA0KICAiZmF2b3JpdGVfY29sb3IiOiAiI2ZmYTUwMCIsDQogICJnZW5kZXIiOiAibWFsZSIsDQogICJsb2NhdGlvbiI6IHsNCiAgICAiY2l0eSI6ICJTZWF0dGxlIiwNCiAgICAic3RhdGUiOiAiV0EiLA0KICAgICJjaXR5c3RhdGUiOiAiU2VhdHRsZSwgV0EiDQogIH0sDQogICJwZXRzIjogWw0KICAgIHsNCiAgICAgICJ0eXBlIjogImRvZyIsDQogICAgICAibmFtZSI6ICJGb28iDQogICAgfSwNCiAgICB7DQogICAgICAidHlwZSI6ICJjYXQiLA0KICAgICAgIm5hbWUiOiAiQmFyIg0KICAgIH0NCiAgXQ0KfQ=="}]}]}

Response: <200,{}>

TEST_TEXT.txt

byteArray===[B@7d4d62d1

jsonPaylod==={"Row": [{"key": "VEVTVF9URVhULnR4dA==","Cell": [{"column": "RGF0YUNGOnRlc3R4bWw=", "timestamp":"1554230585637","$": "SGVsbG8gdGhpcyBzaW1wbGUgaGFic2UgaW50ZWdyYXRpb24="}]}]}

Response: <200,{}>

TEST_XML.xml

byteArray===[B@6df13b6a

2019-Apr-02 11:43:06.292 INFO [restartedMain] o.s.b.c.l.s.SimpleJobLauncher - Job: [FlowJob: [name=readFiles]] completed with the following parameters: [{run.id=14}] and the following status: [COMPLETED]

We can use the HBase GET API to see the ingested data:
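A minimal sketch using the same RestTemplate approach (note that reads take the plain row key in the URL path, since HBase decoded the Base64 key from the ingestion payload when storing the row, while everything in the JSON response comes back Base64-encoded):

package com.damu.hbase.test.ingest;

import java.nio.charset.StandardCharsets;
import java.util.Base64;

import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

public class HBaseGetExample {

    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();

        // Reads take the plain row key in the path; HBase stored the row
        // under the decoded key from the ingestion payload.
        String url = "http://localhost:8157/FILE_DATA/TEST_JSON.json";

        HttpHeaders headers = new HttpHeaders();
        headers.set("Accept", "application/json");

        ResponseEntity<String> response = restTemplate.exchange(
                url, HttpMethod.GET, new HttpEntity<>(headers), String.class);

        // Row keys, column names, and cell values ("$") in the response are
        // Base64-encoded, mirroring the ingestion payload.
        System.out.println(response.getBody());

        // Decoding a "$" value recovers the original file content; for instance,
        // the TEST_TEXT.txt cell logged above decodes to its plain text:
        System.out.println(new String(
                Base64.getDecoder().decode("SGVsbG8gdGhpcyBzaW1wbGUgaGFic2UgaW50ZWdyYXRpb24="),
                StandardCharsets.UTF_8));
    }
}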



Opinions expressed by DZone contributors are their own.
