
Migrating Data From Amazon Neptune to PostgreSQL Using AWS Services

This article provides a step-by-step guide on how to migrate data from Amazon Neptune, a graph database service, to PostgreSQL, a relational database service.

By Bhanuprakash Jirra · Jun. 21, 24 · Tutorial

Data migration is a critical task for businesses looking to move data between different databases or platforms. This article provides a step-by-step guide on how to migrate data from Amazon Neptune, a graph database service, to PostgreSQL, a relational database service. We will use the AWS Neptune export service to export data in Turtle format (.ttl) to Amazon S3, and then use AWS Glue to transform and load the data into PostgreSQL.

Scenario

Let's consider a simple example where we have data about users. Each user has a first name, last name, and email. We will export this data from Neptune, transform it, and load it into PostgreSQL.
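
The INSERT statements generated in Step 2 target a users table with user_id, first_name, last_name, and email columns, so that table needs to exist in PostgreSQL before the load runs. The snippet below is a minimal sketch of such a schema, assuming placeholder connection details and simple TEXT columns; adjust the types and constraints to your own model.

Python

import psycopg2

# Hypothetical target schema: the INSERT statements generated in Step 2 expect
# a "users" table with these four columns. Connection values are placeholders.
conn = psycopg2.connect(dbname='test_db', user='db_user',
                        password='db_password', host='db_host', port=5432)
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS users (
            user_id    TEXT PRIMARY KEY,
            first_name TEXT,
            last_name  TEXT,
            email      TEXT
        );
    """)
conn.close()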

Step 1: Export Data From Neptune to S3

Precautions

  • CPU usage: Be mindful that exporting data can be resource-intensive. Monitor CPU usage during the export process (a CloudWatch monitoring sketch follows this list).
  • Database load: Avoid running the export job on the live database to prevent performance issues. Use a replica database if available.
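
If you want to keep an eye on the instance while the export runs, Neptune publishes a CPUUtilization metric to CloudWatch. The sketch below is one way to pull recent datapoints; the instance identifier is a placeholder.

Python

import boto3
from datetime import datetime, timedelta, timezone

# Fetch the last 30 minutes of average CPU for the instance being exported.
# 'my-neptune-replica' is a placeholder identifier.
cloudwatch = boto3.client('cloudwatch')
response = cloudwatch.get_metric_statistics(
    Namespace='AWS/Neptune',
    MetricName='CPUUtilization',
    Dimensions=[{'Name': 'DBInstanceIdentifier', 'Value': 'my-neptune-replica'}],
    StartTime=datetime.now(timezone.utc) - timedelta(minutes=30),
    EndTime=datetime.now(timezone.utc),
    Period=300,
    Statistics=['Average'],
)
for point in sorted(response['Datapoints'], key=lambda p: p['Timestamp']):
    print(point['Timestamp'], f"{point['Average']:.1f}%")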

1. Set Up the Export Job

  • Navigate to the Amazon Neptune console.
  • Select the database instance you want to export.
  • Choose the option to export data and select the Turtle format.
  • Specify the Amazon S3 bucket where the data should be stored (e.g., s3://test-bucket/neptune-export/).

2. Run the Export Job

  • Execute the export job. The data will be exported in Turtle format (.ttl file) and saved to the specified S3 bucket.
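
Once the export job finishes, it is worth confirming that the .ttl files actually landed under the expected prefix. A minimal check, assuming the bucket and prefix from the setup step:

Python

import boto3

# List the exported Turtle files under the prefix used in the export job.
s3 = boto3.client('s3')
response = s3.list_objects_v2(Bucket='test-bucket', Prefix='neptune-export/')
for obj in response.get('Contents', []):
    if obj['Key'].endswith('.ttl'):
        print(obj['Key'], obj['Size'])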

Step 2: Use AWS Glue To Transform Data

1. Create a Glue Job To Transform Data

  • Open the AWS Glue console and create a new Glue job.
  • Configure the job to read data from the S3 bucket where the Turtle files are stored.
  • Use the following script to transform the data into SQL INSERT statements:
Python
 
import boto3

s3 = boto3.client('s3')

def transform_data(bucket_name, key):
    """Read an exported Turtle (.ttl) file from S3 and build SQL INSERT statements."""
    response = s3.get_object(Bucket=bucket_name, Key=key)
    data = response['Body'].read().decode('utf-8').splitlines()

    # Group attributes by subject (the user's IRI). This simple parser assumes
    # one triple per line in the form: <subject> <predicate> "literal" .
    users = {}
    for line in data:
        parts = line.split()
        if len(parts) < 3:
            continue  # skip blank lines and anything that isn't a triple
        subject = parts[0].strip('<>')
        predicate = parts[1].strip('<>')
        # Re-join the object in case the literal contains spaces, then drop the
        # trailing '.' terminator and any surrounding angle brackets.
        obj = ' '.join(parts[2:]).rstrip('.').strip().strip('<>')

        if subject not in users:
            users[subject] = {"first_name": None, "last_name": None, "email": None}

        if predicate.endswith('firstName'):
            users[subject]["first_name"] = obj.strip('"')
        elif predicate.endswith('lastName'):
            users[subject]["last_name"] = obj.strip('"')
        elif predicate.endswith('email'):
            users[subject]["email"] = obj.strip('"')

    # Build one INSERT statement per user. Values are interpolated directly for
    # simplicity; for production data, prefer parameterized queries to avoid
    # quoting and injection issues.
    insert_statements = []
    for user_id, attributes in users.items():
        if all(attributes.values()):  # Ensure all attributes are present
            insert_statement = (
                f"INSERT INTO users (user_id, first_name, last_name, email) "
                f"VALUES ('{user_id}', '{attributes['first_name']}', "
                f"'{attributes['last_name']}', '{attributes['email']}');"
            )
            insert_statements.append(insert_statement)

    return insert_statements

bucket_name = 'test-bucket'
key = 'neptune-export/users.ttl'

insert_statements = transform_data(bucket_name, key)

# Write the generated INSERT statements to a new S3 object.
output_key = 'neptune-export/insert_statements.sql'
s3.put_object(Bucket=bucket_name, Key=output_key, Body='\n'.join(insert_statements))

print(f"Transformed {len(insert_statements)} insert statements.")


2. Run the Glue Job

  • Execute the Glue job. This will read the Turtle files from S3, transform the data into SQL INSERT statements, and save these statements to another file in S3.
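
Before moving on to Step 3, a quick spot check of the generated file can catch quoting or parsing issues early. A minimal peek at the first few statements, assuming the same bucket and key the script above wrote to:

Python

import boto3

# Read back the first few generated INSERT statements for a quick spot check.
s3 = boto3.client('s3')
obj = s3.get_object(Bucket='test-bucket', Key='neptune-export/insert_statements.sql')
for line in obj['Body'].read().decode('utf-8').splitlines()[:5]:
    print(line)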

Step 3: Load Data Into PostgreSQL

1. Create Another Glue Job To Load Data

  • Create a new Glue job to read the SQL INSERT statements from the S3 bucket and execute them on PostgreSQL.
Python
 
import psycopg2
import boto3

s3 = boto3.client('s3')

def execute_statements(bucket_name, key, db_params):
    """Read the generated INSERT statements from S3 and run them against PostgreSQL."""
    response = s3.get_object(Bucket=bucket_name, Key=key)
    statements = response['Body'].read().decode('utf-8').splitlines()

    # Connect to PostgreSQL. The Glue job needs network access to the database,
    # e.g., via a Glue connection in the same VPC as the instance.
    conn = psycopg2.connect(**db_params)
    cur = conn.cursor()

    count = 0
    for statement in statements:
        if not statement.strip():
            continue  # skip empty lines
        cur.execute(statement)
        count += 1

    # Commit once after all statements so the load is applied in a single transaction.
    conn.commit()
    cur.close()
    conn.close()

    print(f"Executed {count} insert statements.")

bucket_name = 'test-bucket'
key = 'neptune-export/insert_statements.sql'

# Placeholder connection parameters; replace them with your instance details.
db_params = {
    'dbname': 'test_db',
    'user': 'db_user',
    'password': 'db_password',
    'host': 'db_host',
    'port': 'db_port'
}

execute_statements(bucket_name, key, db_params)


2. Run the Glue Job

  • Execute this Glue job. It will read the SQL INSERT statements from S3 and execute them on PostgreSQL, effectively migrating the data.
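
As a final sanity check, you can count the rows that reached the users table and compare the result against the count reported by the transform job. The snippet below reuses the same placeholder connection parameters as the load job.

Python

import psycopg2

# Count the rows now present in the target table. Connection values are the
# same placeholders used in the load job.
conn = psycopg2.connect(dbname='test_db', user='db_user',
                        password='db_password', host='db_host', port=5432)
with conn, conn.cursor() as cur:
    cur.execute("SELECT COUNT(*) FROM users;")
    print(f"users rows loaded: {cur.fetchone()[0]}")
conn.close()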

Conclusion

By following these steps, you can efficiently migrate data from Amazon Neptune to PostgreSQL using AWS services. This process leverages the AWS Neptune export service, S3 for storage, and AWS Glue for data transformation and loading. With this approach, you can ensure a smooth and automated data migration workflow.
