
Serverless NLP: Implementing Sentiment Analysis Using Serverless Technologies

Learn how to build a serverless application to perform sentiment analysis using AWS Lambda, API Gateway, and NLTK's VADER library.

By Vamsi Kavuri · Oct. 21, 24 · Tutorial

In this article, I will discuss building a sentiment analysis tool using AWS serverless capabilities and NLTK. I will use AWS Lambda to run sentiment analysis with NLTK's VADER library and AWS API Gateway to expose this functionality as an API. This architecture eliminates server management while providing on-demand scalability and cost efficiency.

Before we dive in, ensure that you have the following:

  1. Python 3.10 or higher
  2. An AWS account
  3. AWS CLI installed and configured

Step-by-Step Implementation

  • First, we'll create a directory for our Lambda package and install NLTK and the required dependencies.
Shell
 
mkdir serverless_sentiment
cd serverless_sentiment
pip install nltk -t ./ 


  • To add the dependencies, open the Python interpreter, download the required NLTK data, and copy it into an nltk_data folder inside the Lambda package.
Shell
 
python
>>> import nltk
>>> nltk.download('punkt')
>>> nltk.download('vader_lexicon')
>>> exit()

# nltk.download() saves to ~/nltk_data by default; copy it into the package
cp -R ~/nltk_data ./nltk_data


  • Create a file named lambda_function.py with the following code:
Python
 
import json
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from nltk.tokenize import word_tokenize


intensity_analyzer = SentimentIntensityAnalyzer()


def find_overall_sentiment(score):
    if score['compound'] >= 0.05:
        return '+Positive+'
    elif score['compound'] <= -0.05:
        return '-Negative-'
    else:
        return '*Neutral*'


def lambda_handler(event, context):
    # Parse the incoming request
    body = json.loads(event['body'])
    input_text = body['text']

    # Tokenize words and remove any punctuation
    tokens = word_tokenize(input_text.lower())
    tokens = [token for token in tokens if token.isalpha()]
    processed_tokens = ' '.join(tokens)

    # Run sentiment analysis on the processed text
    sentiment_scores = intensity_analyzer.polarity_scores(processed_tokens)

    # Evaluate the score to identify the overall sentiment
    overall_sentiment = find_overall_sentiment(sentiment_scores)

    # Prepare and send the response
    return {
        'statusCode': 200,
        'body': json.dumps({
            'requested_text': input_text,
            'sentiment_scores': sentiment_scores,
            'sentiment': overall_sentiment
        }),
        'headers': {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        }
    }


This function parses the "text" attribute from the request body and runs sentiment analysis on it.
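Before deploying, it helps to sanity-check the handler's request/response contract locally. The sketch below simulates an API Gateway proxy event and swaps a hypothetical stub score in place of VADER so it runs without the NLTK data files; the stub_compound parameter and handle wrapper are illustrative, not part of the deployed code.

```python
import json


def find_overall_sentiment(score):
    # Same thresholds as the Lambda: VADER's compound score maps to a label
    if score['compound'] >= 0.05:
        return '+Positive+'
    elif score['compound'] <= -0.05:
        return '-Negative-'
    return '*Neutral*'


def handle(event, stub_compound):
    # Mirrors lambda_handler's contract, with the analyzer stubbed out
    body = json.loads(event['body'])
    scores = {'compound': stub_compound}
    return {
        'statusCode': 200,
        'body': json.dumps({
            'requested_text': body['text'],
            'sentiment_scores': scores,
            'sentiment': find_overall_sentiment(scores)
        })
    }


event = {'body': json.dumps({'text': 'I love this product'})}
resp = handle(event, stub_compound=0.8)
print(json.loads(resp['body'])['sentiment'])  # +Positive+
```

This verifies the JSON shape the API Gateway integration will pass through before you pay for a round trip to AWS.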

  • Package the Lambda code, upload it to an Amazon S3 bucket, and create the function from the uploaded package.
Shell
 
zip -r ../deploymentPackage.zip .
aws s3 mb s3://serverless-sentiment
aws s3 cp ../deploymentPackage.zip s3://serverless-sentiment

aws lambda create-function \
 --function-name serverless_sentiment \
 --runtime python3.10 \
 --role <REPLACE_THIS_WITH_LAMBDA_EXECUTION_ROLE> \
 --handler lambda_function.lambda_handler \
 --code S3Bucket=serverless-sentiment,S3Key=deploymentPackage.zip \
 --environment Variables={NLTK_DATA=./nltk_data}


Note: Make sure you set the NLTK_DATA environment variable; without it, NLTK cannot find the bundled data files at runtime. Also note that S3 bucket names must be globally unique and cannot contain underscores.

Now that your Lambda function is deployed and ready to execute, let's set up an API using AWS API Gateway to expose it.

  • Create a REST endpoint.
Shell
 
# Create a new API
aws apigateway create-rest-api --name 'serverless_sentiment' --description 'serverless sentiment analysis API'

# Preserve internal IDs for the following commands
API_ID=$(aws apigateway get-rest-apis --query "items[?name=='serverless_sentiment'].id" --output text)
PARENT_RES_ID=$(aws apigateway get-resources --rest-api-id $API_ID --query "items[0].id" --output text)

# Add a resource for the newly created API and capture its ID
RES_ID=$(aws apigateway create-resource --rest-api-id $API_ID --parent-id $PARENT_RES_ID --path-part sentiment --region us-east-1 --query 'id' --output text)

# Add a POST method
aws apigateway put-method --rest-api-id $API_ID --resource-id $RES_ID --http-method POST --authorization-type NONE


  • Next, integrate the API method with our Lambda function.
Shell
 
aws apigateway put-integration --rest-api-id $API_ID \
 --resource-id $RES_ID \
 --http-method POST \
 --type AWS \
 --integration-http-method POST \
 --uri <REPLACE_ME_WITH_LAMBDA_URI>

# Set the response type as JSON
aws apigateway put-method-response --rest-api-id $API_ID --resource-id $RES_ID --http-method POST --status-code 200 --response-models "{}"
aws apigateway put-integration-response --rest-api-id $API_ID --resource-id $RES_ID --http-method POST --status-code 200 --selection-pattern ".*"

  • Finally, deploy your API and add the necessary permissions to invoke the Lambda directly from the API.
Shell
 
# Deploy API 
aws apigateway create-deployment --rest-api-id $API_ID --stage-name prod

# Add necessary permissions
aws lambda add-permission --function-name serverless_sentiment --statement-id apigateway-serverless-sentiment-prod --action lambda:InvokeFunction --principal apigateway.amazonaws.com --source-arn "arn:aws:execute-api:$REGION:$ACCOUNT:$API_ID/*/POST/sentiment"


You can test your solution using curl or Postman by sending a POST request to your API Gateway endpoint with the text you want to analyze in the body.
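A quick client can also be sketched with Python's standard library; the invoke URL below is a placeholder, so substitute your own API ID, region, and stage.

```python
import json
import urllib.request

# Placeholder -- replace with your deployed API Gateway invoke URL
API_URL = "https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/sentiment"


def analyze(text, url=API_URL):
    # POST the text to the endpoint and decode the JSON response
    payload = json.dumps({'text': text}).encode('utf-8')
    req = urllib.request.Request(
        url, data=payload,
        headers={'Content-Type': 'application/json'})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# analyze("The service was fast and friendly")
# The decoded response carries requested_text, sentiment_scores, and sentiment
```

The Access-Control-Allow-Origin header returned by the Lambda also lets you call this endpoint directly from browser-based clients.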

Conclusion

We have successfully built a serverless sentiment analysis API that any system can consume. It handles varying loads efficiently because AWS Lambda scales automatically with incoming requests, and the same setup can be extended to other NLP tasks such as text classification and entity recognition.

API AWS AWS Lambda NLP Sentiment analysis

Opinions expressed by DZone contributors are their own.
