Upload Files to AWS S3 in k6

k6 recently announced its next iteration with many new features and fixes. In this blog post, we are going to see how to upload files to AWS S3 in k6.

By NaveenKumar Namachivayam · Updated May 27, 2022 · Tutorial

In my last post, we discussed how to upload files to AWS S3 in JMeter using Groovy, and we also looked at the Grape package manager in JMeter. Recently, k6 released its next iteration with many new features and fixes. In this blog post, we will see how to upload files to AWS S3 in k6.

What's New in k6 0.38.1?

k6 0.38.1 is a minor patch release. To view all the new features, check out the k6 0.38.0 release tag. The following are the noteworthy features in 0.38.0.

  • AWS JSLib
  • Tagging metric values
  • Dumping SSL keys to an NSS formatted key log file
  • Accessing the consolidated and derived options from the default function

AWS JSLib Features

Amazon Web Services is one of the most widely used public cloud platforms. AWS JSLib is packed with three modules (as of this writing):

  • S3 Client
  • Secrets Manager Client
  • AWS Config

S3 Client

The S3Client module helps list S3 buckets and create, upload, and delete S3 objects. To access AWS services, each client (in this case, S3Client) uses AWS credentials such as the Access Key ID and Secret Access Key. The credentials must have the required privileges to perform the tasks; otherwise, the requests will fail. A minimal example policy is sketched below.
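
As an illustration (this is my own sketch, not a policy from the k6 docs), a minimal IAM policy covering just the operations used in this post might look like the following; k6test is the example bucket name used later, so scope the resources to your own bucket:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::k6test"
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::k6test/*"
        }
    ]
}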

Prerequisites to Upload Objects to S3

The following are the prerequisites to upload objects to S3 in k6.

  • The latest version of k6 (0.38.0 or above)
  • An AWS Access Key ID and Secret Access Key with the relevant S3 permissions
  • Basic knowledge of k6

How to Upload Files to S3?

Create a new k6 script in your favorite editor and name it uploadFiletoS3.js. The first step in uploading files to S3 is to add the following imports to your k6 script.

 
import exec from 'k6/execution'

import {
    AWSConfig,
    S3Client,
} from 'https://jslib.k6.io/aws/0.3.0/s3.js'


The next step is to read the AWS credentials from the command line. Hard-coding secrets into the script is not recommended, as it raises a security concern; it is easy to pass the values into the k6 script as environment variables instead. To pass AWS configuration such as the region, access key, and secret, use the AWSConfig object as shown below.

 
const awsConfig = new AWSConfig(
    __ENV.AWS_REGION,
    __ENV.AWS_ACCESS_KEY_ID,
    __ENV.AWS_SECRET_ACCESS_KEY
)


The next step is to create an S3 client by passing the AWSConfig object into it, using the code below.

 
const s3 = new S3Client(awsConfig);


The S3Client instance s3 has the following methods (a short usage sketch follows the list):

  • listBuckets()
  • listObjects(bucketName, [prefix])
  • getObject(bucketName, objectKey)
  • putObject(bucketName, objectKey, data)
  • deleteObject(bucketName, objectKey)
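
As a quick sketch of how these methods are called, building on the s3 client created above and assuming the object shapes documented for this library version (each listed object exposes key, each fetched object exposes data); 'k6test' and 'test.txt' are simply the example names used later in this post:

// List object keys in a bucket, optionally filtered by prefix
const objects = s3.listObjects('k6test', 'test')
for (const obj of objects) {
    console.log(obj.key)
}

// Fetch an object back, inspect its content, then delete it
const file = s3.getObject('k6test', 'test.txt')
console.log(file.data)
s3.deleteObject('k6test', 'test.txt')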

Now that we know the s3 methods, the next step is to create a dummy file to upload. Create a file with sample contents and save it in the current directory, e.g., test.txt.

After creating a dummy file to upload, we need to load that file into the script using the open() method. Copy and paste the code below:

 
const data = open('test.txt', 'r')
const testBucketName = 'k6test'
const testFileKey = 'test.txt'


The open() method reads the contents of a file and loads them into memory for use in the script. It takes two arguments: the file path and the mode. By default, it reads the file as text (r); to read it as binary, use b.

Note that open() works only in the init context, i.e., at the top level of the script, outside the default function.
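
For example, if you wanted to upload a binary file instead, a minimal sketch (assuming a file named test.bin exists in the current directory) would be:

const binData = open('test.bin', 'b')           // read the file in binary mode, in the init context
s3.putObject(testBucketName, 'test.bin', binData)  // upload it inside the default function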

The variables data, testBucketName, and testFileKey above hold the data to upload, the S3 bucket name, and the file key, respectively.

The next step is to define the default function, which is the entry point k6 executes (the equivalent of main()). Let us begin by listing the buckets. The variable buckets below will hold an array containing each bucket object.

 
const buckets = s3.listBuckets()


Optionally, if you would like to loop through the buckets, use the code snippet below.

 
for (const bucket of buckets) {
    console.log(bucket.name);
}


Or you can use the filter() method as shown below.

 
buckets.filter(bucket => bucket.name === testBucketName)


Let us add a check for whether the bucket is present. If the bucket is present, the script will proceed to upload; otherwise, the execution will abort. Copy and paste the snippet below.

 
if (buckets.filter((bucket) => bucket.name === testBucketName).length == 0) {
        exec.test.abort()
}
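
As a side note, exec.test.abort() also accepts an optional reason string, which is printed when the test aborts and makes the failure easier to diagnose, e.g.:

exec.test.abort(`bucket ${testBucketName} not found`)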


The next step is to upload the object to S3 using the putObject() method.

 
s3.putObject(testBucketName, testFileKey, data)


Here is the final script.

 
import exec from "k6/execution";

import { AWSConfig, S3Client } from "https://jslib.k6.io/aws/0.3.0/s3.js";

const awsConfig = new AWSConfig(
  __ENV.AWS_REGION,
  __ENV.AWS_ACCESS_KEY_ID,
  __ENV.AWS_SECRET_ACCESS_KEY
);

const s3 = new S3Client(awsConfig);

const data = open("test.txt", "r");
const testBucketName = "k6test";
const testFileKey = "test.txt";

// main function
export default function () {
  const buckets = s3.listBuckets();


  if (buckets.filter((bucket) => bucket.name === testBucketName).length == 0) {
    exec.test.abort();
  }

  s3.putObject(testBucketName, testFileKey, data);
  console.log("Uploaded " + testFileKey + " to S3");

}


Save the above script and execute the below command.

 
k6 run -e AWS_REGION=ZZ-ZZZZ-Z -e AWS_ACCESS_KEY_ID=XXXXXXXXXXXXXXX -e AWS_SECRET_ACCESS_KEY=YYYYYYYYYYYYYYYYY uploadFiletoS3.js


To store the variables in PowerShell, you can use the Set-Variable command (repeating it for each of the three variables), e.g.:

 
Set-Variable -Name "AWS_REGION" -Value "us-east-2"


To execute, you can then use the command below.

 
k6 run -e AWS_REGION=$AWS_REGION -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY uploadFiletoS3.js


Navigate to the S3 console and go to the bucket to check the file object. You can download the file to verify the contents.

[Image: S3 Validation]
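
If you prefer to verify programmatically rather than through the console, you can read the object back and compare its content with what was uploaded. A minimal sketch using k6's built-in check (add import { check } from 'k6' at the top of the script, and place the rest inside the default function after the upload):

// Read the object back and verify its content matches what we uploaded
const uploaded = s3.getObject(testBucketName, testFileKey)
check(uploaded, {
    'uploaded content matches': (obj) => obj.data === data,
})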

Congratulations! You successfully uploaded the file to S3. If you would like to delete the file object, use the code snippet below.

 
s3.deleteObject(testBucketName, testFileKey)


To read the content back from the S3 bucket, you can use the snippet below.

 
const fileContent = s3.getObject(testBucketName, testFileKey);
console.log(fileContent.data);


Final Thoughts

The k6 AWS library is neatly designed around frequently used AWS services and methods. It supports the S3 Client, Secrets Manager Client, and AWS Config modules. Hopefully, the k6 team will add more services to help developers and performance engineers.


Published at DZone with permission of NaveenKumar Namachivayam, DZone MVB. See the original article here.

