Build Serverless Applications Using Rust on AWS Lambda

Build AWS Lambda functions in Rust for ultra-fast cold starts, minimal memory usage, and reliable, type-safe code at a lower cost.

By Eric Jonathan · Feb. 12, 25 · Tutorial

Serverless computing has changed how teams build applications that scale effortlessly. But here’s the catch: popular runtimes like Node.js and Python often suffer from slow cold starts, heavy memory use, and uneven performance. That’s where Rust shines. Built for speed and reliability without the bulk, it’s quickly becoming the secret weapon for serverless setups.

In this walkthrough, we’ll build and deploy serverless functions using Rust on AWS Lambda.

Why Rust for AWS Lambda?

Blazing-Fast Cold Starts

AWS Lambda cold starts — the delay when a function initializes — are a critical performance bottleneck. Unlike interpreted languages (e.g., Python), Rust compiles to machine-native binaries, eliminating interpreter startup overhead. Combined with Rust’s lack of a garbage collector (GC), this can result in cold starts as low as 50–75 ms, even for complex functions.

Memory Safety Without Compromise

Rust’s ownership model guarantees memory safety at compile time, preventing common vulnerabilities like buffer overflows. This is critical for serverless, where functions often process untrusted input (e.g., data from an API Gateway).

Tiny Binaries, Lower Costs

Rust binaries are often just 5–10 MB when optimized, compared to 50–100 MB for equivalent Node.js or Python deployments. Smaller binaries mean:

  • Faster deployment times
  • Reduced memory usage (leading to lower AWS Lambda costs)
  • An easier fit within strict deployment-package size limits

Async-First Concurrency

Rust’s async/await syntax, paired with runtimes like Tokio, enables non-blocking I/O operations. This is ideal for serverless functions handling concurrent API requests or database queries.

Setting Up Rust for AWS Lambda

Install the Rust Toolchain

Start by installing Rust and the AWS Lambda-specific tools:

Plain Text
 
# Install Rust + Cargo
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Optional: switch to the nightly toolchain for certain optimizations
rustup default nightly

# Add the musl target for statically linked Linux binaries
rustup target add x86_64-unknown-linux-musl

# Install cargo-lambda
cargo install cargo-lambda


Create a New Lambda Project

Use cargo-lambda to scaffold a new function:

Plain Text
 
cargo lambda new my-lambda-function


Make sure Cargo.toml includes these essential dependencies:

TOML
 
[dependencies]
lambda_runtime = "0.8"
tokio = { version = "1.0", features = ["macros"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"


Write a Basic Handler

Replace src/main.rs with a Lambda function that processes JSON input:

Rust
 
use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Error> {
    lambda_runtime::run(service_fn(handler)).await
}

async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let name = event.payload["name"].as_str().unwrap_or("World");
    Ok(json!({ "message": format!("Hello, {}!", name) }))
}


Key Components

  • #[tokio::main] – configures the async runtime
  • service_fn – wraps the async handler so the Lambda runtime can invoke it
  • serde_json – parses and serializes JSON payloads
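
Because serde’s derive feature is already in the dependency list, you can also model payloads with typed structs instead of raw Value values. The following is a minimal sketch under that assumption; the Request and Response types are illustrative and not part of the generated template.

Rust
 
use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde::{Deserialize, Serialize};

// Illustrative request/response shapes for this sketch.
#[derive(Deserialize)]
struct Request {
    name: Option<String>,
}

#[derive(Serialize)]
struct Response {
    message: String,
}

async fn typed_handler(event: LambdaEvent<Request>) -> Result<Response, Error> {
    let name = event.payload.name.unwrap_or_else(|| "World".to_string());
    Ok(Response {
        message: format!("Hello, {}!", name),
    })
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    lambda_runtime::run(service_fn(typed_handler)).await
}

With typed payloads, malformed input is rejected during deserialization instead of being checked field by field inside the handler.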

Code Snippets With Expected Outputs

Below, you’ll see additional code examples illustrating structured logging, error handling, and Terraform deployment, each paired with expected inputs and outputs.

Basic Lambda Handler (Extended Example)

Rust
 
async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let name = event.payload["name"].as_str().unwrap_or("World");
    Ok(json!({ "message": format!("Hello, {}!", name) }))
}


Input

JSON
 
{ "name": "Alice" }


Output

JSON
 
{ "message": "Hello, Alice!" }


Input (No Name)

JSON
 
{}


Output

JSON
 
{ "message": "Hello, World!" }


Structured Logging With Tracing

Rust
 
use tracing::{info, Level};
use tracing_subscriber::FmtSubscriber;

fn main() {
    let subscriber = FmtSubscriber::builder()
        .with_max_level(Level::INFO)
        .finish();
    tracing::subscriber::set_global_default(subscriber).unwrap();

    info!("Lambda initialized");
    // ...
}


CloudWatch Log Output

Plain Text
 
2023-10-05T12:34:56Z INFO my_lambda_function Lambda initialized


Logs appear in AWS CloudWatch, queryable via CloudWatch Insights for deeper analysis.
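
In a Lambda binary, this initialization belongs at the top of the async main shown earlier, before the runtime starts polling for events. Here is a minimal sketch, assuming tracing and tracing-subscriber have been added to Cargo.toml:

Rust
 
use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde_json::{json, Value};
use tracing::info;

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Install the subscriber once per process; warm invocations reuse it.
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::INFO)
        .init();

    info!("Lambda initialized");
    lambda_runtime::run(service_fn(handler)).await
}

async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    info!("handling request");
    let name = event.payload["name"].as_str().unwrap_or("World");
    Ok(json!({ "message": format!("Hello, {}!", name) }))
}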

Error Handling With thiserror

Rust
 
#[derive(thiserror::Error, Debug)]
enum LambdaError {
    #[error("Missing field: {0}")]
    MissingField(String),
    #[error(transparent)]
    SerdeJson(#[from] serde_json::Error),
}

async fn handler(event: LambdaEvent<Value>) -> Result<Value, LambdaError> {
    let name = event.payload["name"]
        .as_str()
        .ok_or(LambdaError::MissingField("name".into()))?;
    Ok(json!({ "message": format!("Hello, {}!", name) }))
}


Input (Missing Name)

JSON
 
{ "age": 30 }


Output (Error)

JSON
 
{
  "errorMessage": "Missing field: name",
  "errorType": "LambdaError"
}
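
Wiring this fallible handler into the entry point looks the same as before; a minimal sketch (the variant’s Display message is what appears as errorMessage above):

Rust
 
use lambda_runtime::{service_fn, Error};

#[tokio::main]
async fn main() -> Result<(), Error> {
    // The runtime converts the handler's LambdaError into the error response shown above.
    lambda_runtime::run(service_fn(handler)).await
}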


AWS Deployment

The following Terraform configuration (JSON syntax) provisions the function and its execution role:

JSON
 
{
  "resource": {
    "aws_lambda_function": {
      "rust_lambda": {
        "function_name": "rust-serverless",
        "runtime": "provided.al2",
        "handler": "bootstrap",
        "filename": "target/lambda/my-lambda-function/bootstrap.zip",
        "role": "${aws_iam_role.lambda_exec.arn}",
        "memory_size": 128,
        "timeout": 10
      }
    },
    "aws_iam_role": {
      "lambda_exec": {
        "name": "rust-lambda-role",
        "assume_role_policy": {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Action": "sts:AssumeRole",
              "Effect": "Allow",
              "Principal": {
                "Service": "lambda.amazonaws.com"
              }
            }
          ]
        }
      }
    }
  }
}


Output After Running terraform apply

Plain Text
 
aws_iam_role.lambda_exec: Creating...
aws_iam_role.lambda_exec: Creation complete after 2s
aws_lambda_function.rust_lambda: Creating...
aws_lambda_function.rust_lambda: Creation complete after 5s
Apply complete! Resources: 2 added, 0 changed, 0 destroyed.


Outputs

Plain Text
 
lambda_arn = "arn:aws:lambda:us-east-1:123456789012:function:rust-serverless"


Optimizing Rust for AWS Lambda

Reduce Binary Size

AWS Lambda bills for allocated memory and execution time, so lean, fast-loading binaries help keep both costs and cold starts down:

Plain Text
 
# Compile with musl for static linking
cargo lambda build --release --target x86_64-unknown-linux-musl

# Strip debug symbols (saves ~30% size)
strip target/lambda/my-lambda-function/bootstrap


Pro tip: Use cargo udeps to audit unused dependencies.

Cold Start Mitigation

  • Precompiled binaries. The x86_64-unknown-linux-musl target ensures compatibility with AWS Lambda’s Amazon Linux 2 environment.
  • Provisioned concurrency. Pre-initialize Lambda instances via the AWS Console, Terraform, or CloudFormation to reduce cold starts for high-traffic functions.

Async Best Practices

Rust’s async runtime (Tokio) helps you run multiple I/O-bound tasks concurrently.

Rust
 
use lambda_runtime::Error;

// Requires the aws-config and aws-sdk-s3 crates in Cargo.toml.
async fn fetch_s3_object(bucket: &str, key: &str) -> Result<Vec<u8>, Error> {
    let client = aws_sdk_s3::Client::new(&aws_config::load_from_env().await);
    let resp = client.get_object().bucket(bucket).key(key).send().await?;
    let data = resp.body.collect().await?;
    Ok(data.into_bytes().to_vec())
}


Use concurrency to fetch data from multiple sources without blocking the main thread.

Observability and Debugging

  • Structured logging. Already shown above with the tracing crate.
  • Error handling. thiserror for typed errors that help you quickly pinpoint issues in logs or metrics.
  • AWS X-Ray. Consider X-Ray for advanced tracing if you need deeper visibility into call chains, especially across microservices.

Advanced Optimization Example

Fetching S3 Data Concurrently

Rust
 
async fn fetch_s3_object(bucket: &str, key: &str) -> Result<Vec<u8>, Error> {
    let client = aws_sdk_s3::Client::new(&aws_config::load_from_env().await);
    let resp = client.get_object().bucket(bucket).key(key).send().await?;
    let data = resp.body.collect().await?;
    Ok(data.into_bytes().to_vec())
}


Input (to a handler wrapping fetch_s3_object)

JSON
 
{ "bucket": "my-bucket", "key": "data.json" }


Output

JSON
 
{
  "content": "<base64_encoded_data>",
  "metadata": { "last_modified": "2023-10-05T12:34:56Z" }
}


You can initiate multiple fetch_s3_object calls concurrently using tokio::join!, slashing overall execution time.
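
For example, two objects can be fetched in parallel from a single await point. This is a minimal sketch reusing the fetch_s3_object helper above; the bucket and key names are placeholders.

Rust
 
use lambda_runtime::Error;

async fn load_inputs() -> Result<(Vec<u8>, Vec<u8>), Error> {
    // Both requests are in flight at the same time; join! waits for the pair to finish.
    let (config, data) = tokio::join!(
        fetch_s3_object("my-bucket", "config.json"),
        fetch_s3_object("my-bucket", "data.json")
    );
    Ok((config?, data?))
}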

Final Deployment Workflow

Build

Plain Text
 
cargo lambda build --release --target x86_64-unknown-linux-musl
strip target/lambda/my-lambda-function/bootstrap


Deploy

Plain Text
 
terraform apply -auto-approve


If you also define an aws_lambda_function_url resource, you can expose the function publicly over HTTPS once the apply step completes.

Invoke

Plain Text
 
aws lambda invoke \
  --function-name rust-serverless \
  --payload '{"name":"Alice"}' output.json


Response (output.json)

JSON
 
{ "message": "Hello, Alice!" }


Conclusion

Rust’s combination of speed, safety, and efficiency makes it ideal for serverless computing. By leveraging tools like cargo-lambda, tokio, and Terraform, you can deploy production-ready functions that outperform traditional runtimes in cold starts, memory usage, and overall cost.

Next Steps

  • Explore AWS Lambda Extensions for secrets management and advanced logging.
  • Integrate with AWS SQS or EventBridge for event-driven architectures.
  • Benchmark your own functions using AWS X-Ray to visualize call traces.

By adopting Rust for serverless, you’re not just optimizing performance — you’re future-proofing your architecture for the next wave of modern, scalable applications.

Further Reading

  • AWS Lambda Rust Runtime GitHub
  • A Guide to AWS Software Development
  • Rust Documentation