

Spring Webflux Multipart File Upload and Reading Each Line Without Saving It

Take a look at this tutorial that demonstrates how to use Spring Webflux to upload a file and read each of its lines without saving it.

By Eaiman Shoshi · Apr. 21, 2020 · Tutorial

I’ve been working on Spring Webflux for a while. And in my experience, uploading and reading files in this framework is quite a hassle.

Today I am going to talk about uploading a file using Spring Webflux. The most interesting part is that I am not going to save the file; I will read it on the fly. I will also check whether every line of the file matches my RegEx criteria using the powerful Java Stream API.

The Real-Life Problem I've Faced

The issue I've faced: upload any type of file, with the condition that the lines of the file are separated by newlines. There is no way you can save the file on the server. Instead, create a list of Strings by reading the file, where each item of the list is a single line of the file. Every item must match a validation rule; otherwise, you'll have to discard the whole file as corrupted. So, the summary is: upload -> read -> check -> list of Strings from the file, without saving it.

So, the rough steps are:

  • A controller to consume the multipart file as Flux<FilePart>
  • Converting the file parts into a Flux of String using DataBuffer
  • Collecting all the data-part strings and processing them
  • Checking validity using the Java Stream API and RegEx
  • And tons of magic

Sound scary? Well, I will explain it to you step by step. So, what are we waiting for? Let’s dig in.

Controller


Java

// use Flux<FilePart> for multiple file upload
@PostMapping(value = "/upload-flux", consumes = MediaType.MULTIPART_FORM_DATA_VALUE, produces = MediaType.APPLICATION_STREAM_JSON_VALUE)
@ResponseStatus(value = HttpStatus.OK)
public Flux<String> upload(@RequestPart("files") Flux<FilePart> filePartFlux) {
    return uploadService.getLines(filePartFlux);
}



This part is easy. This is a POST endpoint that can accept multiple files. The URL path is /upload-flux, and it must use consumes = MediaType.MULTIPART_FORM_DATA_VALUE. As we can see, I have used:

Java

public Flux<String> upload(@RequestPart("files") Flux<FilePart> filePartFlux)


Here, the files part of the request will be automatically injected as Flux<FilePart> into the method by Spring Webflux.

Remember:
1. To upload multiple files, you must use Flux<FilePart>.
2. To upload a single file, you must use Mono<FilePart> or FilePart.
3. Mono<MultiValueMap<String, Part>> can be used for both cases, but then you have to look up the FilePart(s) from the map by key. In this tutorial, the key is files for both single and multiple files.

For this tutorial, I am going to use Flux<FilePart>.

Service

From the controller layer, filePartFlux is now passed to the service layer. I have divided the work of this service into two methods. Let’s try to understand these methods one by one.

First Method

Java

public Flux<String> getLines(Flux<FilePart> filePartFlux) {
    return filePartFlux.flatMap(filePart ->
            filePart.content().map(dataBuffer -> {
                byte[] bytes = new byte[dataBuffer.readableByteCount()];
                dataBuffer.read(bytes);
                DataBufferUtils.release(dataBuffer);
                return new String(bytes, StandardCharsets.UTF_8);
            }))
            .map(this::processAndGetLinesAsList)
            .flatMapIterable(Function.identity());
}



In this method, the filePartFlux is passed directly from the controller layer. Then we flatMap it to get a new Flux<String> stream.

Java

filePartFlux.flatMap(filePart ->
    filePart.content().map(dataBuffer -> {
        byte[] bytes = new byte[dataBuffer.readableByteCount()];
        dataBuffer.read(bytes);
        DataBufferUtils.release(dataBuffer);
        return new String(bytes, StandardCharsets.UTF_8);
    }))


filePartFlux will emit each FilePart into the flatMap. We then access the content of the FilePart and map it to create a Flux of String. Inside the map, we get the dataBuffer emitted from content(). Keep in mind that only a certain number of bytes are readable from this dataBuffer, so we create a byte array bytes with length dataBuffer.readableByteCount().

Then we fill the bytes array by reading data from the dataBuffer with dataBuffer.read(bytes), free the dataBuffer by releasing it with DataBufferUtils.release(dataBuffer), and finally convert the bytes into a String and return it. When this whole process completes, we have a new Flux<String> stream. Now let's see the rest of the method.

Java

.map(this::processAndGetLinesAsList)
.flatMapIterable(Function.identity());


Now, every String from the Flux<String> stream is processed via the processAndGetLinesAsList method (described in the next section), which produces a List<String> per chunk; flatMapIterable(Function.identity()) then flattens those lists into another Flux<String> stream. If the validation check finds the file corrupted, an empty Flux<String> is returned from here.
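As a plain-Java analogy for what flatMapIterable does here, flattening a stream of lists into a stream of their elements looks like this (the class name and sample data below are made up for illustration):

```java
import java.util.List;
import java.util.stream.Collectors;

public class FlattenDemo {
    // Flattens a list of lists, the way flatMapIterable(Function.identity())
    // flattens each emitted List<String> into individual String elements
    public static List<String> flatten(List<List<String>> parts) {
        return parts.stream()
                .flatMap(List::stream)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // two chunks of lines become one flat stream of lines
        System.out.println(flatten(List.of(List.of("a", "b"), List.of("c"))));
    }
}
```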

Second Method

Java

private List<String> processAndGetLinesAsList(String string) {
    Supplier<Stream<String>> streamSupplier = string::lines;
    var isFileOk = streamSupplier.get()
            .allMatch(line -> line.matches(MultipartFileUploadUtils.REGEX_RULES));

    return isFileOk ? streamSupplier.get()
                          .filter(s -> !s.isBlank())
                          .collect(Collectors.toList())
                    : new ArrayList<>();
}



This is not as scary as it looks. Just read it and translate it as it is written here.

In this method, we have added some validation over our data. To do that, we first split each string into lines with string::lines.

At this point, you might ask why we are doing this. Well, we need every line from the file. But since we get this string value from FilePart and DataBuffer, there is no guarantee that each String in the Flux stream corresponds to a single line of the file: the file is read part by part, and a String is generated from each part, so a single String may contain multiple lines of the file.

So, what we have done here is create a Supplier that supplies a Stream of Strings. The method reference string::lines splits the string and produces a fresh Stream each time the Supplier is invoked.
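A quick standalone sketch of why the Supplier matters: a Stream can only be traversed once, but string::lines hands out a fresh Stream on every get() (the class name and sample text here are made up):

```java
import java.util.function.Supplier;
import java.util.stream.Stream;

public class LinesSupplierDemo {
    // Counts the lines of the chunk twice via the same Supplier
    public static long[] countTwice(String chunk) {
        Supplier<Stream<String>> streamSupplier = chunk::lines;
        // Each get() re-splits the string into a fresh Stream, so the
        // second traversal does not throw IllegalStateException
        return new long[] { streamSupplier.get().count(), streamSupplier.get().count() };
    }

    public static void main(String[] args) {
        long[] counts = countTwice("line1\nline2\nline3");
        System.out.println(counts[0] + " " + counts[1]); // 3 3
    }
}
```

This is exactly why the article's method calls streamSupplier.get() twice: once for validation and once for collecting.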

The next statement is our validation checkpoint. Here we check every String of the stream (which effectively means every line of the uploaded file) against our RegEx rules using Java's Stream API.

Java

streamSupplier.get().allMatch(line -> line.matches(Util.YOUR_REGEX))


In short form, this is equivalent to:

Java

stream.allMatch(value -> condition)


And this will return true only if every value in the stream meets the condition. Mind-blowing, isn't it?
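To make that concrete, here is a minimal standalone example (the [a-z]+ pattern is a made-up stand-in for the real RegEx rules):

```java
import java.util.stream.Stream;

public class AllMatchDemo {
    // true only if every line matches the pattern; allMatch short-circuits
    // on the first non-matching element
    public static boolean allLowercase(Stream<String> lines) {
        return lines.allMatch(line -> line.matches("[a-z]+"));
    }

    public static void main(String[] args) {
        System.out.println(allLowercase(Stream.of("abc", "def"))); // true
        System.out.println(allLowercase(Stream.of("abc", "DEF"))); // false
    }
}
```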

So, in our code, if all is well, the stream is converted into a list and returned. Otherwise, the file's values violate our rules, which means the file is corrupted, and an empty list is returned.
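The same validate-then-collect step can be tried outside of Webflux as plain Java; the [a-z]* pattern below is a hypothetical stand-in for MultipartFileUploadUtils.REGEX_RULES:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ValidateLinesDemo {
    // Hypothetical stand-in for MultipartFileUploadUtils.REGEX_RULES
    private static final String REGEX_RULES = "[a-z]*";

    public static List<String> processAndGetLinesAsList(String string) {
        Supplier<Stream<String>> streamSupplier = string::lines;
        boolean isFileOk = streamSupplier.get()
                .allMatch(line -> line.matches(REGEX_RULES));
        // Valid file: keep the non-blank lines; corrupted file: empty list
        return isFileOk ? streamSupplier.get()
                                .filter(s -> !s.isBlank())
                                .collect(Collectors.toList())
                        : new ArrayList<>();
    }

    public static void main(String[] args) {
        System.out.println(processAndGetLinesAsList("abc\ndef"));
        System.out.println(processAndGetLinesAsList("abc\n123"));
    }
}
```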


And that’s it, the Flux of String is returned all the way through to the client. Enjoy the List of String. 

But wait, why on earth would anybody want to upload a file through a REST API and get the lines of the file back as a response? That doesn't make sense, right? Maybe you want to trigger some other operation with this list of Strings. Or one could publish these Strings to a message broker queue. Or what if one wants to save this tremendous number of lines in a database like DynamoDB?

Well, that's a story for another step-by-step tutorial.

Please share and leave a comment for any questions or feedback.
To see the full tutorial in action with much more ingredients, browse the project on GitHub:
https://github.com/eaiman-shoshi/MultipartFileUpload


Published at DZone with permission of Eaiman Shoshi. See the original article here.

Opinions expressed by DZone contributors are their own.
