
Establish Trust Chain From Kafka to Microservices REST APIs With JWT

By Gary Li · May. 06, 20 · Tutorial · 14.1K Views

Abstract

This article discusses an approach to propagating trust from Kafka event stream processing modules to microservice REST APIs with Spring Security and JSON Web Tokens (JWT), using Spring Boot applications built on the Spring Cloud Stream framework.

Apache Kafka is a distributed streaming platform used for building real-time data pipelines and streaming applications. It has become increasingly popular for event stream processing (ESP) thanks to its processing performance, data parallelism, distributed coordination, fault tolerance, and operational simplicity. It has been adopted and runs in production at thousands of companies, including LinkedIn, Microsoft, and Netflix, as well as top banks, insurers, and telecoms, processing trillions of messages each day.

Kafka has its own ecosystem; however, from a design-pattern perspective, it can be simply described as Message Queue + Pub-Sub. Please see my other article on the advantages of, and cautions around, the ESP integration pattern.

Microservices REST APIs are another popular enterprise integration pattern, supporting service composition, service clustering, and the request-response pattern for handling high concurrency. In some cases, however, people want to combine request-response APIs with event sequences where the order of updates matters. This raises the question of how to establish, share, or propagate security between services and events.

What Does Kafka Security Provide?

Kafka provides the following security measures, which focus on the message delivery channels, the connecting parties, and authorization of message producers and consumers.

  1. Authentication of connections to brokers from clients (producers and consumers), using either SSL or SASL:
    • SASL/GSSAPI (Kerberos)
    • SASL/PLAIN
    • SASL/SCRAM-SHA-256 and SASL/SCRAM-SHA-512
    • SASL/OAUTHBEARER (from version 2.0)
  2. Authentication of connections from brokers to ZooKeeper.
  3. Encryption of data transferred by using SSL.
  4. Authorization of read/write operations by clients.
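For reference, item 1 typically translates into client-side settings like the following. This is a sketch only: the property names come from the Kafka security documentation, while the truststore path, passwords, and the choice of the PLAIN mechanism are placeholder assumptions.

```properties
# Encrypt the connection and authenticate with SASL (items 1 and 3 above)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# Placeholder truststore used to verify the broker's certificate
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=changeit
# Placeholder credentials for the SASL/PLAIN mechanism
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="client" \
  password="client-secret";
```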

Concerns and Suggestions

In most infrastructure environments, it can be difficult to integrate Kafka's event security into an existing microservices service mesh to achieve single sign-on. Looking at microservice API security, the common options are LDAP/database basic authentication, digest authentication, API keys, cloud signatures, JWT tokens, OAuth 1.0/2.0, OpenID Connect, etc. None of them integrates easily with the Kafka security mechanisms mentioned above.

One way to pass user identity (authorization information) between microservices asynchronously is to keep Kafka security separate from the microservice service mesh. Using an API gateway to handle authentication and authorization, we can issue JWTs and use them for stateless API calls. We can then pass the JWT over an asynchronous secure channel established by Kafka with SSL/SASL security settings, chaining the security token along and validating it in the other microservices protected by JWT.

With this approach, the security token is shared among the microservices while the event streaming system (Kafka) stays decoupled from the microservice APIs. If the system needs to replace Kafka with another messaging system, such as ActiveMQ or RabbitMQ, only configuration changes are needed, and all the security implementation and setup in the service mesh will stay.

Application workflow

Spring for Apache Kafka and Spring Cloud Stream

This sample uses Spring Cloud Stream and spring-cloud-stream-binder-kafka to publish and consume events. Spring-cloud-stream-binder-kafka is a little different from Spring for Apache Kafka: it is more loosely coupled from Kafka's event streaming framework. It is a binding framework that lets application code communicate with remote brokers via message channels.

It ports an existing Kafka Streams workload into a standalone cloud-native application and can form a coherent data pipeline using Spring Cloud Data Flow. It also leverages the framework's content-type conversion for inbound and outbound messages, which saves a lot of data serialization and deserialization code. If we replace Kafka with RabbitMQ, the only change is the configuration in the pom.xml.
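To make that swap concrete, the Spring Cloud Stream documentation describes replacing the Kafka binder with the RabbitMQ binder as a dependency change in pom.xml. A sketch (version management via the Spring Cloud BOM is assumed):

```xml
<!-- Swap spring-cloud-stream-binder-kafka for the RabbitMQ binder -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-rabbit</artifactId>
</dependency>
```

The channel bindings in application.properties stay the same; only the binder underneath changes.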

Building a Prototype Sample With Spring Boot

In this sample, I will omit the SSL security setup for Kafka, which is easy to find on the kafka.apache.org website. Let's start ZooKeeper and Kafka in separate terminal windows:

.\zookeeper-server-start.bat C:\working\software\kafka25\config\zookeeper.properties

Starting Zookeeper

.\kafka-server-start.bat C:\working\software\kafka25\config\server.properties

Starting Kafka

As you can see, I have defined a topic named “TestTopic” for this sample:

.\kafka-topics.bat --list --zookeeper localhost:2181

TestTopic in Zookeeper
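If the topic does not exist yet, it can be created with the same CLI. This is a sketch for the Kafka 2.x tools used here (it requires the running ZooKeeper instance above; the replication factor and partition count are placeholder choices for a single-broker setup):

.\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic TestTopic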

In the Spring Boot project's application.properties file, the binding for the publisher's message output is defined as below. For the message consumer, just replace “output” with “input”.

Properties files

spring.cloud.stream.bindings.output.destination=TestTopic
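Putting both sides together, a fuller application.properties sketch might look like the following. The broker address and the consumer group name are assumptions for a local setup, not values from the original project:

```properties
# Assumed local broker for the Kafka binder
spring.cloud.stream.kafka.binder.brokers=localhost:9092
# Producer: bind the "output" channel to the topic
spring.cloud.stream.bindings.output.destination=TestTopic
# Consumer: bind the "input" channel instead, with a consumer group
spring.cloud.stream.bindings.input.destination=TestTopic
spring.cloud.stream.bindings.input.group=case-consumer-group
```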
In the first Spring Boot project, we will set up a Kafka event producer using Spring Cloud Stream and spring-cloud-stream-binder-kafka. Additionally, we'll provide a RESTful API protected by Spring Security with JWT. It has dependencies on Spring Web, Spring Security, Spring Cloud Stream plus the Kafka binder, and the JJWT packages for JWTs. I also used the Lombok library for data object mapping, which reduces the amount of boilerplate Java code via annotations. The pom.xml dependencies are as below:

XML

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-security</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream-binder-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.12</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-api</artifactId>
            <version>0.11.1</version>
        </dependency>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-impl</artifactId>
            <version>0.11.1</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-jackson</artifactId>
            <version>0.11.1</version>
            <scope>runtime</scope>
        </dependency>
    </dependencies>

In the first producer API sample, the microservice's /authenticate API will issue the JWT to the client for further API access.

Java

    @RequestMapping(value = "/authenticate", method = RequestMethod.POST)
    public ResponseEntity<?> createAuthenticationToken(@RequestBody AuthenticationRequest authenticationRequest) throws Exception {

        try {
            authenticationManager.authenticate(
                    new UsernamePasswordAuthenticationToken(authenticationRequest.getUsername(),
                            authenticationRequest.getPassword())
            );
        }
        catch (BadCredentialsException e) {
            throw new Exception("Incorrect username or password", e);
        }

        final UserDetails userDetails = userDetailsService
                .loadUserByUsername(authenticationRequest.getUsername());
        /* Use JWT Util to get a JWT out of the UserDetails */
        final String jwt = jwtTokenUtil.generateToken(userDetails);

        /* Create authentication HTTP response with token */
        return ResponseEntity.ok(new AuthenticationResponse(jwt));
    }
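The jwtTokenUtil.generateToken call above delegates to the JJWT library in the sample. To show what it actually produces, here is a self-contained sketch of HS256 token creation using only the JDK; the subject, timestamps, and secret are illustrative stand-ins, since the real util derives them from the UserDetails and application configuration.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

// Illustrative sketch of what an HS256 JWT generator does under the hood:
// base64url(header) + "." + base64url(payload) + "." + base64url(HMAC-SHA256 signature)
public class JwtSketch {

    static String base64Url(byte[] bytes) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    public static String generateToken(String subject, long iat, long exp, String secret) throws Exception {
        String header = base64Url("{\"alg\":\"HS256\"}".getBytes(StandardCharsets.UTF_8));
        String payload = base64Url(String.format(
                "{\"sub\":\"%s\",\"iat\":%d,\"exp\":%d}", subject, iat, exp)
                .getBytes(StandardCharsets.UTF_8));
        // Sign the first two segments with the shared secret
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        String signature = base64Url(mac.doFinal((header + "." + payload).getBytes(StandardCharsets.UTF_8)));
        return header + "." + payload + "." + signature;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(generateToken("demo", 1587704368L, 1587740368L, "illustrative-secret"));
    }
}
```

Running this prints a three-part token whose first segment is the base64url encoding of {"alg":"HS256"}; the signature segment depends on the secret, so it will differ from any token issued by the real service.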

In the Spring Security configuration, the jwtRequestFilter is added into the security filter chain to check all incoming stateless API invocations. The Kafka message /publish API is protected by JWT security and requires a token in the header to send a message.

Java

    @Override
    protected void configure(HttpSecurity httpSecurity) throws Exception {
        httpSecurity.csrf().disable()
                .authorizeRequests().antMatchers("/authenticate").permitAll()
                .anyRequest().authenticated()
                .and()
                .exceptionHandling()
                .and().sessionManagement()
                .sessionCreationPolicy(SessionCreationPolicy.STATELESS);
        httpSecurity.addFilterBefore(jwtRequestFilter, UsernamePasswordAuthenticationFilter.class);
    }

And the JWT is checked during stateless API access:

Java

        if (jwtUtil.validateToken(jwt, userDetails)) {
            UsernamePasswordAuthenticationToken usernamePasswordAuthenticationToken = new UsernamePasswordAuthenticationToken(
                    userDetails, null, userDetails.getAuthorities());
            usernamePasswordAuthenticationToken.setDetails(new WebAuthenticationDetailsSource().buildDetails(request));
            SecurityContextHolder.getContext().setAuthentication(usernamePasswordAuthenticationToken);
        }
        chain.doFilter(request, response);
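For contrast, here is a self-contained sketch of what jwtUtil.validateToken fundamentally has to establish: that the signature matches the shared secret and that the token has not expired. The sign helper, claim names, and regex-based claim parsing are simplifications introduced for this sketch; JJWT performs full JSON parsing and richer validation.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class JwtValidateSketch {

    static byte[] hmac(String data, String secret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        return mac.doFinal(data.getBytes(StandardCharsets.UTF_8));
    }

    static String b64(byte[] bytes) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    // Helper to mint a token for the demo; mirrors the producer side of the sketch.
    public static String sign(String subject, long exp, String secret) throws Exception {
        String head = b64("{\"alg\":\"HS256\"}".getBytes(StandardCharsets.UTF_8));
        String body = b64(String.format("{\"sub\":\"%s\",\"exp\":%d}", subject, exp)
                .getBytes(StandardCharsets.UTF_8));
        return head + "." + body + "." + b64(hmac(head + "." + body, secret));
    }

    // Core checks: recompute the signature with our secret, then verify expiry.
    public static boolean validateToken(String jwt, String secret, long nowEpochSeconds) throws Exception {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) return false;
        byte[] expected = hmac(parts[0] + "." + parts[1], secret);
        byte[] provided = Base64.getUrlDecoder().decode(parts[2]);
        if (!MessageDigest.isEqual(expected, provided)) return false; // constant-time compare
        String payload = new String(Base64.getUrlDecoder().decode(parts[1]), StandardCharsets.UTF_8);
        Matcher m = Pattern.compile("\"exp\":(\\d+)").matcher(payload); // naive claim parse for the sketch
        return m.find() && Long.parseLong(m.group(1)) > nowEpochSeconds;
    }

    public static void main(String[] args) throws Exception {
        String jwt = sign("demo", 9999999999L, "shared-secret");
        System.out.println(validateToken(jwt, "shared-secret", 1700000000L)); // prints true
    }
}
```

A token signed with a different secret, or one whose exp claim is in the past, fails validation, which is exactly what protects the downstream microservices in the trust chain.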

When we launch Postman to invoke the producer API, the service generates a JWT for stateless API access:

JSON

{
  "jwt": "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJkZW1vIiwiaWF0IjoxNTg3NzA0MzY4LCJleHAiOjE1ODc3NDAzNjh9.HwnYOMa-IFmmEAHXjjSA_13gjJRRjj-lIq99UyVd9ac"
}

Once you've configured this JWT in the /publish request header, the HTTP POST call will send a message to the Kafka publisher and return HTTP 200. In the sample, we pass the JWT in the header of the message payload into the Kafka message channel, which is protected by SSL security.

Java

    @PostMapping("/publish")
    public CaseData publishEvent(@RequestBody CaseData casedata, @RequestHeader HttpHeaders headers){
        String jwtBearer = headers.getFirst("Authorization");
        CasePayload casePayload = new CasePayload(jwtBearer==null?"jwt_Null":jwtBearer, casedata);

        output.send(MessageBuilder.withPayload(casePayload).build());
        return casedata;
    }

Sending request in Postman

On the consumer side, the message payload is received and the JWT is extracted from the message header. With the JWT, the consumer builds a stateless remote API call to the third microservice, which is also protected by JWT security. By using RestTemplate.exchange, it chains the security context and passes the token over.

Java

        CaseDocument caseDocument = null;
        try {
            String theUrl = "http://localhost:8093/cases/"+casePayload.getCasedata().getId();
            HttpHeaders headers = createHttpJwtHeaders(casePayload.getJwt());
            HttpEntity<String> entity = new HttpEntity<String>("parameters", headers);

            ResponseEntity<CaseDocument> response = restTemplate.exchange(theUrl, HttpMethod.GET, entity, CaseDocument.class);
            caseDocument = response.getBody();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }


In the third microservice app, the security is checked via the Spring authentication filter. Once the check passes, the service returns the requested document details to the second microservice's API call.

Java

UsernamePasswordAuthenticationToken usernamePasswordAuthenticationToken = new UsernamePasswordAuthenticationToken(
        userDetails, null, userDetails.getAuthorities());
usernamePasswordAuthenticationToken
        .setDetails(new WebAuthenticationDetailsSource().buildDetails(request));
SecurityContextHolder.getContext().setAuthentication(usernamePasswordAuthenticationToken);

chain.doFilter(request, response);

The complete sample code can be found on GitHub.

Conclusion

With this design, Kafka carries the token that provides stateless, secured access and establishes trust with the microservice APIs. Moreover, decoupling the event streaming product from the microservice APIs gives the service mesh the flexibility to choose a different message delivery product.

Security can be configured and set up within different zones for different purposes and chained together for business applications. Usually, we don't recommend exposing the integration layer or event channels to the front end or as APIs. However, as the event stream processing pattern evolves, modern technologies may provide easier ways to establish trust between the service mesh and event channels.

As a next step, we can try to use OAuth 2.0/OpenID Connect with Kafka, integrate with multiple microservices, and build up a security chain for a back-end service mesh.

I hope you found this article helpful.
