Unlocking AI Coding Assistants Part 4: Generate Spring Boot Application

Determine whether it's feasible to build a production-grade Spring Boot application from scratch using a large language model (LLM) as an AI coding assistant.

By Gunter Rotsaert, DZone Core · May 02, 2025 · Tutorial

In this final installment of our Unlocking AI Assistants series, we will create a Spring Boot application from scratch using an AI coding assistant. The goal is not merely to create a working application, but to create production-grade code. Enjoy!

Introduction

Several tasks are executed with the help of an AI coding assistant. The responses are evaluated, and different techniques are applied to improve the responses when necessary.

Let's try to generate a Spring Boot application using an AI coding assistant. However, some preconditions apply to this Spring Boot application. The most important one is that the generated code must be production-grade; just generating working software is not enough.

The tasks are executed with the DevoxxGenie AI coding assistant for IntelliJ IDEA.

The setup used in this blog is LMStudio as the inference engine and qwen2.5-coder:7b as the model, running on a GPU.

The sources used in this blog are available on GitHub, while the explanation of how the project is created can be found here.

Before we proceed, if you'd like to review the first three parts of this series, you can click on the links below: 

  • Unlocking AI Coding Assistants Part 1: Real-World Use Cases
  • Unlocking AI Coding Assistants Part 2: Generating Code
  • Unlocking AI Coding Assistants Part 3: Generating Diagrams, Open API Specs, And Test Data

Prerequisites

Prerequisites for reading this blog are:

  • Basic coding knowledge.
  • Basic knowledge of AI coding assistants.
  • Basic knowledge of DevoxxGenie. For more information, you can read my previous blog, "DevoxxGenie: Your AI Assistant for IntelliJ IDEA," or watch the conference talk given at Devoxx.

Create a Skeleton

First, a skeleton needs to be created because the Spring Boot application must meet some requirements.

  • The REST API must be defined by means of an OpenAPI specification.
  • The controller interface must be generated by means of the openapi-generator-maven-plugin.
  • PostgreSQL must be used as the database.
  • Liquibase must be used to create the database tables.
  • jOOQ must be used to access the database.
  • The jOOQ classes must be generated by means of the testcontainers-jooq-codegen-maven-plugin.

Navigate to Spring Initializr and add the following dependencies:

  • Spring Web
  • PostgreSQL Driver
  • JOOQ Access Layer
  • Validation
  • Liquibase Migration

The following changes are applied to the generated Spring Boot application:

  • The controller interface must be generated based on the OpenAPI specification: add the openapi-generator-maven-plugin plugin and the swagger-annotations dependency.
  • Add scope runtime to the liquibase-core dependency.
  • Add a file db.changelog-root.xml to src/main/resources/db/changelog/db.changelog-root.xml as the root file for the Liquibase migration scripts.
  • The jOOQ classes must be generated: add the testcontainers-jooq-codegen-maven-plugin plugin.
  • Remove the generated test from the test sources.

The changes are applied to branch feature/base-repository.

Run the build.

Shell
 
$ mvn clean verify


The build fails because the OpenAPI specification is missing. However, this is the starting point.

Generate OpenAPI Specification

The build fails because the OpenAPI specification is missing, so let's fix this.

Prompt

Enter the prompt.

Shell
 
Generate an OpenAPI specification version 3.1.1. 
The spec should contain CRUD methods for customers. 
The customers have a first name and a last name. 
Use the Zalando restful api guidelines.


Response

The response can be viewed here.

Apply Response

Add the response to file src/main/resources/static/customers.yaml.

Additionally, change the following:

  • Change the identifiers from strings to integers.
  • Add the identifier to the Customer schema. This way, the identifier will be returned in the responses.
YAML
 
components:
  schemas:
    Customer:
      type: object
      properties:
        id:
          type: integer
          format: int64
        firstName:
          type: string
        lastName:
          type: string


Run the build. The build is successful, and the generated sources are available in the directory target/generated-sources/openapi.

The build shows some warnings, but these can be fixed by adding an operationId to the OpenAPI specification. For now, the warnings are ignored.

Shell
 
[WARNING] Empty operationId found for path: GET /customers. Renamed to auto-generated operationId: customersGET
[WARNING] Empty operationId found for path: POST /customers. Renamed to auto-generated operationId: customersPOST
[WARNING] Empty operationId found for path: GET /customers/{id}. Renamed to auto-generated operationId: customersIdGET
[WARNING] Empty operationId found for path: PUT /customers/{id}. Renamed to auto-generated operationId: customersIdPUT
[WARNING] Empty operationId found for path: DELETE /customers/{id}. Renamed to auto-generated operationId: customersIdDELETE


The changes can be viewed here.

Generate Liquibase Scripts

In this section, the Liquibase scripts will be generated.

Prompt

Open the OpenAPI spec and enter the prompt.

Shell
 
Based on this openapi spec, generate liquibase migration scripts in XML format


Response

The response can be viewed here.

Apply Response

The generated XML Liquibase script is entirely correct. Create a file db.changelog-1.xml in the directory src/main/resources/db/changelog/migration and copy the response into it. Besides that, change the author to mydeveloperplanet.

XML
 
<changeSet id="1" author="mydeveloperplanet">


Run the build.

The build log shows that the tables are generated.

Shell
 
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading resource: src/main/resources/db/changelog/migration/db.changelog-1.xml
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Creating database history table with name: public.databasechangelog
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading from public.databasechangelog
Feb 23, 2025 12:42:59 PM liquibase.command
INFO: Using deploymentId: 0310979670
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Reading from public.databasechangelog
Running Changeset: src/main/resources/db/changelog/migration/db.changelog-1.xml::1::yourname
Feb 23, 2025 12:42:59 PM liquibase.changelog
INFO: Table customers created
Feb 23, 2025 12:42:59 PM liquibase.changelog


In the directory target/generated-sources/jooq, you can also find the generated jOOQ files.

The changes can be viewed here.

Generate Domain Model

In this section, the domain model will be generated.

Prompt

Open the Liquibase migration script and enter the prompt.

Shell
 
Create a domain model based on this liquibase migration script


Response

The response can be viewed here.

Apply Response

Create the class Customer in the package com.mydeveloperplanet.myaicodeprojectplanet.model. This clashes with the package used for the generated OpenAPI model.

Change the following lines in the pom.xml:

XML
 
<packageName>com.mydeveloperplanet.myaicodeprojectplanet</packageName>
<apiPackage>com.mydeveloperplanet.myaicodeprojectplanet.api</apiPackage>
<modelPackage>com.mydeveloperplanet.myaicodeprojectplanet.model</modelPackage>


Into:

XML
 
<packageName>com.mydeveloperplanet.myaicodeprojectplanet.openapi</packageName>
<apiPackage>com.mydeveloperplanet.myaicodeprojectplanet.openapi.api</apiPackage>
<modelPackage>com.mydeveloperplanet.myaicodeprojectplanet.openapi.model</modelPackage>


Run the build; the build is successful.
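
For reference, the resulting Customer domain class looks roughly like the following. This is a minimal sketch based on the columns of the customers table; the exact generated code may differ.

Java
 
package com.mydeveloperplanet.myaicodeprojectplanet.model;

// Sketch of the domain model: a plain POJO with the fields from the customers table.
public class Customer {

    private Long id;
    private String firstName;
    private String lastName;

    public Customer() {
    }

    public Customer(Long id, String firstName, String lastName) {
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getFirstName() {
        return firstName;
    }

    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }

    public String getLastName() {
        return lastName;
    }

    public void setLastName(String lastName) {
        this.lastName = lastName;
    }
}
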

The changes can be viewed here.

Generate Repository

In this section, the repository will be generated.

Prompt

Add the full project to the Prompt Context and also add the generated jOOQ classes from the directory target/generated-sources/jooq. Note: it seems DevoxxGenie did not add these files at all because they were ignored in the .gitignore file; see this issue.

Shell
 
Generate a CustomerRepository in order that the operations defined in the openapi spec customers.yaml are supported


Response

The response can be viewed here.

The response uses Spring Data JPA, which is not what is wanted.

Prompt

Give explicit instructions to only use dependencies in the pom.xml.

Shell
 
Generate a CustomerRepository in order that the operations defined in the openapi spec customers.yaml are supported.
Only use dependencies available in the pom.xml


Response

The response can be viewed here.

The same response is returned. The instructions are ignored.

Prompt

Let's be more specific. Enter the prompt.

Shell
 
You do not use the dependencies defined in the pom.
You should use jooq instead of jpa


Response

The response can be viewed here.

Apply Response

This response looks great. Even an example of how to use it in a service is added.

Create a package com/mydeveloperplanet/myaicodeprojectplanet/repository and paste the CustomerRepository code. Some issues are present:

  • A RuntimeException is thrown when a Customer cannot be found. This probably needs to be changed, but for the moment it will do.
  • The package com.mydeveloperplanet.myaicodeprojectplanet.jooq could not be found. A Maven sync solved this issue.
  • Customers.CUSTOMERS could not be found. The following import needed to be added: import com.mydeveloperplanet.myaicodeprojectplanet.jooq.tables.Customers;
  • Still, two compile errors remain due to a nonexistent exists() method:
Java
 
if (dslContext.selectFrom(Customers.CUSTOMERS)
              .where(Customers.CUSTOMERS.ID.eq(id))
              .exists())


Prompt

Open a new chat window. Open the CustomerRepository file and enter the prompt.

Shell
 
the .exists() method does not seem to be available, fix the code


Response

The response can be viewed here.

Apply Response

The response suggests using the selectExists method, but this method is also nonexistent.

Prompt

Enter the follow-up prompt.

Shell
 
the selectExists method is also not available, fix the code properly


Response

The response can be viewed here.

Apply Response

The response suggests using the fetchExists method. This is closer to the real solution, but it still does not compile. The LLM suggests:

Java
 
boolean exists = dslContext.selectFrom(Customers.CUSTOMERS)
                           .where(Customers.CUSTOMERS.ID.eq(id))
                           .fetchExists();


Time to help a little bit and change it manually to the correct implementation.

Java
 
boolean exists = dslContext.fetchExists(dslContext.selectFrom(Customers.CUSTOMERS)
                                                  .where(Customers.CUSTOMERS.ID.eq(id)));


Run the build; the build is successful.
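
For context, this existence check is used as a guard inside the repository methods that operate on a single customer. Below is a minimal sketch of the delete method; the method shape is an assumption based on the generated repository.

Java
 
// Sketch: guard a delete with the corrected existence check (method shape assumed).
public void deleteCustomer(Long id) {
    boolean exists = dslContext.fetchExists(dslContext.selectFrom(Customers.CUSTOMERS)
                                                      .where(Customers.CUSTOMERS.ID.eq(id)));
    if (!exists) {
        // The repository currently throws a RuntimeException when a customer is not found.
        throw new RuntimeException("Customer with id " + id + " not found");
    }
    dslContext.deleteFrom(Customers.CUSTOMERS)
              .where(Customers.CUSTOMERS.ID.eq(id))
              .execute();
}
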

Prompt

In the current implementation, the repository methods use the jOOQ-generated CustomersRecord as an argument. This means that the service layer would need to know about the repository layer, which is not wanted. The service layer should only know the domain model.

Open a new chat window, add the src directory to the Prompt Context and also the generated jOOQ classes from the directory target/generated-sources/jooq. Enter the prompt.

Shell
 
I want that the methods make use of the Customer model and that any mappings 
between Customer and CustomerRecord are done in the CustomerRepository itself


Response

The response can be viewed here.

Apply Response

This looks great. Some variables and arguments are named record, which clashes with the record keyword introduced in newer Java versions. Rename them to customerRecord.

Run the build; the build is successful.
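
As an illustration, the mapping between the jOOQ record and the domain model now lives inside the repository and looks roughly like this. It is a sketch: the CustomersRecord accessors and the three-argument Customer constructor are assumptions based on the generated code.

Java
 
// Sketch: private mapping helpers inside CustomerRepository.
private Customer toDomain(CustomersRecord customerRecord) {
    // Assumes the three-argument constructor of the Customer domain model.
    return new Customer(customerRecord.getId(),
                        customerRecord.getFirstName(),
                        customerRecord.getLastName());
}

private CustomersRecord toRecord(Customer customer) {
    CustomersRecord customerRecord = new CustomersRecord();
    customerRecord.setId(customer.getId());
    customerRecord.setFirstName(customer.getFirstName());
    customerRecord.setLastName(customer.getLastName());
    return customerRecord;
}
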

This took a bit longer than the previous generations, but the end result is quite good.

The changes can be viewed here.

Generate Service

In this section, the service will be generated.

Prompt

Open a new chat window and add the full project to the Prompt Context. Enter the prompt.

Shell
 
Create a spring service in order that the operations defined in the openapi spec customers.yaml are supported. 
The service must use the CustomerRepository.


Response

The response can be viewed here.

Apply Response

The response looks good. A service interface is also created, which is not really needed.

Create package com/mydeveloperplanet/myaicodeprojectplanet/service and add the Service class and interface.

Run the build; the build is successful.
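
In outline, the service is a thin layer that only knows the domain model and delegates to the repository. A minimal sketch follows, with the interface left out; the class name, the method names, and the repository methods are assumptions, not the exact generated code.

Java
 
package com.mydeveloperplanet.myaicodeprojectplanet.service;

import java.util.List;

import org.springframework.stereotype.Service;

import com.mydeveloperplanet.myaicodeprojectplanet.model.Customer;
import com.mydeveloperplanet.myaicodeprojectplanet.repository.CustomerRepository;

// Sketch: the service delegates to the repository and only uses the domain model.
@Service
public class CustomersService {

    private final CustomerRepository customerRepository;

    public CustomersService(CustomerRepository customerRepository) {
        this.customerRepository = customerRepository;
    }

    public List<Customer> getAllCustomers() {
        return customerRepository.findAll();
    }

    public Customer getCustomerById(Long id) {
        return customerRepository.findById(id);
    }

    public Customer createCustomer(Customer customer) {
        return customerRepository.save(customer);
    }

    public Customer updateCustomer(Long id, Customer customer) {
        return customerRepository.update(id, customer);
    }

    public void deleteCustomer(Long id) {
        customerRepository.deleteById(id);
    }
}
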

The changes can be viewed here.

Generate Controller

In this section, the controller will be generated.

Prompt

Open a new chat window, add the src directory and the target/generated-sources/openapi/src directory to the Prompt Context. Enter the prompt.

Shell
 
Create a Spring Controller in order that the operations defined in the openapi spec customers.yaml are supported. 
The controller must implement CustomersApi. 
The controller must use the CustomersService.


Response

The response can be viewed here.

Apply Response

The response looks good. Create the package com/mydeveloperplanet/myaicodeprojectplanet/controller and add the controller to this package. Some issues exist:

  • The import for CustomersApi is missing. Add it.
  • The arguments in the methods use the Customer domain model, which is not correct. They should use the OpenAPI Customer model.

Prompt

Enter a follow-up prompt.

Shell
 
The interface is not correctly implemented. 
The interface must use the openapi Customer model and 
must convert it to the Customer domain model which is used by the service.


Response

The response can be viewed here.

Apply Response

The response is not correct. The LLM does not seem to see the difference between the Customer domain model and the Customer OpenAPI model.

The response also contains nonexistent Java syntax (it looks more like Python).

Java
 
import com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer as OpenAPICustomer;


Prompt

Enter a follow-up prompt.

Shell
 
This is not correct. 
Try again, the openai Customer model is available in package com.mydeveloperplanet.myaicodeprojectplanet.openapi.model, 
the domain model is available in package com.mydeveloperplanet.myaicodeprojectplanet.model


Response

The response can be viewed here.

This response is identical to the previous one.

Let's help the LLM a little bit. Fix the methods and replace OpenAPICustomer with com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer.

This still raises compile errors, but maybe the LLM can fix this.

Prompt

Open a new chat window, add the src directory and the target/generated-sources/openapi/src directory to the Prompt Context. Enter the prompt.

Shell
 
The CustomersController has the following compile errors:
* customersGet return value is not correct
* customersIdGet return value is not correct
Fix this


Response

The response can be viewed here.

Apply Response

This seems to be a better solution. Only the following snippet does not compile.

Java
 
private com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer convertToOpenAPIModel(Customer customer) {
    return new com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer(
            customer.getId(),
            customer.getFirstName(),
            customer.getLastName()
    );
}


Let's fix this manually.

Java
 
private com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer convertToOpenAPIModel(Customer customer) {
    com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer openAPICustomer = 
            new com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer();
    openAPICustomer.setId(customer.getId());
    openAPICustomer.setFirstName(customer.getFirstName());
    openAPICustomer.setLastName(customer.getLastName());
    return openAPICustomer;
}


Run the build; the build is successful.
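
To show how this helper is used, a controller method ends up roughly like the following. This is a sketch: the exact method name and signature generated for CustomersApi, as well as the service call, are assumptions.

Java
 
// Sketch: a controller method converting the domain model to the OpenAPI model.
@Override
public ResponseEntity<com.mydeveloperplanet.myaicodeprojectplanet.openapi.model.Customer> customersIdGet(Long id) {
    Customer customer = customersService.getCustomerById(id);
    return ResponseEntity.ok(convertToOpenAPIModel(customer));
}
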

The changes can be viewed here.

Run Application

Time to run the application.

Add the following dependency in order to start a PostgreSQL database when running the application.

XML
 
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-docker-compose</artifactId>
	<scope>runtime</scope>
	<optional>true</optional>
</dependency>


Add a compose.yaml file to the root of the repository.

YAML
 
services:
  postgres:
    image: 'postgres:17-alpine'
    environment:
      - 'POSTGRES_DB=mydatabase'
      - 'POSTGRES_PASSWORD=secret'
      - 'POSTGRES_USER=myuser'
    labels:
      - "org.springframework.boot.service-connection=postgres"
    ports:
      - '5432'


Run the application.

Shell
 
mvn spring-boot:run


An error occurs.

Shell
 
2025-02-23T15:29:27.478+01:00 ERROR 33602 --- [MyAiCodeProjectPlanet] [           main] o.s.b.d.LoggingFailureAnalysisReporter   : 

***************************
APPLICATION FAILED TO START
***************************

Description:

Liquibase failed to start because no changelog could be found at 'classpath:/db/changelog/db.changelog-master.yaml'.

Action:

Make sure a Liquibase changelog is present at the configured path.

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  19.530 s
[INFO] Finished at: 2025-02-23T15:29:27+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:3.4.3:run (default-cli) on project myaicodeprojectplanet: Process terminated with exit code: 1 -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException


This can be fixed by adding the following line to the application.properties file.

Properties
 
spring.liquibase.change-log=classpath:db/changelog/db.changelog-root.xml


Run the application again; now it starts successfully.

Test Application

The application runs, but is it also functional?

Prompt

Open a new chat window and open the OpenAPI specification. Enter the prompt.

Shell
 
Generate some curl commands in order to test this openapi spec


Response

The response can be viewed here.

Run Tests

Create a Customer.

Shell
 
curl -X POST "http://localhost:8080/customers" -H "Content-Type: application/json" -d '{
  "firstName": "John",
  "lastName": "Doe"
}'
{"timestamp":"2025-02-23T14:33:11.903+00:00","status":404,"error":"Not Found","path":"/customers"}


This test fails. The cause is that the CustomersController has the following unnecessary annotation.

Java
 
@RequestMapping("/customers")


This should not be here; the mapping is already part of the CustomersApi interface.

Remove this line, build the application and run it again.
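
After removing the annotation, the class declaration is reduced to roughly the following (a sketch); all request mappings now come from the generated CustomersApi interface.

Java
 
// Sketch: no class-level @RequestMapping; the paths are defined by CustomersApi.
@RestController
public class CustomersController implements CustomersApi {

    private final CustomersService customersService;

    public CustomersController(CustomersService customersService) {
        this.customersService = customersService;
    }

    // The endpoint methods implementing CustomersApi follow here.
}
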

The changes can be viewed here.

Create a Customer. This is successful.

Shell
 
curl -X POST "http://localhost:8080/customers" -H "Content-Type: application/json" -d '{
  "firstName": "John",
  "lastName": "Doe"
}'


Retrieve a Customer. This is successful.

Shell
 
curl -X GET "http://localhost:8080/customers/1" -H "accept: application/json"
{"id":1,"firstName":"John","lastName":"Doe"}


Update a Customer. This is successful.

Shell
 
curl -X PUT "http://localhost:8080/customers/1" -H "Content-Type: application/json" -d '{
  "id": 1,
  "firstName": "Jane",
  "lastName": "Doe"
}'


Retrieve all Customers. This is successful.

Shell
 
curl -X GET "http://localhost:8080/customers" -H "accept: application/json"
[{"id":1,"firstName":"Jane","lastName":"Doe"}]


Delete a Customer. This is successful.

Shell
 
curl -X DELETE "http://localhost:8080/customers/1" -H "accept: application/json"


Retrieve all Customers. An empty list is returned. This is successful.

Shell
 
curl -X GET "http://localhost:8080/customers" -H "accept: application/json"
[]


Conclusion

It is possible to create a Spring Boot application from scratch using a local LLM. Creating the repository and the controller needed some extra iterations and manual interventions. However, the result is quite good: the application is functional and meets the initial requirements.

Published at DZone with permission of Gunter Rotsaert, DZone MVB. See the original article here.
