
Integrating AI Into the IDEs

Several ideas can be explored to achieve the seamless integration of AI into IDEs: some have already been implemented, and others remain untapped opportunities.

By Gregory Ledenev · Aug. 09, 23 · Analysis

AI has demonstrated its immense potential in aiding coding tasks, such as answering questions, generating code snippets, explaining code, and detecting issues. However, the current approach of using Web UIs like ChatGPT and relying heavily on copy/paste methods can be cumbersome and less efficient. To maximize the benefits of AI in coding, tighter integration of AI capabilities within modern Integrated Development Environments (IDEs) is essential. Several ideas can be explored to achieve this seamless integration, some of which have already been implemented in IDEs like IntelliJ IDEA, while others remain untapped opportunities.

There are two major concepts of such an integration: AI Actions and AI Assistant (Chat UI).

AI Actions

AI Actions are tasks that leverage AI to achieve desired results. Each AI Action should either:

  1. Prepare a prompt and a context, engage AI to process the data, gather results, and present them to the user in the most convenient form or commit changes to the code.
  2. Prepare a prompt and a context, call AI Assistant, and let the user complete the work.
  3. Combine scenarios 1 and 2 as the task requires.
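These scenarios can be sketched in code. The following is a purely hypothetical illustration — none of the type or method names come from a real IDE SDK, and the "model" is a stub standing in for an actual LLM call:

```java
import java.util.function.UnaryOperator;

// Hypothetical sketch of the AI Action scenarios above; all names are
// illustrative, not from any real IDE SDK.
public class AIActionSketch {

    // Scenario 1: prepare a prompt plus context, call the model, and
    // return the result so the IDE can present or commit it.
    static String runDirect(String prompt, String context,
                            UnaryOperator<String> model) {
        return model.apply(prompt + "\n---\n" + context);
    }

    // Scenario 2: prepare the same input but hand it off to a chat UI
    // (simulated here as a plain string) for the user to finish.
    static String handOffToAssistant(String prompt, String context) {
        return "[open AI Assistant with] " + prompt + "\n---\n" + context;
    }

    public static void main(String[] args) {
        // A stub standing in for a real LLM call.
        UnaryOperator<String> model = s -> "[LLM reply to] " + s;
        System.out.println(runDirect("Reverse the loop",
                "for (int i = 1; i <= max; i++) ...", model));
        System.out.println(handOffToAssistant("Explain this code",
                "for (int i = 1; i <= max; i++) ..."));
    }
}
```

The point of the split is that scenario 1 can run headlessly in the background, while scenario 2 always surfaces a chat UI.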

AI Actions can be carried out in different ways:

  • Triggered by user requests (e.g., as refactorings)
  • Initiated by system services on a schedule, in the background, or when appropriate (e.g., code analysis actions)

AI Code Intentions

In the world of modern IDEs, a powerful feature that has become increasingly prevalent is the concept of "Intentions." Intentions offer users the ability to execute predefined actions on their code based on the specific context. Imagine, for instance, working with a "for" statement and, within that context, having various Intentions available, such as “Convert to while”, “Reverse order”, or “Use Streams”, among others. Intentions prove incredibly advantageous, as they enable rapid refactoring and manipulation of code, streamlining the development process.

Developing Intentions, however, is no easy task, and as such, IDE vendors hard-code them and bundle them with their respective IDEs. While some IDEs do allow developers to create their own Intentions, this endeavor is quite challenging and demands in-depth knowledge of the IDE platform, extensive coding, and the creation of plugins to deploy the newly developed Intentions.

Thankfully, with the advent of LLMs like ChatGPT, requesting code-related actions has become significantly simpler. Given merely a code snippet and a prompt, an LLM can effortlessly execute the desired action. For instance, if you supply a loop to an LLM with the “Reverse direction of a loop” prompt, it will easily perform the specified transformation on the given code snippet:

Java
 
for (int i = 1; i <= max; i++) 
    System.out.println("i = " + i);


And you will get the same result as for the hard-coded Intention:

Java
 
for (int i = max; i >= 1; i--) 
    System.out.println("i = " + i);


So it is pretty natural that we can introduce AI Intentions, which are basically named AI prompts bound to some context. Context can be a file, class, statement, particular method, etc. So an AI Intention should supply a prompt, ask the IDE for a context, engage AI to process the data, gather results, and present them to the user in the most convenient form or commit changes back to the code.

Advantages of AI Intentions:

  1. Almost any refactoring can be done using plain language. The prompts can be as simple as "Replace using format" for well-known APIs (like Java’s PrintStream.println(String)), or more complex, with extra instructions for lesser-known APIs.
  2. No need to hard-code such Intentions.
  3. No need to write plugins to distribute such Intentions.
  4. Such Intentions can be easily user-configurable.
  5. No need to use a chat interface, especially for repeated actions.
  6. Less LLM token traffic, which saves cost.

Define AI Code Intentions With Annotations

An exciting and innovative feature to consider is the integration of AI Intentions directly into the API using annotations. With declarative AI Intentions, developers can tell the IDE which intentions are available for each class, field, or method, and specify the prompts an LLM should use to execute them. These declarative AI Intentions can be provided by framework/library authors and become seamlessly accessible to all developers using IDEs that support them.
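The annotation itself might be declared roughly as follows — a sketch only; the real shape would be up to the IDE or framework vendor:

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;

// Hypothetical declaration of an @AIIntention annotation; the element
// names (title, prompt) are assumptions, not an existing API.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE, ElementType.FIELD})
@interface AIIntention {
    String title();   // short name shown to the user
    String prompt();  // full prompt sent to the LLM
}

public class AIIntentionDemo {
    @AIIntention(title = "Replace using format",
                 prompt = "Replace using format and arguments list")
    public void println(String x) { }

    // How a tool could read the intention back via reflection.
    static String titleOf(String methodName) {
        try {
            Method m = AIIntentionDemo.class.getMethod(methodName, String.class);
            return m.getAnnotation(AIIntention.class).title();
        } catch (ReflectiveOperationException e) {
            return "";
        }
    }

    public static void main(String[] args) {
        System.out.println(titleOf("println")); // Replace using format
    }
}
```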

As an illustrative example, let's take a look at the “Replace using format” AI Intention, which enables developers to replace println(String) calls with more efficient printf() calls that take a format string and an argument list as input:

Java
 
@AIIntention(title="Replace using format", prompt="Replace using format and arguments list")
public void println(String x) {
}


So applying such Intentions to a call:

System.out.println("i = " + i);


Results:

System.out.printf("i = %d%n", i);


If the IDE can provide rendered views for some text elements in the editor, an annotated AI Intention can be rendered in the code with the title only, hiding the lengthy prompt. Also, that rendering can contain an inlaid Play button that allows one to perform the action in one click.

Java
 
| ▶️ Replace using format
public void println(String x) {
}


Fix Deprecations Action

Another very good use for declarative AI Intentions is dealing with deprecated APIs. Each deprecated method can include annotations that define special AI Intentions for refactoring it to a modern version. Such AI Intentions can be invoked explicitly while editing or browsing the code, or a higher-level action can gather all the deprecated, properly annotated methods and prompt the developer to fix some or all of them.

The benefits of utilizing declarative AI Intentions for handling deprecated APIs are numerous. It significantly reduces the time and effort required to maintain and update legacy code, fostering a smoother transition to the latest technologies and best practices. Moreover, it enhances collaboration among developers by providing a unified approach to managing deprecated methods across the entire codebase.
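A sketch of how such a higher-level "Fix Deprecations" action might gather its candidates, using plain reflection (since the @AIIntention annotation above is hypothetical, only the standard @Deprecated marker is checked here):

```java
import java.lang.reflect.Method;
import java.util.*;

// Sketch: collecting the methods a "Fix Deprecations" action could
// offer to refactor. A real implementation would also read the
// migration prompt from a declarative AI Intention annotation.
public class FixDeprecationsSketch {

    static List<String> findDeprecated(Class<?> cls) {
        List<String> result = new ArrayList<>();
        for (Method m : cls.getDeclaredMethods())
            if (m.isAnnotationPresent(Deprecated.class))
                result.add(m.getName());
        Collections.sort(result);
        return result;
    }

    // Illustrative legacy API with a mix of live and deprecated methods.
    static class LegacyApi {
        @Deprecated void oldStop() { }
        void run() { }
        @Deprecated void oldResume() { }
    }

    public static void main(String[] args) {
        System.out.println(findDeprecated(LegacyApi.class)); // [oldResume, oldStop]
    }
}
```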

TODO Action

In many cases TODO comments (like // TODO: ) provide enough instructions to let LLM correctly generate the code required to complete such TODOs. For example, the following code:

Java
 
private static void printNumbers(int max) {
    // TODO: 7/20/23 use Stream.forEach
    for (int i = 1; i <= max; i++) {
        System.out.printf("i = %d%n", i);
    }
}


will be correctly refactored by LLM using the prompt "Complete TODO in the following code" to be:

Java
 
private static void printNumbers(int max) {
    // Using Stream.forEach
    IntStream.rangeClosed(1, max).
        forEach(i -> System.out.printf("i = %d%n", i));
}


So it is pretty natural to gather TODO comments and offer them in the list of AI Actions to perform. Class-level TODOs should be offered anywhere in the class, method-level TODOs within the method scope, etc. Certainly, some TODOs cannot be completed using just an LLM and generic prompts, but it would be up to the developer whether to invoke them.
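Gathering those TODOs could be as simple as a comment scan — a naive sketch (real IDEs keep their own TODO indexes, and the comment pattern assumed here is just the common `// TODO:` form):

```java
import java.util.*;
import java.util.regex.*;

// Sketch: collecting TODO comments that could be offered as AI Actions.
public class TodoScanner {
    static final Pattern TODO = Pattern.compile("//\\s*TODO:?\\s*(.+)");

    static List<String> collectTodos(String source) {
        List<String> todos = new ArrayList<>();
        Matcher m = TODO.matcher(source);
        while (m.find())
            todos.add(m.group(1).trim());
        return todos;
    }

    public static void main(String[] args) {
        String src = """
            private static void printNumbers(int max) {
                // TODO: 7/20/23 use Stream.forEach
                for (int i = 1; i <= max; i++) System.out.printf("i = %d%n", i);
            }
            """;
        System.out.println(collectTodos(src)); // [7/20/23 use Stream.forEach]
    }
}
```

Each collected entry, together with its enclosing method or class as context, becomes a candidate "Complete TODO" action.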

If the IDE can provide rendered views for some text elements in the editor, a TODO can be rendered in the code with an inlaid button (like Play) that allows one to complete that TODO in one click.

Java
 
private static void printNumbers(int max) {
    | ▶️ 7/20/23 use Stream.forEach
    for (int i = 1; i <= max; i++) {
        System.out.printf("i = %d%n", i);
    }
}


Create Method Action

It would be good to let AI generate a method's signature from the typed method name, guessing the return type and parameters. For example, a "Propose signature for a method (with empty body): splitStringUsingRegex" prompt quite correctly produces the following method:

Java
 
public static String[] splitStringUsingRegex(String input, String regex) {
    // method body will go here.
    return null;
}


Optionally, it could also generate the method body.

The “Create Method” action should be invoked in place in the code editor by typing the method name and either hitting the Tab key (or any other suitable shortcut) or by explicitly choosing the "Create Method" action.

Suggest Dependencies Action

LLMs are proficient in generating code based on users' requirements. However, the generated code often relies on third-party libraries or frameworks, which can cause compilation errors if the required dependencies are not already added to the project. Consequently, manually finding and adding the necessary dependencies can be more time-consuming than obtaining the code itself.

To address this issue, a helpful addition would be a “Suggest Dependencies” action. This action would allow users to request LLM (or a specifically trained model) for information on certain types of dependencies, such as libraries, Maven, Gradle, etc. If the dependencies are found, they can be automatically applied to the project, streamlining the process and saving time. 

For instance, engaging LLM to find the “org.json.JSONObject” Maven dependency could yield the following suggestion:

XML
 
<dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20210307</version>
</dependency>


That can be used to alter a project’s dependencies.
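Before altering the project, the IDE would need to extract the coordinates from the LLM's suggestion. A minimal sketch, using naive string parsing purely for illustration (a real IDE would go through its build-system model rather than raw text):

```java
// Sketch: extracting Maven coordinates from an LLM-suggested
// <dependency> block before applying it to the project.
public class DependencyParser {

    // Returns the text between <name> and </name> (first occurrence).
    static String tag(String xml, String name) {
        int start = xml.indexOf("<" + name + ">") + name.length() + 2;
        int end = xml.indexOf("</" + name + ">");
        return xml.substring(start, end).trim();
    }

    static String coordinates(String xml) {
        return tag(xml, "groupId") + ":" + tag(xml, "artifactId")
                + ":" + tag(xml, "version");
    }

    public static void main(String[] args) {
        String suggestion = """
            <dependency>
                <groupId>org.json</groupId>
                <artifactId>json</artifactId>
                <version>20210307</version>
            </dependency>
            """;
        System.out.println(coordinates(suggestion)); // org.json:json:20210307
    }
}
```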

Suggest Name Action

There are lots of existing code refactorings that introduce new class methods, parameters, or variables, like “Extract Method”, “Introduce Parameter”, etc. Everything created during such refactorings should be properly named, so it is natural to use an LLM to suggest possible names based on the code being refactored and show them in a popup. Name retrieval takes time, so it should not be intrusive and should run in the background.

Suggest Implementation Action

This action would allow generating implementations for methods that have an empty body. An LLM can do quite well if the method name and its arguments clearly define the purpose of the method (as they should). For example, for the following empty method:

Java
 
public void println(String x) {
}


LLM correctly suggests its body as:

Java
 
public void println(String x) {
    System.out.println(x);
}


This action should be performed in the background, and the result should be applied in place in the code editor without showing any additional UI and without interrupting the developer's work in the IDE.

Probably, there should be an option to launch the “Suggest Implementation” action in the Chat UI to get more control if needed.

Suggest Regex Action

The “Suggest Regex” action should be available for all String literals: it should take a String literal, use it to query the AI for a regex, and replace the literal with the generated regex.

Maybe this action should be available for String literals starting with "Regex" only.

For example, after invoking the “Suggest regex” action for a String literal in the following code:

Java
 
final Pattern pattern = 
    Pattern.compile("Regex to match phone numbers");


The code would be changed to include a generated Regex:

Java
 
final Pattern pattern = 
    Pattern.compile("^\\+(?:[0-9] ?){6,14}[0-9]$");
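Before committing such a replacement, the IDE could cheaply sanity-check the generated pattern against sample inputs supplied by the user — a sketch of that guard (the helper and workflow are assumptions, not an existing API), which also catches the case where the LLM emits an invalid pattern:

```java
import java.util.regex.Pattern;

// Sketch: validating an LLM-generated regex against user-provided
// samples before replacing the original String literal.
public class RegexCheck {

    static boolean matchesAll(String regex, String... samples) {
        Pattern p = Pattern.compile(regex); // throws if the regex is invalid
        for (String s : samples)
            if (!p.matcher(s).matches())
                return false;
        return true;
    }

    public static void main(String[] args) {
        String generated = "^\\+(?:[0-9] ?){6,14}[0-9]$";
        System.out.println(matchesAll(generated, "+31 6 12345678"));
    }
}
```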


Explain Code Action

The “Explain Code” action should utilize LLM with a proper “explain code” prompt to get a description of what the code does. By default, such an explanation should be shown in a popup with the ability to switch between the short and full versions of the explanation. Also, there should be an option to launch the “Explain Code” action in the Chat UI to get more control if needed.

Comment Code Action

The “Comment Code” action should utilize LLM with a kind of “Explain code” prompt and insert a very short code explanation as a comment right next to the code. This action should be performed in the background, and the result should be applied in place in the code editor without showing any additional UI and without interrupting the developer's work on the code.

Explain String Action

This action should be available for String literals and it should show an explanation in a popup. There should be an option to open this explanation in the Chat UI if some additional explanations or tests are needed.

LLMs are pretty good at recognizing what is baked in the string so this action will work with a pretty basic prompt. For example, LLMs easily recognize:

  • More or less complex regexes
  • SQL statements
  • printf format strings
  • MessageFormat strings
  • Etc.

When the LLM cannot detect the format of the string, the user should be prompted to describe the format.

Find Potential Problems Action

Most modern IDEs include tools that can detect and highlight all the code errors and warnings found by a compiler and by supplemental IDE-specific code analysis tools. 

However, LLMs can provide some additional detection of potential problems, especially in very tricky situations. With a proper prompt, an LLM can produce structured problem reports in either textual or JSON form, so parsing and utilizing such a report is pretty easy.

So the “Find Potential Problems” action can utilize LLM with a proper “Find problems in code” prompt and context, parse results, and present them in the usual Messages list view, providing common UX with code navigation, etc. 
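A sketch of what consuming such a report could look like. The one-entry-per-line `line|severity|message` format is an assumption for illustration, not a standard; the LLM would simply be prompted to emit it:

```java
import java.util.*;

// Sketch: parsing a structured problem report an LLM could be prompted
// to emit, into entries an IDE could show in its Messages view with
// code navigation.
public class ProblemReportParser {

    record Problem(int line, String severity, String message) { }

    static List<Problem> parse(String report) {
        List<Problem> problems = new ArrayList<>();
        for (String row : report.strip().split("\n")) {
            String[] parts = row.split("\\|", 3); // line | severity | message
            problems.add(new Problem(Integer.parseInt(parts[0].trim()),
                                     parts[1].trim(), parts[2].trim()));
        }
        return problems;
    }

    public static void main(String[] args) {
        String report = """
            12|warning|Possible integer overflow in loop counter
            34|error|Resource 'reader' is never closed
            """;
        List<Problem> problems = parse(report);
        System.out.println(problems.size() + " problems, first at line "
                + problems.get(0).line());
    }
}
```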

Suggest Refactoring Action

The “Suggest Refactoring” action should utilize LLM to get an optimized, fixed, or somewhat improved version of your code. 

By default, this action should be performed in the background and the result should be applied in place in the code editor without showing any additional intrusive UI. There should be an option to open this refactoring in the Chat UI if some additional steps, explanations, or tests are needed.

If the IDE can provide rendered views for some text elements in the editor, a refactored method can be temporarily rendered with inlaid Prev/Next buttons at the right that allow quickly switching between variants of the refactored code (if any).

Java
 
private void printNumbers(int max) { | ⬆︎ ⬇︎ Choose variants
	. . .
}


Suggest Commit Message Action

The “Suggest Commit Message” action should utilize LLM to compose a VCS commit message based on a provided, curated list of changes made to the project. There should be an option to switch between different message styles, such as a monolithic paragraph or an itemized list.
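A sketch of how the prompt for such an action could be assembled from the curated change list, with the monolithic vs. itemized style switch; the names here are illustrative, not from any real tool:

```java
import java.util.List;

// Sketch: building the "Suggest Commit Message" prompt from a curated
// list of changes. The actual message would come back from the LLM.
public class CommitMessagePrompt {

    enum Style { MONOLITHIC, ITEMIZED }

    static String buildPrompt(List<String> changes, Style style) {
        String form = (style == Style.ITEMIZED)
                ? "an itemized bullet list"
                : "a single concise paragraph";
        return "Write a VCS commit message as " + form
                + " for these changes:\n- " + String.join("\n- ", changes);
    }

    public static void main(String[] args) {
        System.out.println(buildPrompt(
                List.of("Fix NPE in UserService", "Add retry to HTTP client"),
                Style.ITEMIZED));
    }
}
```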

Generate Documentation Action

The “Generate Documentation” action should utilize LLM to compose developer documentation for methods and classes in a language-specific format; e.g., JavaDoc.

AI Assistant

AI Assistant provides a universal chat UI similar to the one offered by ChatGPT, so developers can interact with LLM, get answers and code, and commit results back if desired.

AI Assistant should track the context of the chat and it should be aware of where the editing point is. This allows us to:

  • Generate code based on the existing codebase (if the context window size allows it).
  • Check and alter the code to be inserted to produce valid, compilable results. For example:
    • If AI Assistant produces a method, it should be inserted properly at the class level, either ignoring the caret position or placing it at the nearest valid spot.
    • If AI Assistant produces a code snippet and the insertion point is not inside a method, the snippet can be wrapped in a method with an AI-suggested name.
    • Etc.
  • Avoid generating a class and a main() method unless explicitly asked to do so.
  • Check names to avoid producing things with duplicate names.
  • Reuse already existing methods instead of generating duplicates (maybe after prompting the user).

There should be a playground built directly into AI Assistant so generated code can be tested and debugged in place, without the need to paste it back, add a main() method, build a project, etc. It can be done either in the AI Assistant itself or in some related window/view. Any missing but required dependencies should be resolved automatically (see the “Suggest Dependencies” action for more details).

Note: It would be good to avoid using AI Assistant in cases where good enough results can be achieved by a single LLM transaction with a properly engineered prompt. AI Assistant should be used either when explicitly invoked by the user or for actions that truly require several transactions with direct user involvement to get the desired results, where designing a dedicated UI is hardly possible.


Published at DZone with permission of Gregory Ledenev.

