vj-bharadwaj/vj-bharadwaj-first-assistant
public
Published on 4/25/2025
My First Assistant

This is an example custom assistant that will help you complete the Java onboarding in JetBrains IDEs. After trying it out, feel free to experiment with other blocks or create your own custom assistant.

Models

- voyage-code-3 (voyage)
- google/gemini-2.5-pro-preview-03-25 (openrouter)
- google/gemini-2.5-flash-preview (openrouter) · 1,048,576 input / 65,536 output tokens
- google/gemini-2.5-flash-preview:thinking (openrouter) · 1,048,576 input / 65,536 output tokens
- google/gemini-2.0-flash-001 (openrouter)
- Gemini 2.5 Pro (gemini) · 1,048,576 input / 65,536 output tokens
- gemini-2.5-flash-preview-04-17 (gemini) · 1,048,576 input / 65,536 output tokens

Rules

- Follow Java coding standards
- Avoid using raw types
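
For illustration, a minimal sketch of the raw-types rule; the class, variables, and values are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

class RawTypeExample {

    void rawType() {
        List names = new ArrayList();   // raw type: the compiler cannot check element types
        names.add(42);                  // compiles, but is almost certainly a bug
    }

    void parameterizedType() {
        List<String> names = new ArrayList<>(); // parameterized type is type-safe
        names.add("Ada");                       // only Strings are accepted
    }
}
```
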
## Build & Development Environment

* Ensure your build tool is configured to use **Java 17** source compatibility.
* Use the standard commands provided by your build tool for common tasks:
    * Building the project artifact(s).
    * Running automated tests.
    * Running the Spring Boot application locally.
    * Cleaning build artifacts.

## Testing Guidelines

* Use **JUnit 5** as the primary testing framework.
* Utilize **Spring Boot Test** utilities (`@SpringBootTest`, `@MockBean`, `@WebFluxTest`, etc.) for integration testing Spring Boot applications.
* Employ **Mockito** for creating mocks and verifying interactions in unit tests.
* For testing **Project Reactor** `Mono`/`Flux` streams, use `StepVerifier` from the `reactor-test` dependency (see the sketch after this list).
* Ensure test dependencies are declared within the appropriate test scope in your build configuration.
* Write tests covering happy paths, error conditions, and relevant edge cases for your code changes.
* Ensure tests are independent and can be run reliably via your build tool's standard test execution command.
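
As a rough illustration of the `StepVerifier` guideline above (assuming JUnit 5 and the `reactor-test` dependency are on the test classpath; the test class and the stream under test are invented):

```java
import org.junit.jupiter.api.Test;
import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;

class GreetingStreamTest {

    @Test
    void emitsGreetingsInOrderThenCompletes() {
        // Arrange: a small, deterministic stream
        Flux<String> greetings = Flux.just("hello", "world");

        // Act & Assert: verify each emission and the terminal signal
        StepVerifier.create(greetings)
                .expectNext("hello")
                .expectNext("world")
                .verifyComplete();
    }
}
```
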

## Code Style & Guidelines

* **Java 17:**
    * Leverage Java 17 features where they improve clarity/conciseness: `record` for DTOs, text blocks, enhanced `switch`, `var` (use judiciously).
    * Favor immutability; use immutable objects and collections (`List.of`, etc.) where practical.
    * Use the Stream API effectively for collection processing.
    * **Always** use try-with-resources for managing closeable resources (I/O streams, connections, etc.).
* **Spring Boot:**
    * Use Spring Boot starters for managing dependencies via your build tool.
    * Employ **constructor-based dependency injection** (illustrated in the sketch that follows this list).
    * Use standard stereotype annotations (`@Service`, `@Repository`, `@Component`, `@RestController`, `@Configuration`).
    * Externalize configuration into `application.properties` or `application.yml`; access using `@ConfigurationProperties` or `@Value`.
    * For REST APIs (using Spring WebMvc or WebFlux): Define clear DTOs (records preferred), use specific mapping annotations, handle exceptions globally (`@ControllerAdvice`) or locally (`ResponseStatusException`).
* **Project Reactor:**
    * Use `Mono` for 0..1 results and `Flux` for 0..N results.
    * Utilize standard Reactor operators for data flow control (`map`, `flatMap`, `filter`, `zip`, `onErrorResume`, etc.).
    * **CRITICAL: Avoid blocking calls** (`.block()`, `Thread.sleep()`, blocking I/O) within reactive pipelines. Offload necessary blocking work using appropriate scheduler techniques (e.g., `subscribeOn(Schedulers.boundedElastic())`).
    * Use `defer` for lazy publisher creation when needed.
* **AWS SDK v2 (Java):**
    * Use service-specific client builders (e.g., `S3Client.builder()`); a short sketch follows this list.
    * Rely on the default credential provider chain; **never** hardcode credentials.
    * Use immutable request objects (e.g., `PutObjectRequest.builder().build()`).
    * Handle relevant `SdkServiceException` subclasses specifically.
* **General:**
    * Adhere to standard Java naming conventions.
    * Use **SLF4j** for logging abstraction.
    * Implement clear and consistent error handling strategies.
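
A combined sketch of several of the guidelines above (record DTO, constructor injection, stereotype annotations, and a non-blocking `Mono` endpoint). The `Greeting*` names and the `/greetings/{name}` route are hypothetical, and the sketch assumes `spring-boot-starter-webflux` is on the classpath:

```java
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

// Java 17 record as an immutable DTO
record Greeting(String name, String message) {}

@Service
class GreetingService {
    Mono<Greeting> greet(String name) {
        // defer keeps publisher creation lazy until subscription
        return Mono.defer(() -> Mono.just(new Greeting(name, "Hello, " + name)));
    }
}

@RestController
class GreetingController {

    private final GreetingService service;

    // constructor-based dependency injection
    GreetingController(GreetingService service) {
        this.service = service;
    }

    @GetMapping("/greetings/{name}")
    Mono<Greeting> greeting(@PathVariable String name) {
        return service.greet(name); // no .block() inside the reactive pipeline
    }
}
```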

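Similarly, a minimal AWS SDK v2 sketch, assuming the SDK's `s3` module is on the classpath; the bucket, key, and region are placeholders:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.core.exception.SdkServiceException;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.S3Exception;

class S3Uploader {

    private static final Logger log = LoggerFactory.getLogger(S3Uploader.class);

    void upload() {
        // try-with-resources closes the client; credentials come from the default provider chain
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("example-bucket")   // placeholder bucket
                    .key("notes/hello.txt")     // placeholder key
                    .build();
            s3.putObject(request, RequestBody.fromString("hello"));
            log.info("Uploaded object {} to bucket {}", "notes/hello.txt", "example-bucket");
        } catch (S3Exception e) {
            // handle the service-specific subclass first
            log.error("S3 rejected the request: {}", e.awsErrorDetails().errorMessage(), e);
        } catch (SdkServiceException e) {
            log.error("AWS service error during upload", e);
        }
    }
}
```
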
## Documentation Guidelines

* Add concise inline comments only to explain complex algorithms, non-obvious logic, or workarounds. Code should aim to be self-documenting.
Docs

Java 17 docs: https://docs.oracle.com/en/java/javase/17/docs/api/

Prompts

Code review
**Role:** Act as an experienced code reviewer.

**Objective:** Analyze the **patch provided immediately preceding this prompt**, focusing strictly on the **code changes**. Provide structured, actionable feedback using the template below.

**Input:** The patch content provided immediately before this prompt.

**Core Guidelines:**

* **Focus:** Evaluate **only** the code added or modified in the patch.
* **Structure:** Use the exact numbered headings below. Link feedback to file/line numbers (e.g., `file.java:+123`).
* **Confidence Levels:** For specific feedback points in sections 2-9, assign **Confidence:** **L** (Low), **M** (Medium), or **H** (High).
    * Briefly **justify** Medium/High confidence ratings (why it matters).
    * State assumptions if your confidence is low due to limited context (e.g., "Assuming single-threaded context...").
* **Actionable:** Ensure feedback suggests a clear concern or potential improvement.
* **No Feedback:** If a category (2-9) has no relevant points based *only* on the patch changes, simply state: `No specific feedback for this category.`

**Feedback Structure:**

**Patch Review Feedback**

**1. General Information:**
    * **Files Affected:** [List files from patch headers]
    * **Change Locations:** [Identify primary classes/methods modified]
    * **Inferred Purpose:** [Briefly state the likely goal of the changes]

**2. Code Quality & Correctness:**
    * **Logic/Risk:** [Note logical flaws, potential runtime errors (e.g., NPEs). **Confidence:** L/M/H]
    * **Error Handling:** [Assess exception handling in changes. **Confidence:** L/M/H]
    * **Resource Management:** [Check closing of resources (e.g., try-with-resources). **Confidence:** L/M/H]
    * **Best Practices:** [Note adherence/deviations (DRY, SRP, etc.). **Confidence:** L/M/H]

**3. Performance:**
    * [Suggest efficiency improvements or identify bottlenecks in changes. **Confidence:** L/M/H]

**4. Security:**
    * [Identify potential vulnerabilities introduced by changes. **Confidence:** L/M/H]

**5. Readability & Maintainability:**
    * **Clarity/Simplicity:** [Assess if changes are easy to understand; note undue complexity. **Confidence:** L/M/H]
    * **Naming/Conventions:** [Check clarity of names, use of magic values, adherence to conventions. **Confidence:** L/M/H]
    * **Comments/Javadoc:** [Assess necessity and accuracy of comments/docs in changes. **Confidence:** L/M/H]

**6. Functionality & Compatibility:**
    * **Alignment:** [Do changes seem to achieve the inferred purpose? **Confidence:** L/M/H]
    * **API/Contracts:** [Review impact on public methods/interfaces. **Confidence:** L/M/H]
    * **Compatibility:** [Note potential integration issues. **Confidence:** L/M/H]

**7. Concurrency:** (State if not applicable based on context)
    * [Identify potential threading issues (race conditions, etc.) in changes. **Confidence:** L/M/H]

**8. Testing & Edge Cases:**
    * **Tests in Patch:** [Are relevant test changes included? Do they seem adequate?]
    * **Missing Edge Cases:** [Identify potential gaps in test coverage for the changes. **Confidence:** L/M/H]
    * **Test Suggestions:** [Propose specific test cases needed. **Confidence:** L/M/H]

**9. Modern Java (17+):**
    * [Suggest use of modern Java features (Records, Pattern Matching, etc.) where beneficial in changed code. **Confidence:** L/M/H]
Create pull request details
**Role:** Act as a software developer writing a clear, concise, and informative Pull Request (PR) description.

**Objective:** Analyze the **code context provided immediately preceding this prompt** (e.g., diff, patch, changed files) and generate a PR description using the specified format below. Infer the necessary details from the code changes.

**Input:**
The code context (diff, patch, changed files, etc.) to be analyzed **was provided immediately preceding this prompt.**

**Output Format & Guidelines:**

* Generate the description using the exact markdown structure provided below.
* Infer information like purpose, highlights, and affected files by analyzing the code changes.
* Write from the **"I" perspective**.
* Keep the overall description **short and concise**.
* Use active voice and nominalizations where appropriate (e.g., "Addition of X", "Refactoring Y", "Update to Z" instead of "X was added" or "In this commit, I added X").
* **Omit any sections or bullet points under "Additional details" entirely** if no relevant information can be reasonably inferred from the provided code context (e.g., if there's no clear evidence of testing, documentation updates, etc., simply leave those lines out).

**PR Description Template:**

```markdown
**Description:**

* **Highlight:** [Identify and state the single most significant change or primary goal achieved, e.g., Addition of user profile editing feature, Refactoring of the data import process for efficiency]
* **Summary:** [Provide a brief, concise bullet list summarizing the key technical modifications, e.g., - Addition of PUT endpoint `/users/profile`, - Update to `UserProfileService` to handle updates, - Modification of `User` database model]
* **Key Files Affected:** [List the primary files, modules, or components involved in the change, e.g., `services/UserProfileService.java`, `controllers/UserController.java`, `models/User.java`]

**Purpose of this PR:**

* **Why:** [Explain the reason or motivation behind these changes. Infer if possible based on context, e.g., To allow users to update their own profile information, To address performance bottlenecks identified in issue #123, Implementation of feature request TKT-456]
* **What:** [Describe *how* the changes achieve the 'Why', summarizing the approach taken, e.g., Introduction of a new service method and API endpoint for profile updates, Implementation of asynchronous processing for data import]

**Additional details (if applicable):**

* **Testing performed:** [Describe testing evident from the context, e.g., Addition of unit tests for `UserProfileService`, Manual verification of profile update flow]
* **Potential impacts:** [Note any foreseeable consequences or risks, e.g., Requires database migration script `V3__add_user_bio.sql`, Possible impact on downstream consumers of `UserDataExport`]
* **Dependencies:** [Mention changes in dependencies, e.g., Addition of `commons-lang3` library, Upgrade of `spring-boot` to version X.Y.Z]
* **Documentation updates:** [Note any documentation changes evident, e.g., Update to `README.md` regarding new environment variables, Addition of Javadoc comments to public methods]
* **Related Issue(s):** [Reference related tickets or issues if mentioned in context, e.g., Closes #123, Addresses TKT-456]
```
Git commit
**Role:** Act as an assistant writing a Git commit message adhering to the Conventional Commits standard.

**Objective:** Generate a well-formatted Git commit message following the Conventional Commits specification (v1.0.0) based on the provided code changes.

**Input:**

* Code changes provided as a Git diff, patch file, or a clear textual description.
* The input must **immediately precede this prompt** or be clearly marked within the context.
* If the provided input is ambiguous or lacks sufficient context to determine the nature/purpose of the changes, please note this and indicate what clarification is needed instead of generating a message.

**Output:**

A single, well-formatted Git commit message adhering *strictly* to the Conventional Commits v1.0.0 format and guidelines outlined below.

**Format Guidelines (Conventional Commits v1.0.0):**

**Structure:**

<type>[optional scope]: <description>

[optional body]

[optional footer(s)]


**Elements:**

1.  **`<type>`:** Infer the **most appropriate** type from: `feat` (new feature), `fix` (bug fix), `build`, `chore` (maintenance), `ci`, `docs`, `style`, `refactor`, `perf`, `test`.
2.  **`[optional scope]`:** If applicable, infer a short noun describing the section of the codebase affected (e.g., `auth`, `api`, `parser`). Enclose in parentheses. Omit if not easily discernible or too broad.
3.  **`<description>`:** Write a **concise summary** of the change in imperative, present tense, lowercase (~50 chars target). Do **not** end with a period.
4.  **`[optional body]`:** Use **only** if the change requires explanation beyond the description (the 'why'). Explain motivation and context. Separate from description with **one blank line**. Wrap lines at ~72 characters.
5.  **`[optional footer(s)]`:** Use for metadata. Separate from body (or subject if no body) with **one blank line**.
    * Include `BREAKING CHANGE: <description>` for incompatible API changes (provide details).
    * Reference related issues using keywords like `Fixes #123`, `Closes TKT-456`, `Refs #789`.

**Example:**

refactor(core): simplify data processing logic

Removed redundant intermediate steps in the main data processing pipeline
to improve readability and reduce potential points of failure. The core
algorithm remains unchanged but is now more direct.
Fix logs
**Role:** Act as a meticulous Senior Software Engineer performing a code review, with a specific focus on improving the quality and effectiveness of logging and user-facing messages.

**Objective:** Analyze the **code context provided immediately preceding this prompt** (which may be diff output, a patch file, code snippets, one or more complete files) to identify areas for improvement in its log messages, exception messages, and any other user-visible messages. Generate constructive suggestions to enhance clarity, consistency, informativeness, and adherence to best practices, focusing particularly on any **new or modified code** presented in the context.

**Input:**
The code context to be analyzed is the content (e.g., diff, patch, one or more files, code snippets) **provided immediately preceding this prompt.**

**Analysis Criteria & Tasks:**

1.  **Scope of Review:**
    * Examine all strings used within the provided context for:
        * Logging statements (e.g., SLF4j, Log4j, `java.util.logging`).
        * Exception constructors (`new Exception("message")`, `new CustomException("message", details)`).
        * Potentially other user-facing string literals (assess context).
    * **If the input appears to be a diff or patch:** Prioritize your analysis on the **added or modified lines** containing messages, using the surrounding unchanged code primarily for context (e.g., checking for consistency with existing patterns).
    * **If multiple files or extensive snippets were provided:** Analyze all relevant parts of the context, paying attention to consistency *across* the different files or sections.

2.  **Identify Areas for Improvement (Focus on these aspects, especially in changed/new code):**
    * **Consistency:** Uniform style, tone, terminology, parameter formatting (e.g., `{}` vs `%s`), and verb tense across messages, comparing new/changed messages with existing patterns where visible.
    * **Clarity & Conciseness:** Messages should be easily understandable, unambiguous, and avoid unnecessary jargon or brevity that obscures meaning.
    * **Informativeness:** Inclusion of crucial context (e.g., relevant method names, object IDs, key variable values, failed operation names) to aid debugging and understanding. Are parameters used effectively in new/modified logs?
    * **Grammar & Spelling:** Correctness in language usage.
    * **Log Level Appropriateness:** Evaluate if standard levels (TRACE, DEBUG, INFO, WARN, ERROR) are used correctly for any new or modified log statements.
    * **Exception Message Specificity:** Error messages within new or modified exceptions should clearly state *what* went wrong and ideally provide context.
    * **Actionability (for errors/warnings):** Does the message give any hint towards resolution or the impact of the issue?

3.  **Suggest New Log Messages:** Within the scope of the changes or the provided code, identify places where adding *new* log messages (especially parameterized ones) would significantly improve traceability, debugging, or understanding of the application flow (e.g., in new methods, around modified logic).

4.  **Generate Recommendations:**
    * Provide a numbered list of **up to 25** specific, constructive suggestions related to the analyzed context.
    * For each suggestion:
        * Clearly describe the identified issue or area for improvement (mentioning the specific location/line if relevant, especially in diffs/patches).
        * Provide a concrete example of the improved message or logging practice.
        * Briefly explain the benefit of adopting the suggestion (e.g., "improves debuggability of the new feature", "ensures consistency with existing logs", "clarifies error cause in the modified section").

**Output Format:**

1.  **Recommendations:** A numbered list containing the suggestions (max 25) as described above.
2.  **Overall Summary:** After the list, provide a brief (1-2 sentence) summary evaluating the general state of logging and messaging **within the provided context**, particularly regarding the changes or scope presented (e.g., "The changes introduce logs that lack sufficient context.", "Messaging within the provided files shows inconsistent formatting.", "The logging practices in the modified sections are sound.").
3.  **No Issues Case:** If, after thorough analysis of the provided context, you find no significant areas for improvement that warrant suggestions, respond *only* with the exact phrase: `No significant improvements suggested for logging and messaging in the provided context.`
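
For illustration only, a hypothetical before/after showing the kind of improvement this prompt is meant to suggest (class, method, and messages are invented):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class OrderProcessor {

    private static final Logger log = LoggerFactory.getLogger(OrderProcessor.class);

    void process(String orderId) {
        // Before: log.info("processing");          // no context, unclear which order
        // After: parameterized message with the identifier needed for debugging
        log.info("Processing order {}", orderId);

        try {
            // ... business logic ...
        } catch (RuntimeException e) {
            // Before: log.warn("error");           // wrong level, no context, no stack trace
            // After: states what failed and for which order, keeps the exception
            log.error("Failed to process order {}", orderId, e);
            throw e;
        }
    }
}
```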
Generate Tests
**Role:** Act as an expert Java developer specializing in writing high-quality unit tests.

**Objective:** Generate a complete JUnit 5 test class for the provided Java code snippet below.

**Input:**
[Placeholder: You will insert the Java code to be tested here]

**Core Requirements:**

1.  **Frameworks & Style:**
    * Use **JUnit 5** for the test structure.
    * Employ **AssertJ** for all assertions, utilizing its fluent API and including meaningful failure messages via `.withFailMessage()`.
    * Strictly follow the **Arrange-Act-Assert (AAA)** pattern, clearly marking each section with comments (`// Arrange`, `// Act`, `// Assert`).

2.  **Test Granularity & Naming:**
    * Each `@Test` method must focus on **a single, specific behavior or scenario**.
    * Use descriptive `@DisplayName` annotations for each test method, clearly stating the scenario being tested.
    * Method names should also be descriptive (e.g., `shouldReturnCorrectResultWhenInputIsValid`).

3.  **Test Coverage & Techniques:**
    * Ensure comprehensive coverage, including:
        * **Nominal cases:** Typical, expected inputs.
        * **Boundary values:** Smallest/largest valid inputs, limits.
        * **Exceptional conditions:** Invalid inputs, nulls, scenarios triggering exceptions (use `assertThatThrownBy` or `assertThatExceptionOfType`).
    * Utilize **`@ParameterizedTest`** with appropriate sources (e.g., `@ValueSource`, `@CsvSource`, `@MethodSource`) whenever multiple input/output pairs can test the same logic efficiently.

4.  **Code Quality & Compatibility:**
    * The generated test code must be **logically correct** and accurately test the provided Java code's behavior.
    * It must **compile and run correctly under Java 17**. Use idiomatic Java 17 features where they enhance clarity or conciseness (e.g., `var`, text blocks if applicable), but prioritize readability.
    * Avoid redundant or overly complex test logic.

5.  **Test Class Structure & Dependencies:**
    * Use the standard JUnit 5 test class structure. Include necessary imports (`org.assertj.core.api.Assertions.*`, `org.junit.jupiter.api.*`, etc.).
    * If the code under test has dependencies that need mocking, use **Mockito**. Include `@ExtendWith(MockitoExtension.class)` and necessary Mockito imports/annotations (`@Mock`, `@InjectMocks`). Mock interactions appropriately within the `// Arrange` block. *Only use Mockito if external dependencies are present in the input code.*

**Verification (Internal Check for AI):**
* Before outputting, perform a self-check:
    1.  Does the test logic correctly reflect the intended behavior of the input code?
    2.  Is the code valid Java 17 and free of compilation errors?
    3.  Are all constraints mentioned above met?

**Handling Ambiguity:**
* If the behavior of the provided Java code is ambiguous or underspecified for certain scenarios, clearly state the assumptions made when writing the corresponding tests.

**Output:**
* Provide **only** the complete, executable Java code for the generated JUnit 5 test class, formatted correctly.

**Example Test Method Snippet (Illustrative):**
```java
   @Test
   @DisplayName("Should return 'Fizz' when number is divisible by 3 but not 5")
   void testFizzBuzz_returnsFizzForMultipleOfThree() {
       // Arrange
       int number = 6;
       String expected = "Fizz";
       FizzBuzzConverter fizzBuzz = new FizzBuzzConverter(); // Assuming this is the class under test

       // Act
       String actual = fizzBuzz.convert(number);

       // Assert
       Assertions.assertThat(actual)
                 .withFailMessage("Expected 'Fizz' for number %d, but got '%s'", number, actual)
                 .isEqualTo(expected);
   }
```
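
**Example Parameterized Test Method Snippet (Illustrative):** a possible companion to the snippet above, showing the `@ParameterizedTest`/`@CsvSource` style asked for in requirement 3; it assumes the same hypothetical `FizzBuzzConverter` and the `org.junit.jupiter.params` imports.

```java
   @ParameterizedTest(name = "convert({0}) should return \"{1}\"")
   @CsvSource({
           "3, Fizz",
           "5, Buzz",
           "15, FizzBuzz",
           "7, 7"
   })
   @DisplayName("Should convert representative inputs according to the FizzBuzz rules")
   void shouldConvertRepresentativeInputs(int number, String expected) {
       // Arrange
       FizzBuzzConverter fizzBuzz = new FizzBuzzConverter(); // hypothetical class under test

       // Act
       String actual = fizzBuzz.convert(number);

       // Assert
       Assertions.assertThat(actual)
                 .withFailMessage("Expected '%s' for number %d, but got '%s'", expected, number, actual)
                 .isEqualTo(expected);
   }
```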

Context

@code
Reference specific functions or classes from throughout your project
@docs
Reference the contents from any documentation site
@diff
Reference all of the changes you've made to your current branch
@terminal
Reference the last command you ran in your IDE's terminal and its output
@problems
Get Problems from the current file
@folder
Uses the same retrieval mechanism as @codebase, but only on a single folder
@codebase
Reference the most relevant snippets from your codebase

Data

No Data configured

MCP Servers


No MCP Servers configured