codeburst

Bursts of code to power through your day. Web Development articles, tutorials, and news.


Best Practices for Building Rest Microservices with Spring Boot

A collection of best practices and techniques for developing REST-based microservices using Spring Boot.

Ehab Qadah · Published in codeburst · 9 min read · Jul 29, 2020


Photo by Clark Tibbs on Unsplash

The goal of this article is to give you a collection of recommended best practices and techniques for building Java REST microservices using Spring Boot. These recommendations are designed to help you create efficient, maintainable, and effective Spring Boot based microservices. Spring Boot has become the de-facto standard for Java™ microservices; it has many purpose-built features that ease building and running your microservices in production at large scale.

This list of best practices is built on my experience running a microservices-based architecture on Google Kubernetes Engine (GKE). I have seen and applied these approaches in several projects, and over the years I have formed a pretty strong opinion about them. We will walk through the different factors and guidelines that should be considered when building REST-based microservices using Spring Boot, and explain how they help you implement more robust, well-founded, and successful microservices.

This article assumes that you are familiar with Java, Spring Boot concepts such as Spring Data JPA and Spring Data REST, Docker/Kubernetes basics, and microservices architecture in general. Implementing a microservices architecture on a foundation of best practices can drastically improve your software architecture; however, we focus only on REST-based microservices, without considering other types such as event-driven microservices.

You may or may not agree with some of the best practices presented here, and that’s absolutely fine, as long as it drives you to research and examine them while developing Spring Boot based microservices; that will help you build the best microservices you can.

Design REST APIs optimally

As a general best practice, you should group all related APIs in a single controller oriented around a use case (e.g., aggregator APIs or domain APIs). While keeping them clean and focused, you should follow the best practices for REST API design, such as:

  • Use nouns instead of verbs in the endpoint paths, which represent the entities/resources to fetch or manipulate, and consistently use plural nouns, such as /orders/{id}/products over /order/{id}/product.
  • Let the HTTP method represent the operation: GET retrieves resources, POST creates a new data record, PUT updates existing data, and DELETE removes data.
  • Nest resources for hierarchical objects.
  • Provide filtering, sorting, and pagination.
  • Consider API versioning.
  • Document APIs fully.

Controllers are not supposed to perform any business logic apart from routing and delegating the action to the proper services. You must enforce separation of concerns by keeping the controllers as light as possible. Let’s take a look at an example of a REST controller considering the above points:

OrderController

The @RequestMapping annotation provides “routing” information; use the consumes and produces attributes to define what the endpoint accepts as the request payload and what it returns as a response. JSON is used in the vast majority of cases for REST APIs, given that server- and client-side technologies have libraries that can encode/decode JSON without much work.
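Since the embedded OrderController example is not reproduced here, the following is a minimal sketch of such a controller; the class, service, and DTO names (OrderService, OrderDto, OrderIncomingDto) are assumptions for illustration:

```java
import javax.validation.Valid;

import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;

import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;

@RestController
@RequestMapping(value = "/orders", produces = MediaType.APPLICATION_JSON_VALUE)
public class OrderController {

    private final OrderService orderService;

    public OrderController(OrderService orderService) {
        this.orderService = orderService;
    }

    @Operation(summary = "Get an order by its id")
    @ApiResponse(responseCode = "200", description = "The found order")
    @GetMapping("/{id}")
    public OrderDto getOrder(@PathVariable String id) {
        // delegate to the service layer; no business logic in the controller
        return orderService.findById(id);
    }

    @Operation(summary = "Create a new order")
    @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
    @ResponseStatus(HttpStatus.CREATED)
    public OrderDto createOrder(@Valid @RequestBody OrderIncomingDto dto) {
        return orderService.create(dto);
    }
}
```

Note how the controller only routes and delegates; validation, mapping, and persistence live behind the service.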

The @Operation & @ApiResponse annotations allow you to add descriptions to the API documentation following the OpenAPI 3.0 specification. For instance, here is the Swagger UI page for our APIs:

OpenAPI 3 documentations with swagger UI

In a nutshell, it is important to design REST APIs properly while at least considering the performance and ease of use for the API clients.

Entities

It is strongly recommended that you use a base class for entities with manually assigned customized identifiers and common properties, as shown in the following example:

BaseEntity
BaseIdentifierGenerator
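The embedded BaseEntity/BaseIdentifierGenerator code is not shown here, but a sketch of such a base class could look like the following; the generator strategy class path and the audit fields are assumptions:

```java
import java.time.Instant;

import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.MappedSuperclass;

import org.hibernate.annotations.CreationTimestamp;
import org.hibernate.annotations.GenericGenerator;
import org.hibernate.annotations.UpdateTimestamp;

// Shared by all entities: a customized String id plus common audit properties.
@MappedSuperclass
public abstract class BaseEntity {

    @Id
    @GeneratedValue(generator = "custom-id")
    @GenericGenerator(name = "custom-id",
            strategy = "com.example.persistence.BaseIdentifierGenerator")
    private String id;

    @CreationTimestamp
    private Instant createdAt;

    @UpdateTimestamp
    private Instant modifiedAt;

    // getters/setters omitted for brevity
}
```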

Use Data Transfer Objects (DTOs)

While it is always possible to directly expose the JPA/database entities in the REST endpoints to send/receive data from the client, this is not the best approach, even though it sounds like the easy and obvious way. There are several reasons not to do that:

  • It creates high coupling between the persistence models and the API models.
  • Hiding or adjusting some field values in the API response based on user roles, access permissions, or other business rules becomes complex.
  • It opens possible vulnerabilities or misuse when the client is able to override auto-generated database fields such as the primary id or the modification timestamp, or to access system fields.
  • It exposes the implementation details of the application.
  • It makes it harder to add, remove, or update fields of the database entities without changing the API, or to change the data returned by a REST endpoint without changing the internal database structure.
  • Supporting multiple versions of an API becomes more challenging.

The better approach is conceptually a simpler one: define a separate data transfer object (DTO) that represents the API resource class and is mapped from one database entity or multiple entities, given that DTOs are what is serialized and deserialized in the API. Here is an example of designing a DTO class for an entity:

A DTO class for creating a new Order that contains only the properties that can be set by the API client.
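As the embedded OrderIncomingDto code is not shown here, a sketch of such a DTO could look like the following; the field names are assumptions. Note that auto-generated fields such as the id or timestamps are deliberately absent, and the Bean Validation constraints discussed below are already in place:

```java
import java.util.List;

import javax.validation.constraints.Email;
import javax.validation.constraints.NotEmpty;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

// Holds only the client-settable properties for creating a new Order.
public class OrderIncomingDto {

    @NotEmpty
    private String customerName;

    @Email
    private String customerEmail;

    @NotNull
    @Size(min = 1)
    private List<String> productIds;

    // getters/setters omitted for brevity
}
```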

On the other hand, you won’t need to map your persistence entities to DTOs and vice versa manually; you can use a library like ModelMapper or MapStruct that handles the conversion automatically. In summary, using DTOs enables the REST APIs and the persistence layer to evolve independently of each other.
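As an illustration of the MapStruct approach, a mapper can be declared as a plain interface and the implementation is generated at compile time; the Order and DTO type names here are assumptions:

```java
import org.mapstruct.Mapper;

// MapStruct generates the implementation and registers it as a Spring bean.
@Mapper(componentModel = "spring")
public interface OrderMapper {

    OrderDto toDto(Order entity);

    Order toEntity(OrderIncomingDto dto);
}
```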

Leverage Java Bean Validation

Before transforming a DTO into your API resource, you always want to ensure that the DTO has valid data; it should be assumed to be bad until it has been through some kind of validation process.

javax.validation is the top-level package for the Bean Validation API, and it provides predefined annotation-based constraints, so you can annotate the DTO/entity with javax.validation.constraints.* annotations such as @NotNull, @Min, @Max, @Size, @NotEmpty, and @Email, as shown in the OrderIncomingDto class.

We should annotate controller input with @Valid to enable bean validation and activate these constraints on the incoming @RequestBody. If bean validation fails, it triggers a MethodArgumentNotValidException, and by default Spring sends back an HTTP status 400 Bad Request. In the next section, I will describe how to customize the error response for validation errors.

Use global/custom errors handler

Because there are multiple ways a service can crash or break, we must ensure that all REST APIs handle errors gracefully, using standard HTTP status codes that help consumers in such cases. It is always recommended to build meaningful error messages for the client, with the clear goal of giving that client all the information needed to easily diagnose the problem. The following example shows how to define a general exception handler by extending the ResponseEntityExceptionHandler:

GeneralExceptionHandler
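The embedded GeneralExceptionHandler code is not reproduced here; a sketch of such a handler (targeting Spring Boot 2.x, with an assumed error-body shape) could look like this:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.MethodArgumentNotValidException;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;
import org.springframework.web.context.request.WebRequest;
import org.springframework.web.servlet.mvc.method.annotation.ResponseEntityExceptionHandler;

@RestControllerAdvice
public class GeneralExceptionHandler extends ResponseEntityExceptionHandler {

    private static final Logger log =
            LoggerFactory.getLogger(GeneralExceptionHandler.class);

    // Customize the 400 response for bean-validation failures.
    @Override
    protected ResponseEntity<Object> handleMethodArgumentNotValid(
            MethodArgumentNotValidException ex, HttpHeaders headers,
            HttpStatus status, WebRequest request) {
        List<String> errors = ex.getBindingResult().getFieldErrors().stream()
                .map(e -> e.getField() + ": " + e.getDefaultMessage())
                .collect(Collectors.toList());
        return ResponseEntity.badRequest().body(Map.of(
                "status", HttpStatus.BAD_REQUEST.value(),
                "errors", errors));
    }

    // Catch-all: log the trace and return a standardized 500 body.
    @ExceptionHandler(Exception.class)
    public ResponseEntity<Object> handleAll(Exception ex) {
        log.error("Unhandled exception", ex);
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body(Map.of(
                        "status", HttpStatus.INTERNAL_SERVER_ERROR.value(),
                        "message", "Internal server error"));
    }
}
```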

To summarize, a few points to consider for error handling as implemented in the above class:

  • Return the correct status code for each type of exception, such as BAD_REQUEST (400) for requests with invalid data.
  • Include all relevant information in the response body.
  • Handle all exceptions in a standardized way.
  • Log all exception messages and traces for debugging and monitoring purposes.

And here is an example of the returned response in case of validation errors:

Invalid request for creating a new order.
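The original screenshot is not reproduced here; assuming the error-body shape sketched above, such a response could look roughly like:

```json
{
  "status": 400,
  "errors": [
    "customerEmail: must be a well-formed email address",
    "productIds: must not be null"
  ]
}
```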

Use database migration tools

It is always easy and simple to start with auto-generation of the database schema based on the entity definitions when using a relational database with Spring Boot, either via the DDL generation of Spring Data JPA (by adding schema.sql and data.sql files) or by letting Hibernate automatically create the schema by setting spring.jpa.hibernate.ddl-auto to create or update. However, it is highly recommended to use a higher-level migration tool such as Flyway or Liquibase. Such tools provide version control of the database schema, keep track of the current state of the database, and determine the required steps to migrate it in a fine-grained way.
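As a minimal illustration of the Flyway approach with Spring Boot: Hibernate is told only to validate the schema, and Flyway applies versioned SQL scripts from src/main/resources/db/migration (the migration file name below is an example):

```
# application.properties: let Flyway own the schema, Hibernate only validates it
spring.jpa.hibernate.ddl-auto=validate
spring.flyway.enabled=true

# db/migration/V1__create_orders_table.sql would then contain the initial DDL
```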

Properties and configuration

According to the 12-factor application methodology, configuration that varies among deployment environments should be changeable without changing any code. In practice, such configuration is stored in environment variables.

Fortunately, Spring Boot makes this relatively straightforward: you define configurations with placeholders in application.properties, and these configuration properties can then be set on the command line or through environment variables. For example, you can configure Spring Data JPA, including the MySQL DataSource and Hibernate properties, as follows:
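The original snippet is not shown here; a sketch of such properties with environment-variable placeholders (host, database, and variable names are assumptions) could look like:

```
spring.datasource.url=jdbc:mysql://${MYSQL_HOST:localhost}:3306/orders
spring.datasource.username=${MYSQL_USER}
spring.datasource.password=${MYSQL_PASSWORD}
spring.jpa.hibernate.ddl-auto=validate
```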

For instance, you can define the value of a property through the Kubernetes deployment’s environment variables:
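The original manifest is not reproduced here; a sketch of the relevant container spec fragment (image and secret names are assumptions) could look like:

```yaml
containers:
  - name: order-service
    image: example/order-service:1.0.0
    env:
      - name: MYSQL_HOST
        value: mysql.default.svc.cluster.local
      - name: MYSQL_PASSWORD
        valueFrom:
          secretKeyRef:
            name: mysql-credentials
            key: password
```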

It is also worth mentioning that Spring Boot supports per-environment configuration based on the active profile; we can provide separate configuration sources for the dev and production profiles by using two files named application-dev.properties and application-prod.properties.

Monitoring and availability

Use the Spring Boot Actuator to expose out-of-the-box endpoints that let you monitor and interact with your application, which can be achieved by adding the spring-boot-starter-actuator starter dependency:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>

Under the hood, the /actuator/health endpoint will then be automatically exposed, providing basic health information; we will get the following response when calling it:

Health endpoint response
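With no health indicators failing, the default response body is simply:

```json
{
  "status": "UP"
}
```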

Health check endpoints let the system know if an instance of your microservice is working and available to serve client requests (alive) or not (dead). Furthermore, you can set up the Liveness and readiness probes for spring boot microservices deployed on Kubernetes, as shown in the following example:

Liveness and Readiness setup for Kubernetes deployment
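The original manifest is not reproduced here; a sketch of such probe configuration, assuming the application listens on port 8080 and uses the liveness/readiness health endpoints introduced in Spring Boot 2.3, could look like:

```yaml
livenessProbe:
  httpGet:
    path: /actuator/health/liveness
    port: 8080
  initialDelaySeconds: 30
  periodSeconds: 10
readinessProbe:
  httpGet:
    path: /actuator/health/readiness
    port: 8080
  initialDelaySeconds: 30
  periodSeconds: 10
```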

Where you can configure which Health Indicators are part of the probes as follows:
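For example, the probes can be enabled and the database health indicator added to the readiness group via properties like these:

```
management.endpoint.health.probes.enabled=true
management.endpoint.health.group.readiness.include=readinessState,db
```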

For more detailed information about the production-ready features in Spring Boot, refer to the official Actuator documentation.

Containerize it

A Dockerfile for a spring-boot application could look like this:

Dockerfile

The final Docker image is built using the multi-stage builds feature, which copies the result from one image to another: the first image is labeled “build” and is used to run Maven and generate the fat JAR from the source code, while the second image is used to run the microservice.
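Since the embedded Dockerfile is not reproduced here, the multi-stage build described above can be sketched as follows; the base-image tags and paths are assumptions:

```dockerfile
# Stage 1: run Maven and produce the fat jar
FROM maven:3.6-jdk-11 AS build
WORKDIR /app
COPY pom.xml .
COPY src ./src
RUN mvn -q package -DskipTests

# Stage 2: copy only the jar into a slim runtime image
FROM openjdk:11-jre-slim
COPY --from=build /app/target/*.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]
```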

General guidelines and recommendations

  • DON’T FORGET to test your code by writing unit and integration tests. It’s not an exaggeration to say that each piece of code and every API must be covered by a test!
  • Add monitoring alerts for failures and downtimes.
  • Automate build and deployment management (a CI/CD pipeline), e.g., using Jenkins or Bitbucket Pipelines.
  • Always use the latest releases by regularly upgrading the dependencies.
  • Write clear and concise logs that are easy to read and parse.
  • Consider applying Domain-Driven Design (DDD) principles.
  • Cache data to improve performance.
  • Secure your APIs.
  • Finally, keep microservices as small as possible, with only one business function per service.

Summary

This post has covered a list of best practices for Spring Boot based microservices; I hope that following them will help you build more robust and successful microservices with Spring Boot. These factors do not have to be strictly followed; however, keeping them in mind allows you to build portable applications and services that can be built and maintained in continuous-delivery environments.

The complete source code of this article is available in the GitHub project.
