AWS, Deploy a Lambda API service to upload images to S3 with presigned URLs, managed with a pipeline

In this entry I would like to share an example showing how to deploy a serverless API service in AWS using CodePipeline.

In this scenario, a static web application consumes the published API to upload images through a Java Lambda function and to get a presigned URL that grants temporary access to the file stored in S3. This way the bucket stays private while still providing temporary access to its files.

The next diagram shows the architecture for this solution:

As you can see, there is a Cloud9 development environment and a CI/CD pipeline that is triggered by CodeCommit when changes are pushed to the repository; CodeBuild then builds the application, and finally the API is published to be consumed by the web site.

So let’s get started.
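The presigned-URL idea itself can be tried out from the command line before wiring it into a Lambda. As a sketch with the AWS CLI (the bucket and key names are placeholders, and configured AWS credentials are assumed):

```shell
# Generate a presigned GET URL for a private S3 object, valid for 5 minutes.
# Bucket and key are example values; the Java Lambda does the equivalent
# through the AWS SDK presigner.
aws s3 presign s3://my-images-bucket/uploads/photo.png --expires-in 300
```

The command prints a URL that anyone can use to fetch the object until it expires, without making the bucket public.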

Continue reading

AWS ECS, Run Java microservices using docker containers in ECS

In this entry I’m going to show an example of running Java microservices in containers on AWS. I’m using a simple web application whose source code was provided in my last training; you can download it from my GitHub repository.

Please note that refactoring a monolithic application can be a very complex task; it depends on each context and on the analysis done before decoupling and modernizing any component. As a reference, this is the initial architecture and how it should look after the modernization to containers.

AS-IS, a highly available environment for a monolithic Java application.

TO-BE, the highly available environment for the containerized Java application.

As in my previous entries, all the information was taken from my last training, so some components were already provided:

  • Development IDE
  • Development Pipeline
  • An ECS cluster
  • An RDS instance
  • A Custom VPC

Let’s get started.

Continue reading

AWS EC2, Implement CI/CD pipeline for a monolithic Java application

In this entry I would like to share an example showing how to automate the build and deployment of a Java application in AWS using CodePipeline. The next steps are taken from my last training, and I can’t share all the detailed steps because my user has restricted permissions, but I consider this enough if you know the basics, like the VPC configuration and the IAM roles. So, let’s get started.

The next diagram shows the architecture to implement: the source code is located in CodeCommit and will be edited using Cloud9; after the changes are pushed to the repository, the application artifacts are built with CodeBuild and deployed with CodeDeploy, which manages the load balancer to distribute the traffic and deploys the application onto the Auto Scaling group servers.
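As a sketch, the CodeBuild step is usually driven by a buildspec.yml at the repository root. The runtime version and artifact paths below are assumptions for a typical Maven project, not the exact file from the training:

version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto11    # assumed JDK; adjust to your project
  build:
    commands:
      - mvn clean package
artifacts:
  files:
    - target/*.war        # example artifact path
    - appspec.yml         # deployment descriptor consumed by CodeDeploy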

Continue reading

AWS EC2, build a highly available architecture for a Java web application

In this entry I would like to share my recent training activity in AWS, where I learned how to build and deploy the following architecture for a Java web application on highly available infrastructure.

As you can see, it has an Application Load Balancer (ALB) configured to use an Auto Scaling Group (ASG), which launches EC2 instances in a private subnet; every EC2 instance connects to an RDS database (MySQL).

This time I can’t share all the detailed steps because I have a restricted user that doesn’t have enough permissions to read the security groups or the VPC configuration, among other elements like the instance profile; for that reason I’m going to describe it in a summarized way to keep it simple.

Let’s get started.

Continue reading

Red Hat Toolkit to modernize a Java application

Browsing the Red Hat Developer web page, I found a tool to guide the modernization of old Java applications. It comes in two flavors: Migration Toolkit for Applications (MTA) and Migration Toolkit for Runtimes (MTR).

Here is the provided description of each one:

MTA: Simplify the modernization of your legacy applications and reduce risks with the migration toolkit for applications – included with a Red Hat OpenShift subscription. This tool gives project leads and migration teams insight and alignment as they move to Red Hat OpenShift— whether at the portfolio or application level. It provides a simpler, faster way to modernize your applications for the cloud.

MTR: The migration toolkit for runtimes is an assembly of tools that support large-scale Java application modernization and migration projects across a broad range of transformations and use cases. It accelerates application code analysis and code migration, supports effort estimation, and helps you move applications to the cloud and containers.

What is the difference?

Here are the recommendations from the product page:

If you have a Red Hat OpenShift subscription, we recommend using the migration toolkit for applications. It offers additional capabilities that focus on application portfolio management and collaborating across teams for modernization projects.

If you don’t have a Red Hat OpenShift cluster but are modernizing applications to run in the cloud and containers, we recommend using the migration toolkit for runtimes. While it doesn’t include capabilities that focus on application portfolio management and cross-team collaboration, it’s a powerful tool for analyzing application code, estimating effort, and accelerating code migration.

In this entry I’m going to test the MTR tool, so let’s get started.

Please keep this in mind: “The migration toolkit for runtimes simplifies the migration and modernization of Java applications by examining application artifacts, including project source directories and application archives. The tool then produces an HTML report highlighting areas needing changes.”
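To give an idea of what an analysis run looks like from the command line, here is a sketch; the binary name and flags can vary between MTR releases, so treat the paths and target as assumptions:

```shell
# Analyze a project source directory and generate the HTML report.
# Input/output paths and the migration target are example values.
./mtr-cli --input ./legacy-app --output ./report --target eap7
```

The generated report folder contains an index.html summarizing the issues found and the estimated migration effort.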

Continue reading

Run a SonarQube analysis locally on a Java project (version 19) using a container

Recently I had to run a static code analysis on a Maven Java project built with JDK 19, but I couldn’t because I have an old SonarQube server (version 7.8) that doesn’t support that version of the Java language.

To get around this limitation I used a newer version of SonarQube running in a container, and I want to share the steps here. Please note that I used Podman instead of Docker.
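In outline, the approach looks like this; the image tag is an example, and the token placeholder must be replaced with one generated from the SonarQube UI (older scanner versions use the sonar.login property instead of sonar.token):

```shell
# Start a SonarQube community image with Podman, exposing the web UI on 9000
podman run -d --name sonarqube -p 9000:9000 docker.io/library/sonarqube:community

# Once the server is up, run the Maven analysis against it
mvn clean verify sonar:sonar \
  -Dsonar.host.url=http://localhost:9000 \
  -Dsonar.token=<your-generated-token>
```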

Continue reading

Creating a distributable JAR file with Maven

In this entry I’m going to show how to package a JAR file in a Maven multi-module project, with different alternatives based on my working experience with microservices.

First, let’s take a look at the project structure: demo-jar-assembly is composed of two Maven modules (the Java code isn’t important here, only the configuration files), and module1 is a dependency of module2.

\demo-jar-assembly
|   pom.xml
|
+---module1
|   |   pom.xml
|
\---module2
    |   pom.xml

Let’s look at some alternatives to build the distribution package.
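One common alternative is the maven-shade-plugin, which bundles module2 together with module1 and the rest of its dependencies into a single executable JAR. The plugin version and main class below are illustrative assumptions, not values from the demo project:

<!-- In module2/pom.xml: build a single "fat" JAR including module1 -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <!-- Hypothetical main class; replace with your own -->
            <mainClass>com.example.Main</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>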

Continue reading

Podman, basic steps for a Java project

Podman is an open source utility that can be used to create and maintain containers, and it provides a good alternative to Docker.

In this entry I’m going to show how to execute some basic commands using a Java project. All source code is published in my GitHub repository.

Virtual machine management commands

Let’s take a look at the commands with the help parameter; I think they are very descriptive:

$ podman machine --help 
Manage a virtual machine

Description:
  Manage a virtual machine. Virtual machines are used to run Podman.

Usage:
  podman.exe machine [command]

Available Commands:
  info        Display machine host info
  init        Initialize a virtual machine
  inspect     Inspect an existing machine
  list        List machines
  rm          Remove an existing machine
  set         Sets a virtual machine setting
  ssh         SSH into an existing machine
  start       Start an existing machine
  stop        Stop an existing machine
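For example, on Windows or macOS the typical first run of these commands is the following (default machine settings assumed):

```shell
# Create the VM with default settings, start it, and verify it is running
podman machine init
podman machine start
podman machine list
```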
Continue reading

Docker, using bind mounts in a Java project

In short, a bind mount is a way to map a mount point from the host into the container, meaning the files are accessible from the file system where the container is launched. For example, it could be useful to mount the source code of a web site into the container, make some code changes in the UI, and see them immediately without building the Docker image again.

Cool right?

Well, for Java projects it’s not so cool, because when the source code changes it must be compiled and packaged again to be executed or deployed. But bind mounts can still be useful when there are external files such as configuration properties, or input and output files to read. So, if you are working in a development environment or on your local machine, you may want to consider using bind mounts to easily access those files while coding or testing new features.
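As a sketch, assuming the application reads properties from /demo/config inside the container (the paths and image name are example values, not from the repository):

```shell
# Mount a host folder with configuration files into the container;
# edits made on the host are visible inside the container immediately.
docker run --rm -p 8000:8000 \
  -v "$(pwd)/config:/demo/config" \
  demo-service
```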

In this entry I’m going to show how to apply this in a Maven Java project. All code can be downloaded from my GitHub repository.

Continue reading

Docker basic steps for a Java Project

In this entry I’m going to show how to build a Docker image for a Maven Java project from a simple JAR file and execute it as a microservice.

All code is published in my GitHub repository.

Build an image and run it in a container

First, create a file named Dockerfile at the top of your Maven project folder with the following content:

# OS image
FROM alpine:3.14

# Exposed port
EXPOSE 8000

RUN apk --update-cache add openjdk11
WORKDIR /demo
COPY . .
CMD ["java","-jar", "target/demo-1.0-SNAPSHOT.jar", "start"]

This is a very simple configuration file: the selected base image is Alpine Linux, port 8000 is exposed, and openjdk11 is installed with the package manager. I’ve also set a working directory, and all the content is copied from the Maven folder into the image. Finally, the JAR is executed with the CMD instruction.
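With the Dockerfile in place, the image can be built and run like this (the image tag is an example):

```shell
# Build the image from the project root, then run it mapping the exposed port
docker build -t demo-service .
docker run --rm -p 8000:8000 demo-service
```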

Continue reading