In this article, we will learn how to send Java Maven application logs to the Elastic Stack and monitor them. Monitoring application logs is essential for debugging issues, analyzing system performance, and ensuring smooth operation. In this guide, we will build a Java application using Maven and configure structured logging with Logback. We will then set up the Elastic Stack (Elasticsearch, Kibana, and Filebeat) to collect, store, and visualize logs efficiently. By the end of this tutorial, you will have a fully functional logging setup that provides centralized log management, making it easier to analyze and troubleshoot issues.
Prerequisites
- AWS Account with Ubuntu 24.04 LTS EC2 Instance.
- At least 2 CPU cores and 4 GB of RAM for smooth performance.
- Java and Maven installed.
Step #1:Setting Up Ubuntu EC2 Instance
Before we start, ensure that your system is ready with the necessary tools. We’ll need Java, Maven, and the Elastic Stack (Elasticsearch, Kibana, and Filebeat).
Update the Package List to ensure you have the latest versions.
sudo apt update

Install OpenJDK 17, which is required to build and run the Java application.
sudo apt install openjdk-17-jdk -y

Maven is a build automation tool used for Java projects. Install it using the following command.
sudo apt install maven -y

Step #2:Creating a Java Application with Maven
Now that the environment is set up, let’s create a simple Java application using Maven.
First create a directory and navigate to it.
mkdir my-java-app
cd my-java-app

Use the Maven archetype to generate a basic Java project.
mvn archetype:generate -DgroupId=com.example -DartifactId=my-app -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
This command creates a new Maven project in the my-app directory.

You will see a build success message like the one below.

Navigate to the Project Directory.
cd my-app

Step #3:Configuring the Java Application
We’ll now configure the Java application to include logging and an HTTP server.
Open the pom.xml file and add dependencies for SLF4J (a logging facade) and Logback (a logging implementation). Also, configure the Maven plugins for building the project.
nano pom.xml

Modify it as shown below.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>my-app</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>my-app</name>
  <url>http://maven.apache.org</url>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <!-- Add SLF4J and Logback -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>2.1.0-alpha1</version>
    </dependency>
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
      <version>1.5.18</version>
    </dependency>
  </dependencies>
  <properties>
    <maven.compiler.source>17</maven.compiler.source>
    <maven.compiler.target>17</maven.compiler.target>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.14.0</version>
        <configuration>
          <source>17</source>
          <target>17</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.7.1</version>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
          <archive>
            <manifest>
              <mainClass>com.example.App</mainClass>
            </manifest>
          </archive>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

Explanation of the code:
- Project Metadata: defines the project information (groupId, artifactId, version, packaging, name, url). The project is named my-app and will be packaged as a JAR.
- Dependencies:
  - JUnit (3.8.1): for unit testing (scope: test means it is only used during testing).
  - SLF4J API (2.1.0-alpha1): a logging facade.
  - Logback (1.5.18): a logging implementation.
- Compiler Properties: specifies Java 17 as the source and target version.
- Build Configuration:
  - Maven Compiler Plugin (maven-compiler-plugin): compiles the Java code with Java 17.
  - Maven Assembly Plugin (maven-assembly-plugin): creates a jar-with-dependencies including all required libraries, sets com.example.App as the main class for execution, and runs during the package phase.
Next open the App.java file.
nano src/main/java/com/example/App.java

Replace the contents of src/main/java/com/example/App.java with the following code to create an HTTP server and log requests.
package com.example;

import com.sun.net.httpserver.HttpServer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class App {
    private static final Logger logger = LoggerFactory.getLogger(App.class);

    public static void main(String[] args) throws IOException {
        logger.info("Starting the application...");

        // Create an HTTP server listening on port 8080
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/", exchange -> {
            // Get the client's IP address and port
            String clientAddress = exchange.getRemoteAddress().getAddress().getHostAddress();
            int clientPort = exchange.getRemoteAddress().getPort();

            // Get the request URI
            String requestURI = exchange.getRequestURI().toString();

            // Get the Host header (e.g., "65.2.182.24:8080")
            String hostHeader = exchange.getRequestHeaders().getFirst("Host");

            // Construct the full URL
            String fullUrl = "http://" + hostHeader + requestURI;

            // Ignore requests for /favicon.ico
            if (requestURI.equals("/favicon.ico")) {
                exchange.sendResponseHeaders(404, -1); // Return 404 for favicon requests
                return; // Do not log or process further
            }

            // Log the request with client details and full URL
            String requestMethod = exchange.getRequestMethod();
            logger.info("Received {} request for URL: {} from {}:{}", requestMethod, fullUrl, clientAddress, clientPort);

            // Write "Hello, World!" to the response
            String response = "Hello, World!";
            exchange.sendResponseHeaders(200, response.length());
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(response.getBytes());
            }

            // Log the response
            logger.info("Sent response: {}", response);
        });

        // Start the server
        server.start();
        logger.info("Server started on port 8080");

        // Keep the application running
        logger.info("Application is running. Waiting for incoming requests...");
    }
}

Explanation of the code:
- Logging Setup: uses SLF4J with Logback for structured logging (logger.info()).
- HTTP Server Initialization: creates an HTTP server on port 8080 using HttpServer.create() and defines a request handler for the root path (/).
- Request Handling & Logging:
  - Retrieves the client's IP address and port, the full URL of the request, and the request method (GET, POST, etc.).
  - Ignores /favicon.ico requests by returning 404.
  - Logs details of the incoming request.
- Response Handling: sends an HTTP 200 response with the text "Hello, World!" and logs the sent response.
- Server Execution: starts the server (server.start()) and keeps running, waiting for requests.
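One subtlety worth noting: the handler sizes the Content-Length header with sendResponseHeaders(200, response.length()), i.e. the character count, which equals the byte count only for ASCII text such as "Hello, World!". For non-ASCII responses, the byte length of the encoded body is what the header needs. A minimal, self-contained sketch of the difference (the strings here are illustrative):

```java
import java.nio.charset.StandardCharsets;

public class ContentLengthDemo {
    // Returns the correct Content-Length (in bytes) for a UTF-8 response body.
    static int contentLength(String body) {
        return body.getBytes(StandardCharsets.UTF_8).length;
    }

    public static void main(String[] args) {
        String ascii = "Hello, World!";
        String accented = "Héllo, Wörld!";
        // For pure ASCII, the character count and byte count agree...
        System.out.println(ascii.length() + " chars, " + contentLength(ascii) + " bytes");       // 13 chars, 13 bytes
        // ...but not for non-ASCII text, so size the header from the encoded bytes.
        System.out.println(accented.length() + " chars, " + contentLength(accented) + " bytes"); // 13 chars, 15 bytes
    }
}
```

If you extend the tutorial app to return non-ASCII content, passing response.getBytes().length to sendResponseHeaders avoids truncated responses.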
Create the resources directory and add a Logback configuration file.
mkdir -p src/main/resources

nano src/main/resources/logback.xml

Add the following Logback configuration.
<configuration>
  <!-- Appender to log to the console -->
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Appender to log to a file -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/my-app.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Root logger to log to both console and file -->
  <root level="info">
    <appender-ref ref="CONSOLE" />
    <appender-ref ref="FILE" />
  </root>
</configuration>

Explanation of the code:
- Console Appender (CONSOLE): logs messages to the console (stdout) using the pattern format [Timestamp] [Log Level] [Logger Name] - [Message]. Example log output: 2024-03-21 12:00:00 INFO com.example.App - Server started on port 8080
- File Appender (FILE): logs messages to a file (logs/my-app.log) using the same log format as the console appender.
- Root Logger Configuration: uses log level info (logs messages of level INFO and higher, such as WARN and ERROR) and outputs logs to both the console (CONSOLE) and the file (FILE).
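Because every appender uses the same fixed pattern, the resulting lines have a predictable shape, which is what makes them easy for tools like Filebeat to ship and for you to post-process. As a sketch, here is how such a line could be split back into its fields with the standard regex and date-time APIs (the sample line is illustrative, not real output):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogLineParser {
    // Matches: timestamp, level (padded to 5 chars by %-5level), logger name, message
    static final Pattern LINE = Pattern.compile(
        "^(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}) (\\w+)\\s+(\\S+) - (.*)$");
    static final DateTimeFormatter TS = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Returns [timestamp, level, logger, message], or null if the line doesn't match.
    static String[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) return null;
        LocalDateTime.parse(m.group(1), TS); // throws if the timestamp field is malformed
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }

    public static void main(String[] args) {
        String sample = "2024-03-21 12:00:00 INFO  com.example.App - Server started on port 8080";
        String[] parts = parse(sample);
        System.out.println(parts[1] + " from " + parts[2] + ": " + parts[3]);
    }
}
```

This is only a local sanity check; in the rest of the tutorial, Filebeat and Elasticsearch do this kind of parsing and indexing for you.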
Step #4:Build the Application
Use Maven to build the project and create a JAR file.
mvn clean package

You will see a build success message like the one shown below.

Step #5:Setting Up Elastic Stack for Log Monitoring
Now that the Java application is running and generating logs, let’s set up the Elastic Stack to monitor these logs.
Import the Elasticsearch GPG key.
curl -fsSL https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg

Add the Elasticsearch repository.
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list

Now let's update the package list again so that the newly added repository is picked up by the system's package sources.
sudo apt update

Install Elasticsearch.
sudo apt install -y elasticsearch

Modify Elasticsearch configuration for remote access.
sudo nano /etc/elasticsearch/elasticsearch.yml

Find the network.host setting, uncomment it, and set it to 0.0.0.0 to bind to all available IP addresses. Also uncomment the discovery section to specify the initial nodes for cluster formation.
network.host: 0.0.0.0
discovery.seed_hosts: []

For a basic setup (not recommended for production), disable security features.
xpack.security.enabled: false

Save and exit the editor.
Enable and start Elasticsearch.
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch

Check the status of Elasticsearch to ensure it is running.
sudo systemctl status elasticsearch

Send a GET request to check if Elasticsearch is running and responding. If successful, you should see a JSON response with cluster information.
curl -X GET "localhost:9200"

You can also access it in a browser at http://<your-server-ip>:9200, where 9200 is the default Elasticsearch port.

Kibana provides visualization for Elasticsearch data. Install Kibana on the system.
sudo apt install -y kibana

Open the Kibana configuration file for editing.
sudo nano /etc/kibana/kibana.yml

Uncomment and adjust the following lines to bind Kibana to all IP addresses and connect it to Elasticsearch.
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]

Enable and start Kibana.
sudo systemctl enable kibana
sudo systemctl start kibana

Check the status of Kibana.
sudo systemctl status kibana

Access the Kibana interface by navigating to http://<your-server-ip>:5601 in your web browser. This will open the Kibana dashboard where you can start exploring your data.

Filebeat collects and forwards log data to Elasticsearch or Logstash. Install Filebeat on the system.
sudo apt install -y filebeat

Edit the Filebeat configuration file to ship logs from logs/my-app.log to Elasticsearch.
sudo nano /etc/filebeat/filebeat.yml

Update the filebeat.inputs and output.elasticsearch sections as shown below. Change the input type to log, change enabled from false to true, comment out the id: my-filestream-id line, and set the path to our custom log file.
filebeat.inputs:
- type: log
enabled: true
paths:
- /home/ubuntu/my-java-app/my-app/logs/my-app.log

output.elasticsearch:
hosts: ["localhost:9200"]

Test the configuration.
sudo filebeat test config

Apply Filebeat setup changes.
sudo filebeat setup

Start and enable the Filebeat service.
sudo systemctl enable filebeat
sudo systemctl start filebeat

Check the status of Filebeat.
sudo systemctl status filebeat

Ensure Elasticsearch is receiving data from Filebeat by checking the indices.
curl -XGET "localhost:9200/_cat/indices?v"
You should see output indicating the presence of indices created by Filebeat.

Step #6:Run the Application
Execute the JAR file to start the application.
java -jar target/my-app-1.0-SNAPSHOT-jar-with-dependencies.jar

Open your browser and navigate to http://<Public-IP-Address>:8080 to access the application. You will see the Hello, World! message.

The logs will be written to both the console and the logs/my-app.log
file.

cat logs/my-app.log
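Even before the logs reach Kibana, the flat file can be triaged directly, for example to surface only the entries that need attention. A small helper sketch that keeps WARN and ERROR lines in the log format configured above (the sample lines are illustrative, not real application output):

```java
import java.util.List;
import java.util.stream.Collectors;

public class LogTriage {
    // Keeps lines whose level field is WARN or ERROR
    // (line shape: "<date> <time> <LEVEL> <logger> - <message>").
    static List<String> problems(List<String> lines) {
        return lines.stream()
                    .filter(l -> l.matches("^\\S+ \\S+ (WARN|ERROR)\\s.*"))
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> sample = List.of(
            "2024-03-21 12:00:00 INFO  com.example.App - Server started on port 8080",
            "2024-03-21 12:00:05 WARN  com.example.App - Slow request detected",
            "2024-03-21 12:00:10 ERROR com.example.App - Failed to write response");
        // Prints only the WARN and ERROR entries
        problems(sample).forEach(System.out::println);
    }
}
```

To run it against the real file, read the lines with java.nio.file.Files.readAllLines(Path.of("logs/my-app.log")) and pass them to problems(); a simple grep achieves the same thing from the shell.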

Step #7:Visualize Logs in Kibana
Now go back to Kibana. Scroll down and click on the Logs option under Observability in the left-hand navigation menu. If the menu is collapsed, click the Expand icon at the bottom left to reveal the options.

Kibana displays logs data from the last 15 minutes, visualized as a histogram along with individual log messages below. (You may need to adjust the time range.)

You can even drill into the details of your Java logs, including the cloud provider and other metadata that Filebeat attaches to each entry.


Conclusion:
By following this guide, we successfully built a Java application using Maven and implemented structured logging with Logback. We then set up Filebeat to collect logs and forward them to Elasticsearch, where they can be analyzed using Kibana. This setup provides a scalable and efficient way to monitor application logs, helping developers gain insights and quickly resolve issues. With Elastic Stack in place, you can extend the system further by adding alerts, dashboards, and advanced log analytics.
Related Articles:
How to Install Elastic Stack on Ubuntu 24.04 LTS
Install Elastic Stack on Amazon Linux 2
Send Apache2 Logs to Elastic Stack and Filebeat