Friday, November 8, 2019

Streaming Spring Boot logs to the ELK stack

In my previous blog, we installed the ELK stack on Windows 10, pushed messages from the input console to Elasticsearch, and finally viewed them on the Kibana server.

I will write a separate blog on why we need ELK.

In this blog, I'll show you how we can push Spring Boot application logs directly to Elasticsearch using Logstash and then analyze them on Kibana. If you don't know how to install ELK on Windows 10, refer to my previous blog and start the Elasticsearch and Kibana servers.

Prerequisite


  • Elasticsearch and Kibana running on your machine
  • Basic knowledge of Spring Boot applications


If you don't want to start an application from scratch, you can download a Spring Boot application from my GitHub repository as well.

I am assuming that the Elasticsearch and Kibana servers are running on your machine and that you have a fair idea of how to start the Logstash server and what a Logstash conf file is.

So, to push Spring Boot logs continuously to Elasticsearch, we have to open a TCP port on the Logstash server. For that, we create a Logstash config file (say elklogstash.conf) under the ${LOGSTASH_HOME}/config directory, specifying under the input section which TCP port to listen on and under the output section where to push the data once it is received.

For simplicity, I am skipping the filter section, as it is optional.

elklogstash.conf
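
The screenshot of the conf file is not reproduced here; a minimal sketch, based on the TCP port (4560) and index name (elkbootlogs) referenced below and assuming Elasticsearch on localhost:9200, would look like this:

input {
  tcp {
    # Listen for the JSON lines sent by the Spring Boot app on TCP port 4560
    port => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    # Push received events to the local Elasticsearch server
    hosts => ["localhost:9200"]
    index => "elkbootlogs"
  }
}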




Now start the Logstash server by passing the newly created conf file.
   bin\logstash -f .\config\elklogstash.conf



Cool! Now the Logstash server is also up and running, and if you observe the log, you will see that it is listening on port 4560 as mentioned in the conf file. Configure the newly created index (elkbootlogs) on Kibana as we did during the ELK setup.

Now let's make some changes to the Spring Boot application so that it can push all the logs to TCP port 4560.

For this tutorial, I am using the spring-logger project from my GitHub repository.

Add the below dependency to the pom.xml file. We need the Logstash encoder to encode messages.

<!-- Added for Logstash encoder -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.2</version>
</dependency>

Open the logback-spring.xml file under the resources folder and create a new appender (say elk). The task of this appender is to push logs to the destination TCP socket, and the encoder under this appender must be the LogstashEncoder.

<appender name="elk" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:4560</destination>
    <!-- encoder is required -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />

</appender>

Add the new appender at the root level:

<!-- LOGGING everything at INFO level -->
<root level="info">
    <appender-ref ref="RollingFile" />
    <appender-ref ref="Console" />
    <appender-ref ref="elk" />
</root>

Save all the files and start your application. We are done with all the setup; it's time to check whether all the changes work properly.

Open Kibana in your browser (http://localhost:5601) and select your index under the Discover tab. You will see all the logs populating in Kibana as well.



Congratulations! Our configuration is working absolutely fine and is pushing logs to Elasticsearch.

You can download the source code from here; the ELK code changes are under the elkstack branch.






Sunday, November 3, 2019

ELK (Elasticsearch, Logstash, and Kibana) Installation on Windows 10

In this blog, I'll show you how we can install ELK, that is, Elasticsearch, Logstash, and Kibana, on our Windows 10 machine.

These three products are most commonly used together for log analysis. Using the ELK stack, we can achieve centralized logging, which helps in identifying problems.

ELK is heavily used in microservices architectures where your Docker images are running on thousands of pods and you can't go to each pod to trace the logs.

Logstash:


It is the data collection pipeline tool and the first component of the ELK stack; it collects data inputs and feeds them into Elasticsearch. It collects various types of data from different sources, all at once, and makes them available immediately for further use.

Elasticsearch:


It is a NoSQL database based on the Lucene search engine and built with RESTful APIs. It is a highly flexible and distributed search and analytics engine. It provides simple deployment, maximum reliability, and easy management through horizontal scalability, and it offers advanced queries for detailed analysis while storing all the data centrally for quick search of documents.

Kibana:


Kibana is a data visualization tool. It is used for visualizing Elasticsearch documents and helps developers gain immediate insight into them. The Kibana dashboard provides various interactive diagrams, geospatial data, timelines, and graphs to visualize the complex queries run against Elasticsearch. Using Kibana, you can create and save custom graphs according to your specific needs.

Download Elasticsearch, Logstash, and Kibana from the Elastic website, then unzip all three archives.

Install Elasticsearch

  • Open the Elasticsearch folder and go to its bin folder.
  • Run elasticsearch.bat to start the Elasticsearch server.

  • Once started, go to the browser and open http://localhost:9200

Install Kibana

  • Open the Kibana folder and go to its bin folder.
  • Run kibana.bat to start the Kibana server.

  • Once started, go to the browser and open http://localhost:5601

Install Logstash

  • Logstash is a pipeline that pushes data to Elasticsearch, so before starting Logstash we have to create a config file.
  • A Logstash config file has three parts, i.e. input, filter (optional) & output.
  • Create a logstash.conf file under the ${LOGSTASH_HOME}/config folder, as shown in the sample below. It will simply take the input from the console and push it to Elasticsearch.
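
The screenshot of the file is not shown here; a minimal sketch that reads from the console and writes to the logstashdemo index (the index name used later in this post) would look like this:

input {
  stdin { }
}
output {
  elasticsearch {
    # Assumes Elasticsearch is running locally on the default port
    hosts => ["localhost:9200"]
    index => "logstashdemo"
  }
}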

  • Run the below command to start the Logstash server; it will wait for input and push it to Elasticsearch.

bin\logstash -f .\config\logstash.conf

  • Once started, go to the browser and open http://localhost:9600
  • To push some data, I copied the contents of a log file from one project into the console.

  • Once done, go to the Kibana portal: Management -> Index Patterns -> Create Index Pattern. You will observe that logstashdemo, the index we set in the logstash.conf file, is already present. Now define the Kibana index pattern by entering the same name and click the Next step button.

  • Select @timestamp as the default time field and click Create Index Pattern.

  • The index has been created. Now, to view the data, go to the Discover tab and click on message (all the logs are pushed under the message field).



Congratulations! We are done with the ELK setup on Windows 10.


How to create a Docker image and run a Java app in Docker Engine

In this blog, I am going to share my knowledge of creating a Docker image and how we can run it in Docker Engine.

Prerequisite

  • Basic knowledge of Docker
  • Docker must be running on your machine.
  • Good to be aware of Spring Boot applications.
I already have a Spring Boot application in my IntelliJ that exposes one endpoint, /users/{id}. We will see how we can package and run this application in a Docker container.



We need to create a file named Dockerfile to hold the Docker instructions (check the above image).
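
If the image is hard to read, a minimal sketch of such a Dockerfile, consistent with this post (OpenJDK 8 base image, application listening on 8085) and assuming the jar is built under target/, would be:

# Pull the OpenJDK 8 base image from Docker Hub if it is not present locally
FROM openjdk:8-jdk-alpine
# Copy the Spring Boot fat jar into the image (the jar name is an assumption)
COPY target/docker-spring-ehcache.jar app.jar
# The application listens on port 8085 inside the container
EXPOSE 8085
ENTRYPOINT ["java", "-jar", "/app.jar"]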


Now go to the terminal and check whether Docker is running on your machine.
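
For example, the following command will fail with an error if the Docker daemon is not running:
docker info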


Run docker build to create the image using the command below.
docker build -f Dockerfile -t docker-spring-ehcache .

The above command will execute all the instructions that we have mentioned in our Dockerfile, like pulling OpenJDK 8 from Docker Hub if it does not exist locally.

Let's check whether our image was created by listing all Docker images.
docker images


Great! Our image is present in the local image list. Let's run it.
docker run -p 8070:8085 docker-spring-ehcache

Over here, we are telling Docker to start the application and map the Docker container port 8085 to our local port 8070.

Note: Make sure your application starts on 8085 inside the Docker container, else the port mapping won't work. Spring Boot by default starts the application on port 8080, so please specify server.port=8085 in the application.properties file.
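
For reference, the relevant application.properties entry would be:

# Run the application on 8085 inside the container so the -p 8070:8085 mapping works
server.port=8085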



Now go to your browser and hit the endpoint (e.g. http://localhost:8070/users/1 for the /users/{id} endpoint) to see whether we are getting a response from localhost:8070.



Happy Coding!!!

Friday, November 1, 2019

Sonar Integration with Maven

In my previous blog, we have already seen how to set up the SonarQube server on Windows 10, and we have also seen how to generate a Sonar report using sonar-scanner. In this blog, I'll show you how to generate a Sonar report by adding the Sonar plugin configuration to a Maven project.

Steps to set up Sonar in Maven

  • We have to configure pluginManagement and a profile for Sonar in the pom.xml file.
  • Add the below pluginManagement section to your pom.xml:
<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.1</version>
        </plugin>
        <plugin>
            <groupId>org.sonarsource.scanner.maven</groupId>
            <artifactId>sonar-maven-plugin</artifactId>
            <version>3.6.0.1398</version>
        </plugin>
        <plugin>
            <groupId>org.jacoco</groupId>
            <artifactId>jacoco-maven-plugin</artifactId>
            <version>0.8.4</version>
        </plugin>
    </plugins>
</pluginManagement>


  • Add the below profile to the pom.xml file:

<profiles>
    <profile>
        <id>coverage</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <build>
            <plugins>
                <plugin>
                    <groupId>org.jacoco</groupId>
                    <artifactId>jacoco-maven-plugin</artifactId>
                    <executions>
                        <execution>
                            <id>prepare-agent</id>
                            <goals>
                                <goal>prepare-agent</goal>
                            </goals>
                        </execution>
                        <execution>
                            <id>report</id>
                            <goals>
                                <goal>report</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>


  • Build the project, execute all the tests, and analyze the project with the SonarQube Scanner for Maven:
                 mvn clean verify sonar:sonar

  • Once done, check your SonarQube server; the code analysis report for your project will be available there. My project name was api-gateway, so the report was generated under the name api-gateway.


Code Analysis using SonarScanner on Windows 10

In my previous blog, we have already seen how to set up the SonarQube server on Windows 10. In this blog, I'll show you how to generate a Sonar report using SonarScanner.

Steps to set up SonarScanner


  • Download the SonarScanner CLI, unzip it, and open sonar-scanner.properties, which is under the conf directory.
  • Edit the below lines.
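
The edited lines appeared in a screenshot; typically they look like the following sketch, where the project key and source path are placeholders for your own project:

# Point the scanner at the local SonarQube server
sonar.host.url=http://localhost:9000
# Project key under which the analysis will appear in SonarQube (placeholder)
sonar.projectKey=my-project
# Location of the source files to analyze (placeholder)
sonar.sources=src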

  • Now, go to your project folder, open a command prompt, and run sonar-scanner.bat.

  • It will run the analysis and then post the result to the SonarQube server at http://localhost:9000/, with the project name as the Sonar key that we configured in the sonar-scanner.properties file.


  • You can check the JUnit test code coverage as well by clicking on Coverage.

Happy Coding..!!!

SonarQube setup on Windows 10

Overview

SonarQube is an automatic code review tool to detect bugs, vulnerabilities, and code smells in your code. It can integrate with your existing workflow to enable continuous code inspection across your project branches and pull requests.

Prerequisite

Make sure you have Java 11 or a higher version installed on your Windows machine.


Steps to set up SonarQube

  • Download the Community Edition from https://www.sonarqube.org/downloads/
  • Extract it and go to the bin folder.
  • Choose windows-x86-32 or windows-x86-64 based on your machine configuration.
  • Run StartSonar.bat, which will start the SonarQube server.
  • Open the browser and hit http://localhost:9000

  • If you want, you can start the SonarQube server on a different port by updating the port number (e.g. sonar.web.port=9070) in sonar.properties, which is present under the conf directory.
  • You can log in to the portal using the default credentials (admin:admin).

Congratulations! The SonarQube server is up and running on localhost:9000.

Saturday, October 12, 2019

Postman API – Tips and Tricks

I have seen many people using the Postman tool, but very few of them know how to use all the features of the Postman application. So, in this article, I am going to share a few tips and tricks that can be really helpful in our API testing.

  • Set Environment Variables


Let's assume we have an endpoint that returns some data after authenticating yourself by passing a username/password.

Now, you have to test this endpoint, which is deployed in multiple environments, like a local machine, a test environment, and an SIT environment as well. Also, the credentials are different for each environment.

So, how are we going to test it?

Most of the time, I have seen that people create one request per environment, or create one request and then modify the existing username/password and hostname to point to a different environment.

The simple solution is to set the changing parameters as environment variables and switch environments to test different regions.

E.g. http://<HOSTNAME:PORT>/token with <username>/<password> as basic authentication.
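
For instance, with variables named hostname, port, username, and password defined per environment (these names are illustrative), the request URL becomes:

http://{{hostname}}:{{port}}/token

with {{username}} and {{password}} entered in the Basic Auth fields of the Authorization tab. Postman resolves each {{...}} placeholder from the active environment.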


How to set environment variables?

  • Click "Manage Environments" icon in the upper right corner of the Postman app.
  • Click the Add button.
  • Click "Environment" and enter a name for the new environment.
  • Add the variables you want to save as key-value pairs.
  • Click the Add button.





Selecting an active environment

  • Click the dropdown menu in the upper right corner of the Postman app to select an active environment, or type in the environment name.
  • Once you select an environment, you can access variables in the active environment scope.

All the parameters change as we switch from one environment to another.

  • Set Global variable


Global variables provide a set of variables that are available in all scopes. You can have multiple environments, but only one can be active at a time, alongside the single set of global variables.

To manage global variables, click the gear icon in the upper right corner of the Postman app and select "Manage Environments".

Click the Globals button at the bottom of the modal to bring up a key-value editor to add, edit, and delete global variables.




 

  • Set Response to the global variable


There are many scenarios where we want to use part of one API response as a request parameter in a second request, like getting a token from one API and then passing this token to another endpoint to access its resources.

Assume I am hitting an API to get an access token and then setting access_token as a global variable.




  • Once you get the access_token in your response, go to the Tests tab.
  • Click on "Set a global variable" or add the script shown below:

    let response = pm.response.json();
    pm.globals.set("access_token", response.access_token);

  • To access this variable in some other request, just pass the access_token parameter as {{access_token}}, as we did for dev_hostname.
E.g. 
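
If the image is not visible: assuming the second endpoint expects a bearer token, the header value would be set as:

    Authorization: Bearer {{access_token}}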


  • How to pass a path variable dynamically

I guess this is one of the major mistakes I have seen people make, i.e., passing a hardcoded path variable in the URL request.

The below diagram shows the correct way of passing it. Using this, you can update the param values very easily.

http://hostname/users/:userId

To edit the path variable, click on Params; you will see it already entered as the key. Update the value as needed.



  • Save custom headers

You can save commonly used headers together in a header preset. Under the Headers tab, you can add a header preset to your request by selecting "Manage Presets" from the Presets dropdown on the right.





  • Check all requests/responses sent to the API

Go to View -> Show Postman Console


