Rasmus Olsson

Quality matters – Setting up SonarQube

July 17, 2020

As a developer, you regularly come across bad code. What counts as bad code is a broad topic, and many books have been written covering techniques for writing better code… and reading them is a great way to learn how to do it.

The main problem I’ve experienced when reading about these techniques is that, in practice, the single outstanding reason for not writing good code is not that you don’t know how to do it, but rather that you don’t take, or don’t have, the time to do it.

You may have a hard deadline: if you don’t deploy this on Monday morning, your business will lose money. That’s a valid argument. So you, the product manager or the scrum master point out that a task should be created in the backlog to clean up that technical debt after the release. This is also a great approach! We have now covered both the business needs and the technical debt. But what about the technical debt that you as a developer may have missed, the team conventions that you didn’t follow, or that OWASP Top 10 issue that you didn’t know about? How can we minimize the risk of these slipping through without spending extra time during a rushed project?

One tool that can help here is SonarQube.

SonarQube is a continuous code inspection tool. The idea is to statically analyze the source code and surface hints, quality improvements, convention violations, and potential bugs. You can, and have to, configure SonarQube to meet your team’s expectations and conventions. SonarQube supports many languages and is easy to make continuous by integrating it with your CI pipelines.

To get started, we need to set up the SonarQube server. The docker-compose file below will set up SonarQube together with the PostgreSQL database it depends on.

version: '3'

services:
  sonarqube:
    image: sonarqube
    expose:
      - 9000
    ports:
      - '127.0.0.1:9000:9000'
    networks:
      - sonarnet
    environment:
      - SONARQUBE_JDBC_URL=jdbc:postgresql://db:5432/sonar
      - SONARQUBE_JDBC_USERNAME=sonar
      - SONARQUBE_JDBC_PASSWORD=sonar
    volumes:
      - sonarqube_conf:/opt/sonarqube/conf
      - sonarqube_data:/opt/sonarqube/data
      - sonarqube_extensions:/opt/sonarqube/extensions
      - sonarqube_bundled-plugins:/opt/sonarqube/lib/bundled-plugins

  db:
    image: postgres
    networks:
      - sonarnet
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
    volumes:
      - postgresql:/var/lib/postgresql
      - postgresql_data:/var/lib/postgresql/data

networks:
  sonarnet:

volumes:
  sonarqube_conf:
  sonarqube_data:
  sonarqube_extensions:
  sonarqube_bundled-plugins:
  postgresql:
  postgresql_data:

Copy and paste this into a docker-compose.yml file and run:

docker-compose up

Great, we now have the server running!
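
If you want to verify that the server really is up, one way is to poll the system status endpoint. This is just a quick sketch assuming the default port from the compose file above; the web UI is also reachable at http://localhost:9000.

curl http://localhost:9000/api/system/status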

The next step is to upload the analysis data. The SonarQube team provides a concept called “scanners” that will help with that.

For .NET Core we can use dotnet-sonarscanner, which I will show you in this example. For more scanners, check out https://docs.sonarqube.org/latest/analysis/overview/

To install the .NET Core scanner, we can simply add it through the dotnet CLI.

dotnet tool install --global dotnet-sonarscanner
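
To confirm the tool is available, you can list the globally installed dotnet tools and look for dotnet-sonarscanner in the output:

dotnet tool list --global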

The next step is to use the scanner to upload data to the SonarQube server.

To start, we tell the scanner to attach itself to the Roslyn compiler. We need to provide /k:"project-key", where project-key is the name of the project under which the SonarQube server will place its analysis.

dotnet-sonarscanner begin /k:"project-key"
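
If your server is not running on the default localhost:9000, or if it requires authentication, the begin step also accepts these as properties. A sketch, where the URL and token are placeholders for your own setup:

dotnet-sonarscanner begin /k:"project-key" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="your-token"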

We trigger the build.

dotnet build

And lastly, we end the scanner session and upload the result. By default, the scanner will look for the server at localhost:9000, which is the same port specified in the docker-compose file.

dotnet-sonarscanner end
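
To make the analysis continuous, as mentioned earlier, these commands can be dropped into a CI step. A minimal sketch of such a script, assuming the scanner tool is installed on the build agent and using the project key from above:

#!/bin/sh
# analyze.sh – run a SonarQube analysis as part of a CI build
dotnet-sonarscanner begin /k:"project-key"
dotnet build
dotnet-sonarscanner end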

That’s pretty much it. You can now start configuring the rules together with your team. Note, though, that you may find preconfigured rule templates that can give you a head start. Configuring rules tends to be an iterative process, so don’t set your hopes too high for the first iteration.

https://www.sonarqube.org/

Happy Coding!
