Up and Running with Lacework and Jenkins

In a recent blog post, I talked about how security teams need to focus on collaborating with Developers, Operations, and SREs by bridging the gap with relevant security data to drive testing and automation in CI/CD tooling. In this hands-on blog post, we will put that philosophy into practice by integrating Lacework Container Vulnerability Scanning APIs into a Jenkins pipeline to shift-left and scan images at build time.

Jenkins is about as ubiquitous a product in DevOps tooling as there is. It is extremely versatile for configuring continuous delivery pipelines to build and deploy just about any artifact, and it provides a ton of flexibility for integrating with other products. It is also a great opportunity to “shift-left” and inject security earlier into the software development lifecycle (SDLC) through automated testing, with the outcome that you spend less time, less effort, and potentially significantly less money than you would trying to fix issues in production.

There are already a multitude of articles, books, and online tutorials on Jenkins, CI/CD, and Docker, so rather than try to cover each of those topics in any depth, this article is going to focus on a very simple pipeline design to understand the basics of integrating Jenkins and Lacework.

What we are Going to Build

Before we begin, it is important to understand the high level workflow here. We are going to spin up a Jenkins pipeline that connects to GitHub and builds a Docker image from a Dockerfile, publishes the image to Docker Hub, and then initiates a container vulnerability scan of that image using the Lacework command line interface image.

Jenkins Pipeline

We’ll use Docker to provision Jenkins locally and connect to GitHub, Docker Hub, and Lacework!

No access to Jenkins?

NOT TO WORRY…We’ve got you covered there as well! We have created a Git repo that you will clone in a minute that contains a docker-compose file to provision Jenkins using Docker locally, and after applying a few configurations you will be good to go!

Prerequisites

To run this tutorial you should feel comfortable on the command line and have a basic understanding of Git, GitHub, and Docker. Additionally, there are a few items you’ll need to have installed and at the ready:

  • Git
  • Docker Desktop (which includes docker-compose)
  • A GitHub account
  • A Docker Hub account
  • A Lacework account

Once you have the prereqs checked off, you are good to go and we can get started!

Fork and Clone the Repository

The first thing you will want to do is to fork the example Git repo and then clone it to your workstation with Git and Docker Desktop installed.

Jenkins

You will need to fork the example repo to your GitHub account in order to connect Jenkins to it.

Open a terminal, clone the forked repo, and change directory into jenkins-lacework-tutorial:

$ git clone https://github.com/<YOUR USERNAME>/jenkins-lacework-tutorial.git
$ cd jenkins-lacework-tutorial

 

Lacework API Token

In order to authenticate with Lacework and request an on-demand container vulnerability scan we will need to have an API key and secret.

  1. Login to Lacework
  2. Click on “Settings” > “API Keys”
  3. Click on “Create New”
  4. Create a new API key
  5. Click the “Download” button next to your key and save the JSON file somewhere as we will need it when we configure Jenkins

The contents of your API key contain a “keyId” and “secret” and will look something like this:

{
  "keyId": "ACCOUNT_86858622520DB3B8E6C171247820FA724CDDB19DDDDDDD",
  "secret": "_412a4c080e5c8a2e069a4144444444444"
}
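If you want to sanity-check the key file from the command line before pasting values into Jenkins, a quick way is to parse it with python3. This is just a sketch: the inline sample stands in for your real download, and the file name lacework-api-key.json is an assumption (use whatever name you saved the key under).

```shell
# Stand-in key file with the same shape as the real download
# (replace with the JSON you actually saved from Lacework).
cat > lacework-api-key.json <<'EOF'
{
  "keyId": "ACCOUNT_EXAMPLEKEYID",
  "secret": "_examplesecret"
}
EOF

# Pull out the two values we will paste into Jenkins later:
#   keyId  -> the LW_API_KEY environment variable
#   secret -> the lacework_api_secret credential
LW_API_KEY=$(python3 -c "import json; print(json.load(open('lacework-api-key.json'))['keyId'])")
LW_API_SECRET=$(python3 -c "import json; print(json.load(open('lacework-api-key.json'))['secret'])")
echo "keyId:  $LW_API_KEY"
echo "secret: $LW_API_SECRET"
```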

 

Jenkins with docker-compose

The root of the jenkins-lacework-tutorial repo contains a docker-compose.yml file that we can use to bring up a test environment comprised of a docker network, two docker volumes, Jenkins running in a docker container, and a Docker in Docker (dind) container to run our build jobs.

docker-compose up

Make sure you are in the root directory of the project we cloned above and execute the following command:

$ docker-compose -p lacework up -d
Creating network "lacework_jenkins" with the default driver
Creating volume "lacework_jenkins-docker-certs" with default driver
Creating volume "lacework_jenkins-data" with default driver
Creating lacework_jenkins_1     … done
Creating lacework_docker-dind_1 … done

 

When Jenkins starts up for the first time, it automatically generates an administrator password that we will need in order to log in. We can get that password by running the following command:

$ docker exec lacework_jenkins_1 cat /var/jenkins_home/secrets/initialAdminPassword                 

2f8b8d35d6fd41f9804a6de40a4b847e <— output will return the initial admin password

 

Copy that password, open a web browser, and go to http://localhost:8080

unlock-jenkins

Paste the Administrator password and click “Continue”

Next, click “Install suggested plugins.” This process takes about a minute or so depending on your Internet connection.

jenkins-install-suggested-plugins

After the plugins are installed you have the option to create an admin user, or you can just use the default administrator. Go ahead and click “Skip and continue as admin.”

jenkins-instance-configuration

Next click “Save and Finish.”

jenkins-ready

Now, just click “Restart” to finish the initial setup. (Note: you may need to refresh your browser to get Jenkins to reload.)

After that you should be ready to log in using the admin password we retrieved earlier.

Configuring Jenkins

With Jenkins running locally we are ready to configure the pipeline. The repository that you cloned has a Jenkinsfile that already defines the pipeline we are going to run. That Jenkinsfile makes use of both Environment Variables and Secret Credentials that we will need to set up before configuring the job. Let’s get those going…
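To ground the configuration steps that follow, here is a rough, illustrative sketch of what the three stages in that Jenkinsfile boil down to, expressed as plain shell commands rather than actual pipeline syntax. The image name and tag are assumptions for illustration, the scan command syntax can vary across Lacework CLI versions, and the Jenkinsfile in the repo remains the source of truth:

```shell
# Illustrative only -- not the actual Jenkinsfile. The image name and tag
# are hypothetical; DOCKER_HUB, LW_ACCOUNT, and LW_API_KEY are the values
# we are about to configure in Jenkins.

# Stage 1: build the Docker image from the repo's Dockerfile
docker build -t "$DOCKER_HUB/jenkins-lacework-tutorial:latest" .

# Stage 2: publish the image to Docker Hub
docker push "$DOCKER_HUB/jenkins-lacework-tutorial:latest"

# Stage 3: request an on-demand vulnerability scan via the Lacework CLI
# (check `lacework vulnerability container scan --help` for the exact
# syntax in your CLI version)
lacework vulnerability container scan \
  index.docker.io "$DOCKER_HUB/jenkins-lacework-tutorial" latest
```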

Environment Variables

We are going to define three global environment variables:

  • LW_ACCOUNT = The name of your Lacework account
  • LW_API_KEY = This can be found in the JSON file we downloaded from Lacework
  • DOCKER_HUB = The username for your Docker Hub account. We will use this to publish the Docker image we build to Docker Hub

NOTE: Environment variable names are case sensitive

In Jenkins, navigate to Manage Jenkins > Configure System.

  1. Scroll down to Global Properties
  2. Check the box next to Environment Variables
  3. Add the following three Environment Variables:
    1. Name: DOCKER_HUB, Value: <YOUR DOCKER HUB USERNAME>
    2. Name: LW_ACCOUNT, Value: <YOUR LACEWORK ACCOUNT>
    3. Name: LW_API_KEY, Value: <YOUR LACEWORK API KEY>
  4. Click “Save”

jenkins-configure-system

Credentials

You will need to add your username and password for Docker Hub, as well as another type of credential Jenkins refers to as Secret Text for your Lacework API secret. You can find that configuration page under Manage Jenkins > Manage Credentials.

Add your Docker Hub credentials as follows, using the ID ‘docker_hub’, as that is how they are referred to in the Jenkinsfile.

docker-hub-credentials

Click “Ok.”

Next, on the left click “Add Credentials” and add the following credentials:

  • Kind: Secret Text
  • Scope: Global
  • Secret: <Paste your Lacework API Secret from the downloaded JSON>
  • ID: lacework_api_secret
  • Description: lacework_api_secret

jenkins-credentials

Click “Ok.”

At this point you should see the two credentials stored in Jenkins:

jenkins-global-credentials

We are now ready to move on to configuring the pipeline.

Create Pipeline with Blue Ocean

The version of Jenkins we used with our Docker container has a user interface called Blue Ocean installed that makes it really easy to create new pipelines. There is a ton of documentation on Blue Ocean if you want to learn more, but for now let’s get going…

On the left hand side click “Open Blue Ocean.”

Jenkins-blue-ocean

Next click “Create New Pipeline.”

blue-ocean-pipeline

When asked where you store your code choose GitHub.

blue-ocean-github

Next, we need to allow Jenkins to monitor our repo for changes, and for that we are going to need to create a GitHub Personal Access Token. Click on “Create an access token here.”

jenkins-github-access-token

You will be redirected to log in to GitHub and immediately taken to the Personal Access Token generation page. Give your token a name that you’ll remember (you can delete it afterward if you wish), and then click “Generate Token.”

jenkins-lacework-tutorial

You can now copy the token to your clipboard and paste it back in Jenkins.

jenkins-lacework-tutorial

Once authenticated, you should be able to navigate to any of the GitHub orgs you have access to, so select the org where you forked the example repo at the beginning of the tutorial.

github-org

Select the “jenkins-lacework-tutorial” repo from your list of repositories, and then click “Create Pipeline.”

jenkins-repository

Jenkins will automatically find the Jenkinsfile in the root of the repository and kick off the pipeline.

Jenkinsfile

You can click on the pipeline to watch the build, the publish, and finally the scan from Lacework!

jenkins-build

The output here shows a human-readable summary of the vulnerabilities found. If no vulnerabilities are found, the output will reflect that as well. There is actually a ton of data that comes back via this API call, and we are really just scratching the surface. For more information on the Lacework CLI and what you can do with it, check out the Lacework CLI documentation.

Spinning Jenkins Down

Once you are ready to tear down the test environment you have two options.

Save the State

If you are going to continue to play around with Jenkins as it is configured and want to save the state, you can run the following command:

$ docker-compose -p lacework down

When you are ready to continue your work you just run:

$ docker-compose -p lacework up -d

Delete State

If you want to completely tear down Jenkins and remove the Docker volumes and network, run the following command:

$ docker-compose -p lacework down --volumes

Conclusion

Injecting security into your SDLC should be a priority. Companies adopting these best practices are reaping the benefits by catching security vulnerabilities earlier in the development process, rather than constantly chasing their tails trying to find and fix in production. Moreover, the adoption of this pattern fosters better collaboration across teams as security, operations, and development work together to keep products shipping fast, efficiently, and securely.

We are just getting started here, so stay tuned for more from this series, Up and Running with Lacework.
