Tuesday, February 18, 2020

Using GitLab to back up data to AWS S3 and Google Cloud Storage

GitLab has a generous free tier for CI/CD ("GitLab CI/CD pipelines") that can be used to store build artifacts (or any other files) in AWS S3 and Google Cloud Storage for free.

Grab the code here: GitlabToCloud.

How do you use GitLab to store build artifacts such as JARs or ZIP files?

  1. You will need a GitLab account. You will also need an AWS account with access keys (an access key ID and a secret access key) that have permission to write to a bucket in AWS S3.
  2. Similarly, you will need a Google Cloud service account with permission to write to a Google Cloud Storage bucket.
  3. You will also need to set up the following GitLab CI/CD environment variables:
    • $AWS_ACCESS - the AWS access key ID
    • $AWS_SECRET - the AWS secret access key
    • $DAYS - set it to 1 (a number of days)
    • $GCP_PROJECT_ID - the Google Cloud project ID
    • $GCP_SERVICE_KEY - the key of the GCP service account created in step 2; paste in the entire contents of the JSON key file
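The steps above can be sketched as a `.gitlab-ci.yml` along these lines. This is a minimal sketch, not the actual file from the GitlabToCloud repo: the bucket name `my-backup-bucket`, the artifact path `build/`, and the key filename `key.json` are placeholder assumptions.

```yaml
# Minimal sketch of a .gitlab-ci.yml for this workflow.
# Bucket names, paths, and filenames are placeholders, not
# values taken from the GitlabToCloud repository.
image: google/cloud-sdk:slim   # ships with Python and gsutil

backup:
  stage: deploy
  script:
    # Install the AWS CLI via pip (the image already has Python)
    - pip install awscli
    # Authenticate to AWS using the CI/CD variables from step 3
    - export AWS_ACCESS_KEY_ID=$AWS_ACCESS
    - export AWS_SECRET_ACCESS_KEY=$AWS_SECRET
    # Authenticate to GCP: write the service-account JSON to a file
    - echo "$GCP_SERVICE_KEY" > key.json
    - gcloud auth activate-service-account --key-file key.json
    - gcloud config set project $GCP_PROJECT_ID
    # Copy the artifacts to both clouds (placeholder paths)
    - aws s3 cp build/ s3://my-backup-bucket/ --recursive
    - gsutil -m cp -r build/ gs://my-backup-bucket/
    # Remove the credentials file for security
    - rm -f key.json
```

Marking the variables as "Masked" in GitLab's CI/CD settings keeps them out of the job logs.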



As you can see, we are using the Google Cloud SDK image because it has Python and gsutil preinstalled. We install the AWS CLI using pip so that we can easily copy the required files. All credential files are deleted after execution for security reasons.
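To make that cleanup more robust, the deletion can go in an `after_script` section, which GitLab runs even when an earlier command in the job fails. The filename `key.json` is an assumption for illustration, not necessarily what the repo uses.

```yaml
  # Hypothetical cleanup step: GitLab runs after_script regardless
  # of whether the job's script section succeeded, so the
  # credential file is removed even after a failed upload.
  after_script:
    - rm -f key.json
```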



© 2007-2020. The content is copyrighted to Sundeep Machado.


Note: The author is not responsible for damages related to improper use of the software, techniques, and tips described here, or for copyright claims.