Cloud Bucket Configuration

Scope3 can integrate with cloud providers to pull data from and push data to your storage buckets.

Overview

This document outlines the requirements and steps needed to allow Scope3 to synchronize data from your cloud provider to Scope3 integrations. The setup differs depending on your cloud provider.

Google Cloud Storage

What you will need to get started:

  • A bucket in Google Cloud Storage

Service Account:

project-449115859788@storage-transfer-service.iam.gserviceaccount.com

Roles:

Storage Object Viewer (roles/storage.objectViewer)
Storage Legacy Bucket Reader (roles/storage.legacyBucketReader)

To grant these roles:
  1. In the Google Cloud console, go to the Cloud Storage Buckets page.
  2. Click the Bucket overflow menu button associated with the bucket to which you want to grant a principal a role.
  3. Choose Edit access.
  4. Click the + Add principal button.
  5. In the New principals field, enter the Scope3 transfer service account email: project-449115859788@storage-transfer-service.iam.gserviceaccount.com.
  6. Select Storage Object Viewer from the Select a role drop-down menu.
  7. Click Add another role.
  8. Select Storage Legacy Bucket Reader.
  9. Click Save.
  10. Items to send to Scope3:
    1. GCS bucket URI, e.g. gs://example-bucket-name
    2. Optional - a sub-folder in the bucket where files are stored, if different from the root directory
    3. Optional - file prefix(es) limiting which files should be synchronized
    4. Optional - a date; only files created or modified after it will be synchronized
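Before handing these details over, you can sanity-check them locally. The sketch below is illustrative only: the `validate_gcs_details` helper and its field names are not part of any Scope3 API, just one way to collect the values into a single record.

```python
import re

def validate_gcs_details(bucket_uri, sub_folder=None, prefixes=None, modified_after=None):
    """Sanity-check GCS handoff details before sending them to Scope3.

    Illustrative helper only; Scope3 collects these values out of band,
    not through an API call.
    """
    # GCS bucket names are lowercase letters, digits, dots, dashes and
    # underscores, 3-63 characters, starting and ending alphanumerically.
    if not re.fullmatch(r"gs://[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]", bucket_uri):
        raise ValueError(f"not a valid GCS bucket URI: {bucket_uri!r}")
    details = {"bucket_uri": bucket_uri}
    if sub_folder:
        details["sub_folder"] = sub_folder.strip("/")
    if prefixes:
        details["prefixes"] = list(prefixes)
    if modified_after:
        details["modified_after"] = modified_after  # e.g. "2024-01-01"
    return details
```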

Write permissions (optional)

In addition to reports being available in the CSP, data can also be written back to your cloud bucket. Add the following permissions to your bucket to make this possible:

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.
  2. Click the Bucket overflow menu button associated with the bucket to which you want to grant a principal a role.
  3. Choose Edit access.
  4. Click the + Add principal button.
  5. In the New principals field, enter the Scope3 service account email: [email protected].
  6. Select Storage Object Creator from the Select a role drop-down menu.
  7. Click Add another role and select Storage Object Viewer.
  8. Click Save.
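The console steps above amount to two IAM role bindings on the bucket. The sketch below shows the equivalent bindings as data; the write service account address is redacted in this document, so `SCOPE3_WRITER_SA` is a placeholder for the address Scope3 provides.

```python
# Placeholder: use the actual write service account address provided by Scope3.
SCOPE3_WRITER_SA = "scope3-writer@example.iam.gserviceaccount.com"

def scope3_write_bindings(service_account=SCOPE3_WRITER_SA):
    """Build the two bucket-level IAM bindings the console steps create.

    roles/storage.objectCreator allows writing new objects;
    roles/storage.objectViewer allows reading them back.
    """
    member = f"serviceAccount:{service_account}"
    return [
        {"role": "roles/storage.objectCreator", "members": [member]},
        {"role": "roles/storage.objectViewer", "members": [member]},
    ]
```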

Amazon S3

What you will need to get started:

  • A bucket in Amazon S3

Supported AWS regions:

af-south-1
ap-east-1
ap-northeast-1
ap-northeast-2
ap-northeast-3
ap-south-1
ap-south-2
ap-southeast-1
ap-southeast-2
ap-southeast-3
ca-central-1
eu-central-1
eu-central-2
eu-north-1
eu-south-1
eu-south-2
eu-west-1
eu-west-2
eu-west-3
me-central-1
me-south-1
sa-east-1
us-east-1
us-east-2
us-west-1
us-west-2

Follow these steps to configure access:
  1. Create a new IAM role in AWS.

  2. Select Custom trust policy as the trusted entity type.

  3. Copy and paste the following trust policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": {
            "Federated": "accounts.google.com"
          },
          "Action": "sts:AssumeRoleWithWebIdentity",
          "Condition": {
            "StringEquals": {
              "accounts.google.com:sub": "109477913720839849913"
            }
          }
        }
      ]
    }
    
  4. Grant the following permissions policies to the role:

    Replace YOUR_AWS_BUCKET_NAME with the name of your AWS S3 bucket.

    {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Action": [
                  "s3:GetObject",
                  "s3:ListBucket",
                  "s3:GetBucketLocation"
              ],
              "Resource": [
                  "arn:aws:s3:::YOUR_AWS_BUCKET_NAME/*",
                  "arn:aws:s3:::YOUR_AWS_BUCKET_NAME"
              ]
          }
      ]
    }
    
  5. Assign a name to the role and create the role.

  6. Once created, view the role details to retrieve the Amazon Resource Name (ARN) and provide this to Scope3. It has the format arn:aws:iam::AWS_ACCOUNT:role/ROLE_NAME.

  7. Items to send to Scope3:

    1. S3 bucket URI, e.g. s3://example-bucket-name
    2. The role ARN created above, e.g. arn:aws:iam::AWS_ACCOUNT:role/ROLE_NAME
    3. Optional - a sub-folder in the bucket where files are stored, if different from the root directory
    4. Optional - file prefix(es) limiting which files should be synchronized
    5. Optional - a date; only files created or modified after it will be synchronized
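If you script this setup, the bucket-name substitution and a rough shape check on the role ARN can be automated. A minimal sketch in Python; the helper names are illustrative, not part of any AWS or Scope3 tooling:

```python
import json
import re

# Same read-only permissions policy as in step 4 above, kept as a template.
PERMISSIONS_POLICY_TEMPLATE = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": [
                "arn:aws:s3:::YOUR_AWS_BUCKET_NAME/*",
                "arn:aws:s3:::YOUR_AWS_BUCKET_NAME",
            ],
        }
    ],
}

def render_policy(bucket_name):
    """Return the permissions policy JSON with the bucket name filled in."""
    text = json.dumps(PERMISSIONS_POLICY_TEMPLATE, indent=2)
    return text.replace("YOUR_AWS_BUCKET_NAME", bucket_name)

def is_role_arn(arn):
    """Rough shape check for the role ARN handed to Scope3."""
    return re.fullmatch(r"arn:aws:iam::\d{12}:role/[\w+=,.@/-]+", arn) is not None
```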

Write permissions (optional)

In addition to reports being available in the CSP, data can also be written back to your cloud bucket. Add the following bucket policy to make this possible:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Scope3DataSyncAccessPermissions",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::948454267882:user/scope3-data-sync-service"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::YOUR_AWS_BUCKET_NAME",
                "arn:aws:s3:::YOUR_AWS_BUCKET_NAME/*"
            ]
        }
    ]
}

Make sure to replace YOUR_AWS_BUCKET_NAME with your bucket name.
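If you want to verify that your bucket policy grants Scope3 write access before configuring the integration, you can check it for the Scope3 principal and the s3:PutObject action. A minimal sketch, not an official Scope3 tool:

```python
import json

def scope3_can_write(bucket_policy_json,
                     principal="arn:aws:iam::948454267882:user/scope3-data-sync-service"):
    """Return True if the bucket policy allows the Scope3 principal to PutObject."""
    policy = json.loads(bucket_policy_json)
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        aws_principal = stmt.get("Principal", {}).get("AWS")
        principals = aws_principal if isinstance(aws_principal, list) else [aws_principal]
        actions = stmt.get("Action", [])
        actions = actions if isinstance(actions, list) else [actions]
        if principal in principals and "s3:PutObject" in actions:
            return True
    return False
```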

Microsoft Azure Blob Storage

What you will need to get started:

  • Microsoft Azure Blob Storage Account
  • Microsoft Azure Blob Storage Container

Supported Azure Blob Storage regions:

Americas: 
East US
East US 2
West US
West US 2
West US 3
Central US
North Central US
South Central US
West Central US
Canada Central
Canada East
Brazil South

Asia-Pacific:
Australia Central
Australia East
Australia Southeast
Central India
South India
West India
Southeast Asia
East Asia
Japan East
Japan West
Korea South
Korea Central

Europe, Middle East, Africa (EMEA):
France Central
Germany West Central
Norway East
Sweden Central
Switzerland North
North Europe
West Europe
UK South
UK West
Qatar Central
UAE North
South Africa North

Follow these steps to configure access to a Microsoft Azure Storage container:

  1. Create or use an existing Microsoft Azure Storage user to access the storage account for your Microsoft Azure Storage Blob container.
  2. Create an SAS (shared access signature) token at the container level. See Grant limited access to Azure Storage resources using shared access signatures for instructions.
    1. The Allowed services must include Blob.
    2. For Allowed resource types select both Container and Object.
    3. The Allowed permissions must include Read and List.
    4. The default expiration time for SAS tokens is 8 hours. Set a reasonable expiration time that enables you to successfully complete your transfer. We recommend at least 6 months.
    5. Do not specify any IP addresses in the Allowed IP addresses field. Storage Transfer Service uses various IP addresses and doesn't support IP address restriction.
    6. The Allowed protocols should be HTTPS only.
  3. Once the token is created, note the SAS token value that is returned. You need this value when communicating with Scope3.

Caution: Basic SAS tokens can't be revoked, and the only way to invalidate a basic SAS token is to remove the storage access key of your account. We strongly recommend that you create SAS tokens from stored access policies, so that you can revoke a policy to invalidate an SAS token. For more information, see Best practices when using SAS.

  4. Items to send to Scope3:
  • Storage account name
  • Storage container name
  • The SAS (shared access signature) created above
  • Optional - a sub-folder in the container where files are stored, if different from the root directory
  • Optional - file prefix(es) limiting which files should be synchronized
  • Optional - a date; only files created or modified after it will be synchronized
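Because an expired SAS token is a common cause of a failed sync, it is worth checking the token's signed expiry (the `se` query field) before sending it. A minimal sketch; `sas_expiry_ok` is an illustrative helper, not part of Azure's SDK:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import parse_qs

def sas_expiry_ok(sas_token, min_days=180, now=None):
    """Check that a SAS token's `se` (signed expiry) field is far enough out.

    Accepts the token with or without a leading '?'. Expiry timestamps in
    SAS tokens look like 2025-06-30T00:00:00Z. min_days=180 reflects the
    "at least 6 months" recommendation above.
    """
    params = parse_qs(sas_token.lstrip("?"))
    expiry_values = params.get("se")
    if not expiry_values:
        return False  # no expiry field present
    expiry = datetime.fromisoformat(expiry_values[0].replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return expiry - now >= timedelta(days=min_days)
```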

Configure your bucket using the Integrate UI

Once you have permissions correctly set up on your bucket, it's time to switch to the Integrate UI.

Integrate is a browser-based tool that allows you to set up bucket automations. Head to the Integrate cloud storage page and click "add bucket".

Pick your cloud provider and enter the required information about your bucket. For example, if you are using AWS, you will enter the bucket URI and role ARN. Once the bucket is added, you will see a green icon indicating successful setup, or a red failure icon with helpful hints on fixing the bucket configuration.

📘 Note: In order to check bucket write permissions, Scope3 will add a simple file to your bucket. The file will be named .scope3ignore and contains a small amount of test data. This file can be safely ignored.

If you see the green success icon, you are now able to configure bucket based automations.