I am doing some work on validated restore scenarios using the new Veeam Cloud Tier, which is backed by an Object Storage Repository pointing at an Amazon S3 bucket. So that I wasn't messing with the live data, I wanted a way to copy the objects to another bucket or folder and access them from there. There is no option at the moment to achieve this via the AWS Console; however, it can be done via the AWS CLI.
The first step was to ensure I had the AWS CLI installed on my MBP and that it was at the latest version:
```
Anthonys-MBP-2:vmc_vpc_subnet_master anthonyspiteri$ aws --version
aws-cli/1.15.14 Python/2.7.13 Darwin/18.2.0 botocore/1.10.14
Anthonys-MBP-2:vmc_vpc_subnet_master anthonyspiteri$ pip install --upgrade awscli
Collecting awscli
  Downloading https://files.pythonhosted.org/packages/cd/29/4f5cee313320f0c284d83c0111bd5b3a26398316d09df3daf883354c9554/awscli-1.16.99-py2.py3-none-any.whl (1.4MB)
    100% |████████████████████████████████| 1.4MB 1.4MB/s
Requirement already satisfied, skipping upgrade: docutils>=0.10 in /usr/local/lib/python2.7/site-packages (from awscli) (0.14)
Collecting botocore==1.12.89 (from awscli)
  Downloading https://files.pythonhosted.org/packages/c8/6c/2058039815eb4eac4f2f7462ecae3e352e994d6618ba1f27114d9b985618/botocore-1.12.89-py2.py3-none-any.whl (5.2MB)
    100% |████████████████████████████████| 5.3MB 587kB/s
Collecting s3transfer<0.3.0,>=0.2.0 (from awscli)
  Downloading https://files.pythonhosted.org/packages/d7/de/5737f602e22073ecbded7a0c590707085e154e32b68d86545dcc31004c02/s3transfer-0.2.0-py2.py3-none-any.whl (69kB)
    100% |████████████████████████████████| 71kB 2.3MB/s
Requirement already satisfied, skipping upgrade: rsa<=3.5.0,>=3.1.2 in /usr/local/lib/python2.7/site-packages (from awscli) (3.4.2)
Requirement already satisfied, skipping upgrade: colorama<=0.3.9,>=0.2.5 in /usr/local/lib/python2.7/site-packages (from awscli) (0.3.7)
Requirement already satisfied, skipping upgrade: PyYAML<=3.13,>=3.10 in /Users/anthonyspiteri/Library/Python/2.7/lib/python/site-packages (from awscli) (3.12)
Requirement already satisfied, skipping upgrade: urllib3<1.25,>=1.20; python_version == "2.7" in /Users/anthonyspiteri/Library/Python/2.7/lib/python/site-packages (from botocore==1.12.89->awscli) (1.22)
Requirement already satisfied, skipping upgrade: python-dateutil<3.0.0,>=2.1; python_version >= "2.7" in /usr/local/lib/python2.7/site-packages (from botocore==1.12.89->awscli) (2.6.1)
Requirement already satisfied, skipping upgrade: jmespath<1.0.0,>=0.7.1 in /usr/local/lib/python2.7/site-packages (from botocore==1.12.89->awscli) (0.9.3)
Requirement already satisfied, skipping upgrade: futures<4.0.0,>=2.2.0; python_version == "2.6" or python_version == "2.7" in /usr/local/lib/python2.7/site-packages (from s3transfer<0.3.0,>=0.2.0->awscli) (3.2.0)
Requirement already satisfied, skipping upgrade: pyasn1>=0.1.3 in /usr/local/lib/python2.7/site-packages (from rsa<=3.5.0,>=3.1.2->awscli) (0.1.9)
Requirement already satisfied, skipping upgrade: six>=1.5 in /Users/anthonyspiteri/Library/Python/2.7/lib/python/site-packages (from python-dateutil<3.0.0,>=2.1; python_version >= "2.7"->botocore==1.12.89->awscli) (1.11.0)
Installing collected packages: botocore, s3transfer, awscli
  Found existing installation: botocore 1.10.14
    Uninstalling botocore-1.10.14:
      Successfully uninstalled botocore-1.10.14
  Found existing installation: s3transfer 0.1.13
    Uninstalling s3transfer-0.1.13:
      Successfully uninstalled s3transfer-0.1.13
  Found existing installation: awscli 1.15.14
    Uninstalling awscli-1.15.14:
      Successfully uninstalled awscli-1.15.14
Successfully installed awscli-1.16.99 botocore-1.12.89 s3transfer-0.2.0
Anthonys-MBP-2:vmc_vpc_subnet_master anthonyspiteri$ aws --version
aws-cli/1.16.99 Python/2.7.13 Darwin/18.2.0 botocore/1.12.89
```
For the first part of the copy process I cheated and created the new bucket from the AWS Console, configured to match the one I wanted to copy.
The next step is to make sure that the AWS CLI is configured with the correct AWS Access and Secret keys. Once that's done, copying/syncing buckets is a single command.
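As a minimal sketch (the bucket names below are placeholders, not my real ones), the whole copy reduces to one `aws s3 sync` invocation. Building the command into a variable and adding `--dryrun` first lets you preview what would be transferred; the sketch just prints the command rather than hitting a live account:

```shell
# Placeholder bucket names -- substitute your own source and destination.
SRC="s3://my-source-bucket"
DST="s3://my-dest-bucket"

# --dryrun makes sync report what it would copy without transferring anything.
SYNC_CMD="aws s3 sync $SRC $DST --dryrun"
echo "$SYNC_CMD"

# When the preview looks right, drop --dryrun and run it for real:
# aws s3 sync "$SRC" "$DST"
```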
```
$ aws configure
AWS Access Key ID [****************5Z3A]:
AWS Secret Access Key [****************QPuh]:
Default region name [None]:
Default output format [JSON]:

$ aws s3 sync s3://veeam-ps-ct01 s3://veeam-ps-ct01-copy
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/objs/LOCAL-01.vbm.240 to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/objs/LOCAL-01.vbm.240
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/2.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/2.part
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/1.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/1.part
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/3.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/3.part
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/4.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/4.part
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/5.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/5.part
copy: s3://veeam-ps-ct01/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/6.part to s3://veeam-ps-ct01-copy/Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/0f8d270b-1f5e-1a1f-9e41-7d814f15426d.171.parts/6.part
```
Obviously the time to complete the operation will depend on the number of objects in the bucket and whether the copy is cross-region or local. It took about four hours to copy ~50GB of data from us-east-2 to us-west-2, going at about 4MB/s. By default, progress for each object is shown on screen.
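At ~4MB/s there's some room to tune. The AWS CLI exposes S3 transfer settings via `aws configure set`; the key names below are real aws-cli S3 configuration settings, but the values are only illustrative — I haven't benchmarked them against this workload:

```shell
# Guarded so the sketch is a no-op on machines without the AWS CLI installed.
if command -v aws >/dev/null 2>&1; then
  # Raise the number of parallel S3 transfer requests (the CLI default is 10).
  aws configure set default.s3.max_concurrent_requests 20
  # Use larger multipart chunks, which suits big backup objects.
  aws configure set default.s3.multipart_chunksize 16MB
fi
```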
Once the first pass was complete I ran the same command again; this time it looks for differences between the source and destination and only syncs the changes. You can run the command below to view the Total Objects and Total Size of both buckets for comparison.
```
$ aws s3 ls --recursive s3://veeam-ps-ct01 --summarize
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/5.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/6.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/7.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/8.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/9.part

Total Objects: 156423
   Total Size: 51282309713

$ aws s3 ls --recursive s3://veeam-ps-ct01-copy --summarize
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/5.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/6.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/7.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/8.part
2019-02-06 12:06:30    5247002 Veeam/Archive/tenant-01/f07298b4-512b-4002-a323-9f1c2c799053/edb69c4e-bbee-18d6-b80e-2db974e4ba3e/storages/f521f2b7-d174-0b3c-8b0b-fb41e49864e6.120.parts/9.part

Total Objects: 156423
   Total Size: 51282309713
```
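To avoid scrolling through 150k+ object listings just to reach the totals, the summary lines can be pulled out on their own. This is a convenience wrapper I'd sketch around the same `aws s3 ls --summarize` call — the `bucket_summary` function name is mine, not an AWS CLI feature:

```shell
# Print only the "Total Objects" / "Total Size" summary lines for a bucket.
bucket_summary() {
  aws s3 ls --recursive "s3://$1" --summarize | tail -n 2
}

# Compare source and copy (needs credentials; guarded for machines without the CLI):
if command -v aws >/dev/null 2>&1; then
  bucket_summary veeam-ps-ct01
  bucket_summary veeam-ps-ct01-copy
fi
```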
That is it! A pretty simple process. I'll blog about the actual reason behind the Veeam Cloud Tier requirement and put this into action at a later date!
References:
https://docs.aws.amazon.com/cli/latest/userguide/install-macos.html
https://aws.amazon.com/premiumsupport/knowledge-center/move-objects-s3-bucket